Oct 14 06:39:41 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Oct 14 06:39:41 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 14 06:39:41 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Oct 14 06:39:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 14 06:39:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 14 06:39:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 14 06:39:41 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 14 06:39:41 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Oct 14 06:39:41 localhost kernel: signal: max sigframe size: 1776
Oct 14 06:39:41 localhost kernel: BIOS-provided physical RAM map:
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 14 06:39:41 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Oct 14 06:39:41 localhost kernel: NX (Execute Disable) protection: active
Oct 14 06:39:41 localhost kernel: SMBIOS 2.8 present.
Oct 14 06:39:41 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 14 06:39:41 localhost kernel: Hypervisor detected: KVM
Oct 14 06:39:41 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 14 06:39:41 localhost kernel: kvm-clock: using sched offset of 2741316819 cycles
Oct 14 06:39:41 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 14 06:39:41 localhost kernel: tsc: Detected 2799.998 MHz processor
Oct 14 06:39:41 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 14 06:39:41 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 14 06:39:41 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Oct 14 06:39:41 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 14 06:39:41 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 14 06:39:41 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 14 06:39:41 localhost kernel: Using GB pages for direct mapping
Oct 14 06:39:41 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Oct 14 06:39:41 localhost kernel: ACPI: Early table checksum verification disabled
Oct 14 06:39:41 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 14 06:39:41 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 14 06:39:41 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 14 06:39:41 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 14 06:39:41 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 14 06:39:41 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 14 06:39:41 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 14 06:39:41 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 14 06:39:41 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 14 06:39:41 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 14 06:39:41 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 14 06:39:41 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 14 06:39:41 localhost kernel: No NUMA configuration found
Oct 14 06:39:41 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Oct 14 06:39:41 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Oct 14 06:39:41 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Oct 14 06:39:41 localhost kernel: Zone ranges:
Oct 14 06:39:41 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 14 06:39:41 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 14 06:39:41 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Oct 14 06:39:41 localhost kernel:   Device   empty
Oct 14 06:39:41 localhost kernel: Movable zone start for each node
Oct 14 06:39:41 localhost kernel: Early memory node ranges
Oct 14 06:39:41 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 14 06:39:41 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 14 06:39:41 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Oct 14 06:39:41 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Oct 14 06:39:41 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 14 06:39:41 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 14 06:39:41 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 14 06:39:41 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 14 06:39:41 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 14 06:39:41 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 14 06:39:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 14 06:39:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 14 06:39:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 14 06:39:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 14 06:39:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 14 06:39:41 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 14 06:39:41 localhost kernel: TSC deadline timer available
Oct 14 06:39:41 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 14 06:39:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 14 06:39:41 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 14 06:39:41 localhost kernel: Booting paravirtualized kernel on KVM
Oct 14 06:39:41 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 14 06:39:41 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 14 06:39:41 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Oct 14 06:39:41 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Oct 14 06:39:41 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Oct 14 06:39:41 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 14 06:39:41 localhost kernel: Fallback order for Node 0: 0 
Oct 14 06:39:41 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Oct 14 06:39:41 localhost kernel: Policy zone: Normal
Oct 14 06:39:41 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Oct 14 06:39:41 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Oct 14 06:39:41 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Oct 14 06:39:41 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 14 06:39:41 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 14 06:39:41 localhost kernel: software IO TLB: area num 8.
Oct 14 06:39:41 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Oct 14 06:39:41 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Oct 14 06:39:41 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 14 06:39:41 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Oct 14 06:39:41 localhost kernel: ftrace: allocated 176 pages with 3 groups
Oct 14 06:39:41 localhost kernel: Dynamic Preempt: voluntary
Oct 14 06:39:41 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 14 06:39:41 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 14 06:39:41 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Oct 14 06:39:41 localhost kernel:         Rude variant of Tasks RCU enabled.
Oct 14 06:39:41 localhost kernel:         Tracing variant of Tasks RCU enabled.
Oct 14 06:39:41 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 14 06:39:41 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 14 06:39:41 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 14 06:39:41 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 14 06:39:41 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 14 06:39:41 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Oct 14 06:39:41 localhost kernel: Console: colour VGA+ 80x25
Oct 14 06:39:41 localhost kernel: printk: console [tty0] enabled
Oct 14 06:39:41 localhost kernel: printk: console [ttyS0] enabled
Oct 14 06:39:41 localhost kernel: ACPI: Core revision 20211217
Oct 14 06:39:41 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 14 06:39:41 localhost kernel: x2apic enabled
Oct 14 06:39:41 localhost kernel: Switched APIC routing to physical x2apic.
Oct 14 06:39:41 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 14 06:39:41 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Oct 14 06:39:41 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 14 06:39:41 localhost kernel: LSM: Security Framework initializing
Oct 14 06:39:41 localhost kernel: Yama: becoming mindful.
Oct 14 06:39:41 localhost kernel: SELinux:  Initializing.
Oct 14 06:39:41 localhost kernel: LSM support for eBPF active
Oct 14 06:39:41 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Oct 14 06:39:41 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Oct 14 06:39:41 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 14 06:39:41 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 14 06:39:41 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 14 06:39:41 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 14 06:39:41 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 14 06:39:41 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Oct 14 06:39:41 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Oct 14 06:39:41 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 14 06:39:41 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 14 06:39:41 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 14 06:39:41 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 14 06:39:41 localhost kernel: Freeing SMP alternatives memory: 36K
Oct 14 06:39:41 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 14 06:39:41 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Oct 14 06:39:41 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Oct 14 06:39:41 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Oct 14 06:39:41 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Oct 14 06:39:41 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 14 06:39:41 localhost kernel: ... version:                0
Oct 14 06:39:41 localhost kernel: ... bit width:              48
Oct 14 06:39:41 localhost kernel: ... generic registers:      6
Oct 14 06:39:41 localhost kernel: ... value mask:             0000ffffffffffff
Oct 14 06:39:41 localhost kernel: ... max period:             00007fffffffffff
Oct 14 06:39:41 localhost kernel: ... fixed-purpose events:   0
Oct 14 06:39:41 localhost kernel: ... event mask:             000000000000003f
Oct 14 06:39:41 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 14 06:39:41 localhost kernel: rcu:         Max phase no-delay instances is 400.
Oct 14 06:39:41 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 14 06:39:41 localhost kernel: x86: Booting SMP configuration:
Oct 14 06:39:41 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 14 06:39:41 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 14 06:39:41 localhost kernel: smpboot: Max logical packages: 8
Oct 14 06:39:41 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Oct 14 06:39:41 localhost kernel: node 0 deferred pages initialised in 25ms
Oct 14 06:39:41 localhost kernel: devtmpfs: initialized
Oct 14 06:39:41 localhost kernel: x86/mm: Memory block size: 128MB
Oct 14 06:39:41 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 14 06:39:41 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 14 06:39:41 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 14 06:39:41 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 14 06:39:41 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Oct 14 06:39:41 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 14 06:39:41 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 14 06:39:41 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 14 06:39:41 localhost kernel: audit: type=2000 audit(1760423979.809:1): state=initialized audit_enabled=0 res=1
Oct 14 06:39:41 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 14 06:39:41 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 14 06:39:41 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 14 06:39:41 localhost kernel: cpuidle: using governor menu
Oct 14 06:39:41 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Oct 14 06:39:41 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 14 06:39:41 localhost kernel: PCI: Using configuration type 1 for base access
Oct 14 06:39:41 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 14 06:39:41 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 14 06:39:41 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Oct 14 06:39:41 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Oct 14 06:39:41 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Oct 14 06:39:41 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 14 06:39:41 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 14 06:39:41 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 14 06:39:41 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 14 06:39:41 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 14 06:39:41 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Oct 14 06:39:41 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Oct 14 06:39:41 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Oct 14 06:39:41 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 14 06:39:41 localhost kernel: ACPI: Interpreter enabled
Oct 14 06:39:41 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 14 06:39:41 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 14 06:39:41 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 14 06:39:41 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 14 06:39:41 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 14 06:39:41 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 14 06:39:41 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [3] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [4] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [5] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [6] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [7] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [8] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [9] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [10] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [11] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [12] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [13] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [14] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [15] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [16] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [17] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [18] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [19] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [20] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [21] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [22] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [23] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [24] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [25] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [26] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [27] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [28] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [29] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [30] registered
Oct 14 06:39:41 localhost kernel: acpiphp: Slot [31] registered
Oct 14 06:39:41 localhost kernel: PCI host bridge to bus 0000:00
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 14 06:39:41 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 14 06:39:41 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Oct 14 06:39:41 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Oct 14 06:39:41 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Oct 14 06:39:41 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 14 06:39:41 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Oct 14 06:39:41 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Oct 14 06:39:41 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Oct 14 06:39:41 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Oct 14 06:39:41 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 14 06:39:41 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Oct 14 06:39:41 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Oct 14 06:39:41 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 14 06:39:41 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Oct 14 06:39:41 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Oct 14 06:39:41 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 14 06:39:41 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 14 06:39:41 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 14 06:39:41 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 14 06:39:41 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 14 06:39:41 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 14 06:39:41 localhost kernel: iommu: Default domain type: Translated 
Oct 14 06:39:41 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Oct 14 06:39:41 localhost kernel: SCSI subsystem initialized
Oct 14 06:39:41 localhost kernel: ACPI: bus type USB registered
Oct 14 06:39:41 localhost kernel: usbcore: registered new interface driver usbfs
Oct 14 06:39:41 localhost kernel: usbcore: registered new interface driver hub
Oct 14 06:39:41 localhost kernel: usbcore: registered new device driver usb
Oct 14 06:39:41 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 14 06:39:41 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 14 06:39:41 localhost kernel: PTP clock support registered
Oct 14 06:39:41 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 14 06:39:41 localhost kernel: NetLabel: Initializing
Oct 14 06:39:41 localhost kernel: NetLabel:  domain hash size = 128
Oct 14 06:39:41 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 14 06:39:41 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Oct 14 06:39:41 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 14 06:39:41 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 14 06:39:41 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 14 06:39:41 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 14 06:39:41 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 14 06:39:41 localhost kernel: vgaarb: loaded
Oct 14 06:39:41 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 14 06:39:41 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 14 06:39:41 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 14 06:39:41 localhost kernel: pnp: PnP ACPI init
Oct 14 06:39:41 localhost kernel: pnp 00:03: [dma 2]
Oct 14 06:39:41 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 14 06:39:41 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 14 06:39:41 localhost kernel: NET: Registered PF_INET protocol family
Oct 14 06:39:41 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Oct 14 06:39:41 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Oct 14 06:39:41 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 14 06:39:41 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 14 06:39:41 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 14 06:39:41 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Oct 14 06:39:41 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Oct 14 06:39:41 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Oct 14 06:39:41 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Oct 14 06:39:41 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 14 06:39:41 localhost kernel: NET: Registered PF_XDP protocol family
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 14 06:39:41 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 14 06:39:41 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 14 06:39:41 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 14 06:39:41 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 28661 usecs
Oct 14 06:39:41 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 14 06:39:41 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 14 06:39:41 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 14 06:39:41 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 14 06:39:41 localhost kernel: ACPI: bus type thunderbolt registered
Oct 14 06:39:41 localhost kernel: Initialise system trusted keyrings
Oct 14 06:39:41 localhost kernel: Key type blacklist registered
Oct 14 06:39:41 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Oct 14 06:39:41 localhost kernel: zbud: loaded
Oct 14 06:39:41 localhost kernel: integrity: Platform Keyring initialized
Oct 14 06:39:41 localhost kernel: NET: Registered PF_ALG protocol family
Oct 14 06:39:41 localhost kernel: xor: automatically using best checksumming function   avx       
Oct 14 06:39:41 localhost kernel: Key type asymmetric registered
Oct 14 06:39:41 localhost kernel: Asymmetric key parser 'x509' registered
Oct 14 06:39:41 localhost kernel: Running certificate verification selftests
Oct 14 06:39:41 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 14 06:39:41 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 14 06:39:41 localhost kernel: io scheduler mq-deadline registered
Oct 14 06:39:41 localhost kernel: io scheduler kyber registered
Oct 14 06:39:41 localhost kernel: io scheduler bfq registered
Oct 14 06:39:41 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 14 06:39:41 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 14 06:39:41 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 14 06:39:41 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 14 06:39:41 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 14 06:39:41 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 14 06:39:41 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 14 06:39:41 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 14 06:39:41 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 14 06:39:41 localhost kernel: Non-volatile memory driver v1.3
Oct 14 06:39:41 localhost kernel: rdac: device handler registered
Oct 14 06:39:41 localhost kernel: hp_sw: device handler registered
Oct 14 06:39:41 localhost kernel: emc: device handler registered
Oct 14 06:39:41 localhost kernel: alua: device handler registered
Oct 14 06:39:41 localhost kernel: libphy: Fixed MDIO Bus: probed
Oct 14 06:39:41 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Oct 14 06:39:41 localhost kernel: ehci-pci: EHCI PCI platform driver
Oct 14 06:39:41 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Oct 14 06:39:41 localhost kernel: ohci-pci: OHCI PCI platform driver
Oct 14 06:39:41 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Oct 14 06:39:41 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 14 06:39:41 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 14 06:39:41 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 14 06:39:41 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 14 06:39:41 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 14 06:39:41 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 14 06:39:41 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 14 06:39:41 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Oct 14 06:39:41 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 14 06:39:41 localhost kernel: hub 1-0:1.0: USB hub found
Oct 14 06:39:41 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 14 06:39:41 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 14 06:39:41 localhost kernel: usbserial: USB Serial support registered for generic
Oct 14 06:39:41 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 14 06:39:41 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 14 06:39:41 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 14 06:39:41 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 14 06:39:41 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 14 06:39:41 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 14 06:39:41 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 14 06:39:41 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-14T06:39:40 UTC (1760423980)
Oct 14 06:39:41 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 14 06:39:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 14 06:39:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 14 06:39:41 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 14 06:39:41 localhost kernel: usbcore: registered new interface driver usbhid
Oct 14 06:39:41 localhost kernel: usbhid: USB HID core driver
Oct 14 06:39:41 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 14 06:39:41 localhost kernel: Initializing XFRM netlink socket
Oct 14 06:39:41 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 14 06:39:41 localhost kernel: Segment Routing with IPv6
Oct 14 06:39:41 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 14 06:39:41 localhost kernel: mpls_gso: MPLS GSO support
Oct 14 06:39:41 localhost kernel: IPI shorthand broadcast: enabled
Oct 14 06:39:41 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 14 06:39:41 localhost kernel: AES CTR mode by8 optimization enabled
Oct 14 06:39:41 localhost kernel: sched_clock: Marking stable (767975598, 183277086)->(1079475105, -128222421)
Oct 14 06:39:41 localhost kernel: registered taskstats version 1
Oct 14 06:39:41 localhost kernel: Loading compiled-in X.509 certificates
Oct 14 06:39:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Oct 14 06:39:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 14 06:39:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 14 06:39:41 localhost kernel: zswap: loaded using pool lzo/zbud
Oct 14 06:39:41 localhost kernel: page_owner is disabled
Oct 14 06:39:41 localhost kernel: Key type big_key registered
Oct 14 06:39:41 localhost kernel: Freeing initrd memory: 74232K
Oct 14 06:39:41 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 14 06:39:41 localhost kernel: Key type encrypted registered
Oct 14 06:39:41 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 14 06:39:41 localhost kernel: Loading compiled-in module X.509 certificates
Oct 14 06:39:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Oct 14 06:39:41 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 14 06:39:41 localhost kernel: ima: No architecture policies found
Oct 14 06:39:41 localhost kernel: evm: Initialising EVM extended attributes:
Oct 14 06:39:41 localhost kernel: evm: security.selinux
Oct 14 06:39:41 localhost kernel: evm: security.SMACK64 (disabled)
Oct 14 06:39:41 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 14 06:39:41 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 14 06:39:41 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 14 06:39:41 localhost kernel: evm: security.apparmor (disabled)
Oct 14 06:39:41 localhost kernel: evm: security.ima
Oct 14 06:39:41 localhost kernel: evm: security.capability
Oct 14 06:39:41 localhost kernel: evm: HMAC attrs: 0x1
Oct 14 06:39:41 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 14 06:39:41 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 14 06:39:41 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 14 06:39:41 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 14 06:39:41 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 14 06:39:41 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 14 06:39:41 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 14 06:39:41 localhost kernel: Freeing unused decrypted memory: 2036K
Oct 14 06:39:41 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Oct 14 06:39:41 localhost kernel: Write protecting the kernel read-only data: 26624k
Oct 14 06:39:41 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Oct 14 06:39:41 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Oct 14 06:39:41 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 14 06:39:41 localhost kernel: Run /init as init process
Oct 14 06:39:41 localhost kernel:   with arguments:
Oct 14 06:39:41 localhost kernel:     /init
Oct 14 06:39:41 localhost kernel:   with environment:
Oct 14 06:39:41 localhost kernel:     HOME=/
Oct 14 06:39:41 localhost kernel:     TERM=linux
Oct 14 06:39:41 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Oct 14 06:39:41 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 14 06:39:41 localhost systemd[1]: Detected virtualization kvm.
Oct 14 06:39:41 localhost systemd[1]: Detected architecture x86-64.
Oct 14 06:39:41 localhost systemd[1]: Running in initrd.
Oct 14 06:39:41 localhost systemd[1]: No hostname configured, using default hostname.
Oct 14 06:39:41 localhost systemd[1]: Hostname set to <localhost>.
Oct 14 06:39:41 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 14 06:39:41 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 14 06:39:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 14 06:39:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 14 06:39:41 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 14 06:39:41 localhost systemd[1]: Reached target Local File Systems.
Oct 14 06:39:41 localhost systemd[1]: Reached target Path Units.
Oct 14 06:39:41 localhost systemd[1]: Reached target Slice Units.
Oct 14 06:39:41 localhost systemd[1]: Reached target Swaps.
Oct 14 06:39:41 localhost systemd[1]: Reached target Timer Units.
Oct 14 06:39:41 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 14 06:39:41 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 14 06:39:41 localhost systemd[1]: Listening on Journal Socket.
Oct 14 06:39:41 localhost systemd[1]: Listening on udev Control Socket.
Oct 14 06:39:41 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 14 06:39:41 localhost systemd[1]: Reached target Socket Units.
Oct 14 06:39:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 14 06:39:41 localhost systemd[1]: Starting Journal Service...
Oct 14 06:39:41 localhost systemd[1]: Starting Load Kernel Modules...
Oct 14 06:39:41 localhost systemd[1]: Starting Create System Users...
Oct 14 06:39:41 localhost systemd[1]: Starting Setup Virtual Console...
Oct 14 06:39:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 14 06:39:41 localhost systemd-journald[283]: Journal started
Oct 14 06:39:41 localhost systemd-journald[283]: Runtime Journal (/run/log/journal/03f5bc8bedfd405c8f420ac9afa0b79f) is 8.0M, max 314.7M, 306.7M free.
Oct 14 06:39:41 localhost systemd-modules-load[284]: Module 'msr' is built in
Oct 14 06:39:41 localhost systemd[1]: Started Journal Service.
Oct 14 06:39:41 localhost systemd[1]: Finished Load Kernel Modules.
Oct 14 06:39:41 localhost systemd[1]: Finished Setup Virtual Console.
Oct 14 06:39:41 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 14 06:39:41 localhost systemd[1]: Starting dracut cmdline hook...
Oct 14 06:39:41 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 14 06:39:41 localhost systemd-sysusers[285]: Creating group 'sgx' with GID 997.
Oct 14 06:39:41 localhost systemd-sysusers[285]: Creating group 'users' with GID 100.
Oct 14 06:39:41 localhost systemd-sysusers[285]: Creating group 'dbus' with GID 81.
Oct 14 06:39:41 localhost systemd-sysusers[285]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 14 06:39:41 localhost systemd[1]: Finished Create System Users.
Oct 14 06:39:41 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 14 06:39:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 14 06:39:41 localhost dracut-cmdline[288]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Oct 14 06:39:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 14 06:39:41 localhost dracut-cmdline[288]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Oct 14 06:39:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 14 06:39:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 14 06:39:41 localhost systemd[1]: Finished dracut cmdline hook.
Oct 14 06:39:41 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 14 06:39:41 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 14 06:39:41 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 14 06:39:41 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Oct 14 06:39:41 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 14 06:39:41 localhost kernel: RPC: Registered udp transport module.
Oct 14 06:39:41 localhost kernel: RPC: Registered tcp transport module.
Oct 14 06:39:41 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 14 06:39:41 localhost rpc.statd[407]: Version 2.5.4 starting
Oct 14 06:39:41 localhost rpc.statd[407]: Initializing NSM state
Oct 14 06:39:41 localhost rpc.idmapd[412]: Setting log level to 0
Oct 14 06:39:41 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 14 06:39:41 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 14 06:39:41 localhost systemd-udevd[425]: Using default interface naming scheme 'rhel-9.0'.
Oct 14 06:39:41 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 14 06:39:41 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 14 06:39:41 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 14 06:39:41 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 14 06:39:41 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 14 06:39:41 localhost systemd[1]: Reached target System Initialization.
Oct 14 06:39:41 localhost systemd[1]: Reached target Basic System.
Oct 14 06:39:41 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 14 06:39:41 localhost systemd[1]: Reached target Network.
Oct 14 06:39:41 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 14 06:39:41 localhost systemd[1]: Starting dracut initqueue hook...
Oct 14 06:39:41 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Oct 14 06:39:41 localhost kernel: libata version 3.00 loaded.
Oct 14 06:39:41 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Oct 14 06:39:41 localhost kernel: scsi host0: ata_piix
Oct 14 06:39:41 localhost kernel: scsi host1: ata_piix
Oct 14 06:39:41 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Oct 14 06:39:41 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Oct 14 06:39:41 localhost kernel: GPT:20971519 != 838860799
Oct 14 06:39:41 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Oct 14 06:39:41 localhost kernel: GPT:20971519 != 838860799
Oct 14 06:39:41 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Oct 14 06:39:41 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Oct 14 06:39:41 localhost systemd-udevd[437]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 06:39:41 localhost kernel:  vda: vda1 vda2 vda3 vda4
Oct 14 06:39:41 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Oct 14 06:39:42 localhost systemd[1]: Reached target Initrd Root Device.
Oct 14 06:39:42 localhost kernel: ata1: found unknown device (class 0)
Oct 14 06:39:42 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 14 06:39:42 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 14 06:39:42 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 14 06:39:42 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 14 06:39:42 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 14 06:39:42 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Oct 14 06:39:42 localhost systemd[1]: Finished dracut initqueue hook.
Oct 14 06:39:42 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 14 06:39:42 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 14 06:39:42 localhost systemd[1]: Reached target Remote File Systems.
Oct 14 06:39:42 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 14 06:39:42 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 14 06:39:42 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Oct 14 06:39:42 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Oct 14 06:39:42 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Oct 14 06:39:42 localhost systemd[1]: Mounting /sysroot...
Oct 14 06:39:42 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 14 06:39:42 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Oct 14 06:39:42 localhost kernel: XFS (vda4): Ending clean mount
Oct 14 06:39:42 localhost systemd[1]: Mounted /sysroot.
Oct 14 06:39:42 localhost systemd[1]: Reached target Initrd Root File System.
Oct 14 06:39:42 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 14 06:39:42 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 14 06:39:42 localhost systemd[1]: Reached target Initrd File Systems.
Oct 14 06:39:42 localhost systemd[1]: Reached target Initrd Default Target.
Oct 14 06:39:42 localhost systemd[1]: Starting dracut mount hook...
Oct 14 06:39:42 localhost systemd[1]: Finished dracut mount hook.
Oct 14 06:39:42 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 14 06:39:42 localhost rpc.idmapd[412]: exiting on signal 15
Oct 14 06:39:42 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 14 06:39:42 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 14 06:39:42 localhost systemd[1]: Stopped target Network.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Timer Units.
Oct 14 06:39:42 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 14 06:39:42 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Basic System.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Path Units.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Remote File Systems.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Slice Units.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Socket Units.
Oct 14 06:39:42 localhost systemd[1]: Stopped target System Initialization.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Local File Systems.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Swaps.
Oct 14 06:39:42 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped dracut mount hook.
Oct 14 06:39:42 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 14 06:39:42 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 14 06:39:42 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 14 06:39:42 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 14 06:39:42 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 14 06:39:42 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Load Kernel Modules.
Oct 14 06:39:42 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 14 06:39:42 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 14 06:39:42 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 14 06:39:42 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 14 06:39:42 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 14 06:39:42 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 14 06:39:42 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Closed udev Control Socket.
Oct 14 06:39:42 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Closed udev Kernel Socket.
Oct 14 06:39:42 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 14 06:39:42 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 14 06:39:42 localhost systemd[1]: Starting Cleanup udev Database...
Oct 14 06:39:42 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 14 06:39:42 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 14 06:39:43 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 14 06:39:43 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 14 06:39:43 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 14 06:39:43 localhost systemd[1]: Stopped Create System Users.
Oct 14 06:39:43 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 14 06:39:43 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 14 06:39:43 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 14 06:39:43 localhost systemd[1]: Finished Cleanup udev Database.
Oct 14 06:39:43 localhost systemd[1]: Reached target Switch Root.
Oct 14 06:39:43 localhost systemd[1]: Starting Switch Root...
Oct 14 06:39:43 localhost systemd[1]: Switching root.
Oct 14 06:39:43 localhost systemd-journald[283]: Journal stopped
Oct 14 06:39:44 localhost systemd-journald[283]: Received SIGTERM from PID 1 (systemd).
Oct 14 06:39:44 localhost kernel: audit: type=1404 audit(1760423983.162:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 14 06:39:44 localhost kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 06:39:44 localhost kernel: SELinux:  policy capability open_perms=1
Oct 14 06:39:44 localhost kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 06:39:44 localhost kernel: SELinux:  policy capability always_check_network=0
Oct 14 06:39:44 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 06:39:44 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 06:39:44 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 06:39:44 localhost kernel: audit: type=1403 audit(1760423983.296:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 14 06:39:44 localhost systemd[1]: Successfully loaded SELinux policy in 138.536ms.
Oct 14 06:39:44 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 34.812ms.
Oct 14 06:39:44 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 14 06:39:44 localhost systemd[1]: Detected virtualization kvm.
Oct 14 06:39:44 localhost systemd[1]: Detected architecture x86-64.
Oct 14 06:39:44 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 06:39:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 06:39:44 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 14 06:39:44 localhost systemd[1]: Stopped Switch Root.
Oct 14 06:39:44 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 14 06:39:44 localhost systemd[1]: Created slice Slice /system/getty.
Oct 14 06:39:44 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 14 06:39:44 localhost systemd[1]: Created slice Slice /system/serial-getty.
Oct 14 06:39:44 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 14 06:39:44 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Oct 14 06:39:44 localhost systemd[1]: Created slice User and Session Slice.
Oct 14 06:39:44 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 14 06:39:44 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 14 06:39:44 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 14 06:39:44 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 14 06:39:44 localhost systemd[1]: Stopped target Switch Root.
Oct 14 06:39:44 localhost systemd[1]: Stopped target Initrd File Systems.
Oct 14 06:39:44 localhost systemd[1]: Stopped target Initrd Root File System.
Oct 14 06:39:44 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 14 06:39:44 localhost systemd[1]: Reached target Path Units.
Oct 14 06:39:44 localhost systemd[1]: Reached target rpc_pipefs.target.
Oct 14 06:39:44 localhost systemd[1]: Reached target Slice Units.
Oct 14 06:39:44 localhost systemd[1]: Reached target Swaps.
Oct 14 06:39:44 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Oct 14 06:39:44 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 14 06:39:44 localhost systemd[1]: Reached target RPC Port Mapper.
Oct 14 06:39:44 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 14 06:39:44 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 14 06:39:44 localhost systemd[1]: Listening on udev Control Socket.
Oct 14 06:39:44 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 14 06:39:44 localhost systemd[1]: Mounting Huge Pages File System...
Oct 14 06:39:44 localhost systemd[1]: Mounting POSIX Message Queue File System...
Oct 14 06:39:44 localhost systemd[1]: Mounting Kernel Debug File System...
Oct 14 06:39:44 localhost systemd[1]: Mounting Kernel Trace File System...
Oct 14 06:39:44 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 14 06:39:44 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 14 06:39:44 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 14 06:39:44 localhost systemd[1]: Starting Load Kernel Module drm...
Oct 14 06:39:44 localhost systemd[1]: Starting Load Kernel Module fuse...
Oct 14 06:39:44 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 14 06:39:44 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 14 06:39:44 localhost systemd[1]: Stopped File System Check on Root Device.
Oct 14 06:39:44 localhost systemd[1]: Stopped Journal Service.
Oct 14 06:39:44 localhost systemd[1]: Starting Journal Service...
Oct 14 06:39:44 localhost systemd[1]: Starting Load Kernel Modules...
Oct 14 06:39:44 localhost systemd[1]: Starting Generate network units from Kernel command line...
Oct 14 06:39:44 localhost kernel: fuse: init (API version 7.36)
Oct 14 06:39:44 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 14 06:39:44 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 14 06:39:44 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 14 06:39:44 localhost systemd-journald[618]: Journal started
Oct 14 06:39:44 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/8e1d5208cffec42b50976967e1d1cfd0) is 8.0M, max 314.7M, 306.7M free.
Oct 14 06:39:43 localhost systemd[1]: Queued start job for default target Multi-User System.
Oct 14 06:39:43 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 14 06:39:44 localhost systemd-modules-load[619]: Module 'msr' is built in
Oct 14 06:39:44 localhost systemd[1]: Started Journal Service.
Oct 14 06:39:44 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 14 06:39:44 localhost systemd[1]: Mounted Huge Pages File System.
Oct 14 06:39:44 localhost systemd[1]: Mounted POSIX Message Queue File System.
Oct 14 06:39:44 localhost systemd[1]: Mounted Kernel Debug File System.
Oct 14 06:39:44 localhost systemd[1]: Mounted Kernel Trace File System.
Oct 14 06:39:44 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 14 06:39:44 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 14 06:39:44 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 14 06:39:44 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 14 06:39:44 localhost systemd[1]: Finished Load Kernel Module fuse.
Oct 14 06:39:44 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 14 06:39:44 localhost systemd[1]: Finished Load Kernel Modules.
Oct 14 06:39:44 localhost systemd[1]: Finished Generate network units from Kernel command line.
Oct 14 06:39:44 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 14 06:39:44 localhost systemd[1]: Mounting FUSE Control File System...
Oct 14 06:39:44 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 14 06:39:44 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 14 06:39:44 localhost systemd[1]: Starting Rebuild Hardware Database...
Oct 14 06:39:44 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 14 06:39:44 localhost systemd[1]: Starting Load/Save Random Seed...
Oct 14 06:39:44 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 14 06:39:44 localhost kernel: ACPI: bus type drm_connector registered
Oct 14 06:39:44 localhost systemd[1]: Starting Create System Users...
Oct 14 06:39:44 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 14 06:39:44 localhost systemd[1]: Finished Load Kernel Module drm.
Oct 14 06:39:44 localhost systemd[1]: Mounted FUSE Control File System.
Oct 14 06:39:44 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 14 06:39:44 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/8e1d5208cffec42b50976967e1d1cfd0) is 8.0M, max 314.7M, 306.7M free.
Oct 14 06:39:44 localhost systemd-journald[618]: Received client request to flush runtime journal.
Oct 14 06:39:44 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 14 06:39:44 localhost systemd[1]: Finished Load/Save Random Seed.
Oct 14 06:39:44 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 14 06:39:44 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 14 06:39:44 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989.
Oct 14 06:39:44 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988.
Oct 14 06:39:44 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Oct 14 06:39:44 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 14 06:39:44 localhost systemd[1]: Finished Create System Users.
Oct 14 06:39:44 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 14 06:39:44 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 14 06:39:44 localhost systemd[1]: Reached target Preparation for Local File Systems.
Oct 14 06:39:44 localhost systemd[1]: Set up automount EFI System Partition Automount.
Oct 14 06:39:44 localhost systemd[1]: Finished Rebuild Hardware Database.
Oct 14 06:39:44 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 14 06:39:44 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Oct 14 06:39:44 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 14 06:39:44 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 14 06:39:44 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 14 06:39:44 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 14 06:39:44 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 14 06:39:44 localhost systemd-udevd[640]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 06:39:44 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Oct 14 06:39:44 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Oct 14 06:39:44 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Oct 14 06:39:44 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 14 06:39:44 localhost systemd-fsck[681]: fsck.fat 4.2 (2021-01-31)
Oct 14 06:39:44 localhost systemd-fsck[681]: /dev/vda2: 12 files, 1782/51145 clusters
Oct 14 06:39:44 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 14 06:39:44 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Oct 14 06:39:44 localhost kernel: SVM: TSC scaling supported
Oct 14 06:39:44 localhost kernel: kvm: Nested Virtualization enabled
Oct 14 06:39:44 localhost kernel: SVM: kvm: Nested Paging enabled
Oct 14 06:39:44 localhost kernel: SVM: LBR virtualization supported
Oct 14 06:39:44 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 14 06:39:44 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 14 06:39:44 localhost kernel: Console: switching to colour dummy device 80x25
Oct 14 06:39:44 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 14 06:39:44 localhost kernel: [drm] features: -context_init
Oct 14 06:39:44 localhost kernel: [drm] number of scanouts: 1
Oct 14 06:39:44 localhost kernel: [drm] number of cap sets: 0
Oct 14 06:39:44 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Oct 14 06:39:44 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Oct 14 06:39:44 localhost kernel: Console: switching to colour frame buffer device 128x48
Oct 14 06:39:44 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 14 06:39:45 localhost systemd[1]: Mounting /boot...
Oct 14 06:39:45 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Oct 14 06:39:45 localhost kernel: XFS (vda3): Ending clean mount
Oct 14 06:39:45 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Oct 14 06:39:45 localhost systemd[1]: Mounted /boot.
Oct 14 06:39:45 localhost systemd[1]: Mounting /boot/efi...
Oct 14 06:39:45 localhost systemd[1]: Mounted /boot/efi.
Oct 14 06:39:45 localhost systemd[1]: Reached target Local File Systems.
Oct 14 06:39:45 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 14 06:39:45 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 14 06:39:45 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 14 06:39:45 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 14 06:39:45 localhost systemd[1]: Starting Automatic Boot Loader Update...
Oct 14 06:39:45 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 14 06:39:45 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 14 06:39:45 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 714 (bootctl)
Oct 14 06:39:45 localhost systemd[1]: Starting File System Check on /dev/vda2...
Oct 14 06:39:45 localhost systemd[1]: Finished File System Check on /dev/vda2.
Oct 14 06:39:45 localhost systemd[1]: Mounting EFI System Partition Automount...
Oct 14 06:39:45 localhost systemd[1]: Mounted EFI System Partition Automount.
Oct 14 06:39:45 localhost systemd[1]: Finished Automatic Boot Loader Update.
Oct 14 06:39:45 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 14 06:39:45 localhost systemd[1]: Starting Security Auditing Service...
Oct 14 06:39:45 localhost systemd[1]: Starting RPC Bind...
Oct 14 06:39:45 localhost systemd[1]: Starting Rebuild Journal Catalog...
Oct 14 06:39:45 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Oct 14 06:39:45 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Oct 14 06:39:45 localhost systemd[1]: Finished Rebuild Journal Catalog.
Oct 14 06:39:45 localhost systemd[1]: Started RPC Bind.
Oct 14 06:39:45 localhost augenrules[730]: /sbin/augenrules: No change
Oct 14 06:39:45 localhost augenrules[740]: No rules
Oct 14 06:39:45 localhost augenrules[740]: enabled 1
Oct 14 06:39:45 localhost augenrules[740]: failure 1
Oct 14 06:39:45 localhost augenrules[740]: pid 725
Oct 14 06:39:45 localhost augenrules[740]: rate_limit 0
Oct 14 06:39:45 localhost augenrules[740]: backlog_limit 8192
Oct 14 06:39:45 localhost augenrules[740]: lost 0
Oct 14 06:39:45 localhost augenrules[740]: backlog 0
Oct 14 06:39:45 localhost augenrules[740]: backlog_wait_time 60000
Oct 14 06:39:45 localhost augenrules[740]: backlog_wait_time_actual 0
Oct 14 06:39:45 localhost augenrules[740]: enabled 1
Oct 14 06:39:45 localhost augenrules[740]: failure 1
Oct 14 06:39:45 localhost augenrules[740]: pid 725
Oct 14 06:39:45 localhost augenrules[740]: rate_limit 0
Oct 14 06:39:45 localhost augenrules[740]: backlog_limit 8192
Oct 14 06:39:45 localhost augenrules[740]: lost 0
Oct 14 06:39:45 localhost augenrules[740]: backlog 3
Oct 14 06:39:45 localhost augenrules[740]: backlog_wait_time 60000
Oct 14 06:39:45 localhost augenrules[740]: backlog_wait_time_actual 0
Oct 14 06:39:45 localhost augenrules[740]: enabled 1
Oct 14 06:39:45 localhost augenrules[740]: failure 1
Oct 14 06:39:45 localhost augenrules[740]: pid 725
Oct 14 06:39:45 localhost augenrules[740]: rate_limit 0
Oct 14 06:39:45 localhost augenrules[740]: backlog_limit 8192
Oct 14 06:39:45 localhost augenrules[740]: lost 0
Oct 14 06:39:45 localhost augenrules[740]: backlog 2
Oct 14 06:39:45 localhost augenrules[740]: backlog_wait_time 60000
Oct 14 06:39:45 localhost augenrules[740]: backlog_wait_time_actual 0
Oct 14 06:39:45 localhost systemd[1]: Started Security Auditing Service.
Oct 14 06:39:45 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 14 06:39:45 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 14 06:39:45 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 14 06:39:45 localhost systemd[1]: Starting Update is Completed...
Oct 14 06:39:45 localhost systemd[1]: Finished Update is Completed.
Oct 14 06:39:45 localhost systemd[1]: Reached target System Initialization.
Oct 14 06:39:45 localhost systemd[1]: Started dnf makecache --timer.
Oct 14 06:39:45 localhost systemd[1]: Started Daily rotation of log files.
Oct 14 06:39:45 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 14 06:39:45 localhost systemd[1]: Reached target Timer Units.
Oct 14 06:39:45 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 14 06:39:45 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 14 06:39:45 localhost systemd[1]: Reached target Socket Units.
Oct 14 06:39:45 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Oct 14 06:39:45 localhost systemd[1]: Starting D-Bus System Message Bus...
Oct 14 06:39:45 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 14 06:39:45 localhost systemd[1]: Started D-Bus System Message Bus.
Oct 14 06:39:45 localhost systemd[1]: Reached target Basic System.
Oct 14 06:39:45 localhost systemd[1]: Starting NTP client/server...
Oct 14 06:39:45 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 14 06:39:45 localhost dbus-broker-lau[750]: Ready
Oct 14 06:39:45 localhost systemd[1]: Started irqbalance daemon.
Oct 14 06:39:45 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 14 06:39:45 localhost systemd[1]: Starting System Logging Service...
Oct 14 06:39:45 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 06:39:45 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 06:39:45 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 06:39:45 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 14 06:39:45 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 14 06:39:45 localhost systemd[1]: Reached target User and Group Name Lookups.
Oct 14 06:39:45 localhost systemd[1]: Starting User Login Management...
Oct 14 06:39:45 localhost rsyslogd[758]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="758" x-info="https://www.rsyslog.com"] start
Oct 14 06:39:45 localhost rsyslogd[758]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Oct 14 06:39:45 localhost systemd[1]: Started System Logging Service.
Oct 14 06:39:45 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 14 06:39:45 localhost chronyd[765]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 14 06:39:45 localhost chronyd[765]: Using right/UTC timezone to obtain leap second data
Oct 14 06:39:45 localhost chronyd[765]: Loaded seccomp filter (level 2)
Oct 14 06:39:45 localhost systemd[1]: Started NTP client/server.
Oct 14 06:39:45 localhost systemd-logind[759]: New seat seat0.
Oct 14 06:39:45 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 14 06:39:45 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 14 06:39:45 localhost systemd[1]: Started User Login Management.
Oct 14 06:39:45 localhost rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 06:39:46 localhost cloud-init[769]: Cloud-init v. 22.1-9.el9 running 'init-local' at Tue, 14 Oct 2025 06:39:46 +0000. Up 6.43 seconds.
Oct 14 06:39:46 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Oct 14 06:39:46 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Oct 14 06:39:46 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp95q902fp.mount: Deactivated successfully.
Oct 14 06:39:46 localhost systemd[1]: Starting Hostname Service...
Oct 14 06:39:46 localhost systemd[1]: Started Hostname Service.
Oct 14 06:39:46 np0005486759.novalocal systemd-hostnamed[783]: Hostname set to <np0005486759.novalocal> (static)
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Reached target Preparation for Network.
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Starting Network Manager...
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8070] NetworkManager (version 1.42.2-1.el9) is starting... (boot:ada6beb4-a7f5-4749-918d-0eab64543903)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8076] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Started Network Manager.
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8120] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Reached target Network.
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8218] manager[0x564c33e80020]: monitoring kernel firmware directory '/lib/firmware'.
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8271] hostname: hostname: using hostnamed
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8271] hostname: static hostname changed from (none) to "np0005486759.novalocal"
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8284] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Reached target NFS client services.
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8489] manager[0x564c33e80020]: rfkill: Wi-Fi hardware radio set enabled
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8490] manager[0x564c33e80020]: rfkill: WWAN hardware radio set enabled
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Reached target Remote File Systems.
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8592] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8593] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8605] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8606] manager: Networking is enabled by state file
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8652] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8653] settings: Loaded settings plugin: keyfile (internal)
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8705] dhcp: init: Using DHCP client 'internal'
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8712] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8742] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8759] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8778] device (lo): Activation: starting connection 'lo' (c18decb1-3b25-4285-a29a-63d4d1c4034a)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8796] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8807] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8859] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8878] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8881] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8883] device (eth0): carrier: link connected
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8887] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8894] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8923] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8927] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8928] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8931] manager: NetworkManager state is now CONNECTING
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8932] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8940] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8943] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8950] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8959] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.8964] device (lo): Activation: successful, device activated.
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9000] dhcp4 (eth0): state changed new lease, address=38.102.83.246
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9003] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9032] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9059] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9060] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9065] manager: NetworkManager state is now CONNECTED_SITE
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9067] device (eth0): Activation: successful, device activated.
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9079] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 14 06:39:46 np0005486759.novalocal NetworkManager[788]: <info>  [1760423986.9086] manager: startup complete
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 14 06:39:46 np0005486759.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: Cloud-init v. 22.1-9.el9 running 'init' at Tue, 14 Oct 2025 06:39:47 +0000. Up 7.32 seconds.
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |  eth0  | True |        38.102.83.246         | 255.255.255.0 | global | fa:16:3e:ba:52:20 |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |  eth0  | True | fe80::f816:3eff:feba:5220/64 |       .       |  link  | fa:16:3e:ba:52:20 |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct 14 06:39:47 np0005486759.novalocal cloud-init[893]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 14 06:39:47 np0005486759.novalocal systemd[1]: Starting Authorization Manager...
Oct 14 06:39:47 np0005486759.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Oct 14 06:39:47 np0005486759.novalocal polkitd[1035]: Started polkitd version 0.117
Oct 14 06:39:47 np0005486759.novalocal polkitd[1035]: Loading rules from directory /etc/polkit-1/rules.d
Oct 14 06:39:47 np0005486759.novalocal polkitd[1035]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 14 06:39:47 np0005486759.novalocal polkitd[1035]: Finished loading, compiling and executing 4 rules
Oct 14 06:39:47 np0005486759.novalocal systemd[1]: Started Authorization Manager.
Oct 14 06:39:47 np0005486759.novalocal polkitd[1035]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Oct 14 06:39:49 np0005486759.novalocal useradd[1118]: new group: name=cloud-user, GID=1001
Oct 14 06:39:49 np0005486759.novalocal useradd[1118]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Oct 14 06:39:49 np0005486759.novalocal useradd[1118]: add 'cloud-user' to group 'adm'
Oct 14 06:39:49 np0005486759.novalocal useradd[1118]: add 'cloud-user' to group 'systemd-journal'
Oct 14 06:39:49 np0005486759.novalocal useradd[1118]: add 'cloud-user' to shadow group 'adm'
Oct 14 06:39:49 np0005486759.novalocal useradd[1118]: add 'cloud-user' to shadow group 'systemd-journal'
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Generating public/private rsa key pair.
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: The key fingerprint is:
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: SHA256:FwJiSnjJaTHUNRxztjF65AL/VoLilOrN+jnRPBFOTqw root@np0005486759.novalocal
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: The key's randomart image is:
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: +---[RSA 3072]----+
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: | +==++*.*        |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |..*+.=*@ +       |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: | o. +*=.* o      |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |   +E.++ + .     |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |  . .o .S .      |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: | . o. +. .       |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |  . o. .         |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |   ...           |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |  ..o.           |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: +----[SHA256]-----+
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Generating public/private ecdsa key pair.
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: The key fingerprint is:
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: SHA256:rauwJP2fLs4QrR7qW00rRMugt8KOpYiMCB0U10e+K44 root@np0005486759.novalocal
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: The key's randomart image is:
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: +---[ECDSA 256]---+
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |  . .. ..        |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |   o  ...        |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |  o .  ..        |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: | o + o   o       |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |. o = o S .      |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |.o = = . o       |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |o.= X + o        |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |X+ * @.. o       |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |B++.E.*==        |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: +----[SHA256]-----+
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Generating public/private ed25519 key pair.
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: The key fingerprint is:
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: SHA256:2Nj1J+/uefwL8G0JBnfIe52raspZApsqmS2BjAmrv6Q root@np0005486759.novalocal
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: The key's randomart image is:
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: +--[ED25519 256]--+
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |                 |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |            . .  |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |          .. + . |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |.      = . .o o o|
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |+o.   o.S  .o+.o.|
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |+o .    +   +++ o|
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |. . =  o . . o.* |
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |.o = .. . +. .+.o|
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: |E.o.o.   +o..+=o+|
Oct 14 06:39:50 np0005486759.novalocal cloud-init[893]: +----[SHA256]-----+
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Reached target Cloud-config availability.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Reached target Network is Online.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Starting Crash recovery kernel arming...
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Starting OpenSSH server daemon...
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Starting Permit User Sessions...
Oct 14 06:39:50 np0005486759.novalocal sm-notify[1131]: Version 2.5.4 starting
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Started Notify NFS peers of a restart.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Finished Permit User Sessions.
Oct 14 06:39:50 np0005486759.novalocal sshd[1132]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Started Command Scheduler.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Started Getty on tty1.
Oct 14 06:39:50 np0005486759.novalocal sshd[1132]: Server listening on 0.0.0.0 port 22.
Oct 14 06:39:50 np0005486759.novalocal sshd[1132]: Server listening on :: port 22.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Started Serial Getty on ttyS0.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Reached target Login Prompts.
Oct 14 06:39:50 np0005486759.novalocal crond[1135]: (CRON) STARTUP (1.5.7)
Oct 14 06:39:50 np0005486759.novalocal crond[1135]: (CRON) INFO (Syslog will be used instead of sendmail.)
Oct 14 06:39:50 np0005486759.novalocal crond[1135]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 3% if used.)
Oct 14 06:39:50 np0005486759.novalocal crond[1135]: (CRON) INFO (running with inotify support)
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Started OpenSSH server daemon.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Reached target Multi-User System.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 14 06:39:50 np0005486759.novalocal kdumpctl[1136]: kdump: No kdump initial ramdisk found.
Oct 14 06:39:50 np0005486759.novalocal kdumpctl[1136]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Oct 14 06:39:50 np0005486759.novalocal cloud-init[1233]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Tue, 14 Oct 2025 06:39:50 +0000. Up 10.95 seconds.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Oct 14 06:39:50 np0005486759.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Oct 14 06:39:51 np0005486759.novalocal sshd[1412]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal sshd[1412]: Connection reset by 38.102.83.114 port 39286 [preauth]
Oct 14 06:39:51 np0005486759.novalocal dracut[1422]: dracut-057-21.git20230214.el9
Oct 14 06:39:51 np0005486759.novalocal sshd[1420]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal sshd[1420]: Unable to negotiate with 38.102.83.114 port 39294: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Oct 14 06:39:51 np0005486759.novalocal sshd[1427]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1434]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Tue, 14 Oct 2025 06:39:51 +0000. Up 11.33 seconds.
Oct 14 06:39:51 np0005486759.novalocal sshd[1427]: Connection reset by 38.102.83.114 port 39296 [preauth]
Oct 14 06:39:51 np0005486759.novalocal sshd[1442]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal sshd[1442]: Unable to negotiate with 38.102.83.114 port 39310: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Oct 14 06:39:51 np0005486759.novalocal sshd[1444]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal sshd[1444]: Unable to negotiate with 38.102.83.114 port 39318: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Oct 14 06:39:51 np0005486759.novalocal sshd[1450]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Oct 14 06:39:51 np0005486759.novalocal sshd[1450]: Connection reset by 38.102.83.114 port 39332 [preauth]
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1466]: #############################################################
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1470]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1476]: 256 SHA256:rauwJP2fLs4QrR7qW00rRMugt8KOpYiMCB0U10e+K44 root@np0005486759.novalocal (ECDSA)
Oct 14 06:39:51 np0005486759.novalocal sshd[1469]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1483]: 256 SHA256:2Nj1J+/uefwL8G0JBnfIe52raspZApsqmS2BjAmrv6Q root@np0005486759.novalocal (ED25519)
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1491]: 3072 SHA256:FwJiSnjJaTHUNRxztjF65AL/VoLilOrN+jnRPBFOTqw root@np0005486759.novalocal (RSA)
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1493]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1497]: #############################################################
Oct 14 06:39:51 np0005486759.novalocal sshd[1498]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal sshd[1498]: fatal: mm_answer_sign: sign: error in libcrypto
Oct 14 06:39:51 np0005486759.novalocal sshd[1521]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:39:51 np0005486759.novalocal sshd[1521]: Unable to negotiate with 38.102.83.114 port 39348: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Oct 14 06:39:51 np0005486759.novalocal cloud-init[1434]: Cloud-init v. 22.1-9.el9 finished at Tue, 14 Oct 2025 06:39:51 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.59 seconds
Oct 14 06:39:51 np0005486759.novalocal sshd[1469]: Connection closed by 38.102.83.114 port 39336 [preauth]
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Oct 14 06:39:51 np0005486759.novalocal systemd[1]: Reloading Network Manager...
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Oct 14 06:39:51 np0005486759.novalocal NetworkManager[788]: <info>  [1760423991.5064] audit: op="reload" arg="0" pid=1604 uid=0 result="success"
Oct 14 06:39:51 np0005486759.novalocal NetworkManager[788]: <info>  [1760423991.5073] config: signal: SIGHUP (no changes from disk)
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Oct 14 06:39:51 np0005486759.novalocal systemd[1]: Reloaded Network Manager.
Oct 14 06:39:51 np0005486759.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Oct 14 06:39:51 np0005486759.novalocal systemd[1]: Reached target Cloud-init target.
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Oct 14 06:39:51 np0005486759.novalocal chronyd[765]: Selected source 162.159.200.123 (2.rhel.pool.ntp.org)
Oct 14 06:39:51 np0005486759.novalocal chronyd[765]: System clock TAI offset set to 37 seconds
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: memstrack is not available
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Oct 14 06:39:51 np0005486759.novalocal dracut[1424]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: memstrack is not available
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: *** Including module: systemd ***
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: *** Including module: systemd-initrd ***
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: *** Including module: i18n ***
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: No KEYMAP configured.
Oct 14 06:39:52 np0005486759.novalocal dracut[1424]: *** Including module: drm ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: prefixdevname ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: kernel-modules ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: kernel-modules-extra ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: qemu ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: fstab-sys ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: rootfs-block ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: terminfo ***
Oct 14 06:39:53 np0005486759.novalocal dracut[1424]: *** Including module: udev-rules ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: Skipping udev rule: 91-permissions.rules
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: Skipping udev rule: 80-drivers-modprobe.rules
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: *** Including module: virtiofs ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: *** Including module: dracut-systemd ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: *** Including module: usrmount ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: *** Including module: base ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: *** Including module: fs-lib ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: *** Including module: kdumpbase ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]: *** Including module: microcode_ctl-fw_dir_override ***
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:   microcode_ctl module: mangling fw_dir
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel" is ignored
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Oct 14 06:39:54 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]: *** Including module: shutdown ***
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]: *** Including module: squash ***
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]: *** Including modules done ***
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]: *** Installing kernel module dependencies ***
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]: *** Installing kernel module dependencies done ***
Oct 14 06:39:55 np0005486759.novalocal dracut[1424]: *** Resolving executable dependencies ***
Oct 14 06:39:56 np0005486759.novalocal dracut[1424]: *** Resolving executable dependencies done ***
Oct 14 06:39:56 np0005486759.novalocal dracut[1424]: *** Hardlinking files ***
Oct 14 06:39:56 np0005486759.novalocal dracut[1424]: Mode:           real
Oct 14 06:39:56 np0005486759.novalocal dracut[1424]: Files:          1099
Oct 14 06:39:56 np0005486759.novalocal dracut[1424]: Linked:         3 files
Oct 14 06:39:56 np0005486759.novalocal dracut[1424]: Compared:       0 xattrs
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: Compared:       373 files
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: Saved:          61.04 KiB
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: Duration:       0.039774 seconds
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: *** Hardlinking files done ***
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: Could not find 'strip'. Not stripping the initramfs.
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: *** Generating early-microcode cpio image ***
Oct 14 06:39:57 np0005486759.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: *** Constructing AuthenticAMD.bin ***
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: *** Store current command line parameters ***
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: Stored kernel commandline:
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: No dracut internal kernel commandline stored in the initramfs
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: *** Install squash loader ***
Oct 14 06:39:57 np0005486759.novalocal dracut[1424]: *** Squashing the files inside the initramfs ***
Oct 14 06:39:58 np0005486759.novalocal dracut[1424]: *** Squashing the files inside the initramfs done ***
Oct 14 06:39:58 np0005486759.novalocal dracut[1424]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Oct 14 06:39:59 np0005486759.novalocal dracut[1424]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Oct 14 06:39:59 np0005486759.novalocal kdumpctl[1136]: kdump: kexec: loaded kdump kernel
Oct 14 06:39:59 np0005486759.novalocal kdumpctl[1136]: kdump: Starting kdump: [OK]
Oct 14 06:39:59 np0005486759.novalocal systemd[1]: Finished Crash recovery kernel arming.
Oct 14 06:39:59 np0005486759.novalocal systemd[1]: Startup finished in 1.227s (kernel) + 2.120s (initrd) + 16.390s (userspace) = 19.737s.
Oct 14 06:40:03 np0005486759.novalocal sshd[4173]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:40:03 np0005486759.novalocal sshd[4173]: Accepted publickey for zuul from 38.102.83.114 port 38354 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Oct 14 06:40:03 np0005486759.novalocal systemd[1]: Created slice User Slice of UID 1000.
Oct 14 06:40:03 np0005486759.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 14 06:40:03 np0005486759.novalocal systemd-logind[759]: New session 1 of user zuul.
Oct 14 06:40:03 np0005486759.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 14 06:40:03 np0005486759.novalocal systemd[1]: Starting User Manager for UID 1000...
Oct 14 06:40:03 np0005486759.novalocal systemd[4177]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Queued start job for default target Main User Target.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Created slice User Application Slice.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Reached target Paths.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Reached target Timers.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Starting D-Bus User Message Bus Socket...
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Starting Create User's Volatile Files and Directories...
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Finished Create User's Volatile Files and Directories.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Listening on D-Bus User Message Bus Socket.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Reached target Sockets.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Reached target Basic System.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Reached target Main User Target.
Oct 14 06:40:04 np0005486759.novalocal systemd[4177]: Startup finished in 167ms.
Oct 14 06:40:04 np0005486759.novalocal systemd[1]: Started User Manager for UID 1000.
Oct 14 06:40:04 np0005486759.novalocal systemd[1]: Started Session 1 of User zuul.
Oct 14 06:40:04 np0005486759.novalocal sshd[4173]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:40:04 np0005486759.novalocal python3[4229]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 06:40:12 np0005486759.novalocal python3[4247]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 06:40:16 np0005486759.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 14 06:40:19 np0005486759.novalocal python3[4302]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 06:40:20 np0005486759.novalocal python3[4332]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 14 06:40:23 np0005486759.novalocal python3[4348]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpCkKHtdAJvqUoWfry6wT9BiEt8oegJcZFI/9galMv8ZYmo/NBcS3vjEuF9385qAETdPLU+rGztzEvbgOXTGalOiMOoN+F7ELwARQwPYS2b6JDoalDqgTJD2+XWrLKXsBBc4d7YOy0D+cJQ+YvlxXj73YP/7+B/cwxaWftnlTUXfyLIH79jw7oqPg1EpUSVIbSmItL2s/1CNxeNHq6AeV04V+vyKgfzdbglEGmnDHnNMnJYbkoYZs0GcsOCkKZV5fht0OYKRAfYo2a/CuQrfpt2iBcPznSWUllp59WlSF3mtiL9taksr5HpRpvMv9e5Rg1dYebt+6vi2OPhqCD/rqcYfmfhceMZ9qMpS6ffDt5NpHT7rvn0vBtHqb6PxQng5BvynCqAE8WGLej9EhoXfu7xiTuOWvdrrSynaQIM4JhvTCCBJmWHCoHV+70bsoqNNEd3ciEKNYqLWuCMksS9F9LTSoOpBhX4gYl+VaFGdH/WTKe0Ae2uUq0Cz/GmuiFVtE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:23 np0005486759.novalocal python3[4362]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:24 np0005486759.novalocal python3[4421]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:40:24 np0005486759.novalocal python3[4462]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760424024.2759173-295-235873403081045/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=a74a7f2bbf8048dabf25acf29852887c_id_rsa follow=False checksum=9d8b8aca6ea7b20d043575148e7d25c9f08a33d8 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:26 np0005486759.novalocal python3[4536]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:40:26 np0005486759.novalocal python3[4577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760424025.8519804-372-24329299843351/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=a74a7f2bbf8048dabf25acf29852887c_id_rsa.pub follow=False checksum=e1ee6df319d3dc955b7c91f7210e7850de247fd5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:28 np0005486759.novalocal python3[4605]: ansible-ping Invoked with data=pong
Oct 14 06:40:30 np0005486759.novalocal python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 06:40:33 np0005486759.novalocal python3[4672]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 14 06:40:35 np0005486759.novalocal python3[4694]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:35 np0005486759.novalocal python3[4708]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:36 np0005486759.novalocal python3[4722]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:37 np0005486759.novalocal python3[4736]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:37 np0005486759.novalocal python3[4750]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:37 np0005486759.novalocal python3[4764]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:40 np0005486759.novalocal sudo[4778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkzjvyxniejmxcjswwjsaghlykwgsaxv ; /usr/bin/python3
Oct 14 06:40:40 np0005486759.novalocal sudo[4778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:40:40 np0005486759.novalocal python3[4780]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:40 np0005486759.novalocal sudo[4778]: pam_unix(sudo:session): session closed for user root
Oct 14 06:40:41 np0005486759.novalocal sudo[4826]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bohyuixtyzdgqtzaovzmycnovdfdewdd ; /usr/bin/python3
Oct 14 06:40:41 np0005486759.novalocal sudo[4826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:40:41 np0005486759.novalocal python3[4828]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:40:41 np0005486759.novalocal sudo[4826]: pam_unix(sudo:session): session closed for user root
Oct 14 06:40:41 np0005486759.novalocal sudo[4869]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdlszdmwroxwiueloxdupvonahmicaaa ; /usr/bin/python3
Oct 14 06:40:41 np0005486759.novalocal sudo[4869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:40:41 np0005486759.novalocal python3[4871]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760424040.7414577-41-214336246836614/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:41 np0005486759.novalocal sudo[4869]: pam_unix(sudo:session): session closed for user root
Oct 14 06:40:43 np0005486759.novalocal python3[4899]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:43 np0005486759.novalocal python3[4913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:43 np0005486759.novalocal python3[4927]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:43 np0005486759.novalocal python3[4941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:44 np0005486759.novalocal python3[4955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:44 np0005486759.novalocal python3[4969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:44 np0005486759.novalocal python3[4983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:44 np0005486759.novalocal python3[4997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:45 np0005486759.novalocal python3[5011]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:45 np0005486759.novalocal python3[5025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:45 np0005486759.novalocal python3[5039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:45 np0005486759.novalocal python3[5053]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:46 np0005486759.novalocal python3[5067]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:46 np0005486759.novalocal python3[5081]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:46 np0005486759.novalocal python3[5095]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:47 np0005486759.novalocal python3[5109]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:47 np0005486759.novalocal python3[5123]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:47 np0005486759.novalocal python3[5137]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:47 np0005486759.novalocal python3[5151]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:48 np0005486759.novalocal python3[5165]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:48 np0005486759.novalocal python3[5179]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:48 np0005486759.novalocal python3[5193]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:48 np0005486759.novalocal python3[5207]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:49 np0005486759.novalocal python3[5221]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:49 np0005486759.novalocal python3[5235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:49 np0005486759.novalocal python3[5249]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:40:56 np0005486759.novalocal sudo[5263]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvrcyvdenwubcybluuerzrcvpvemdhto ; /usr/bin/python3
Oct 14 06:40:56 np0005486759.novalocal sudo[5263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:40:57 np0005486759.novalocal python3[5265]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 14 06:40:57 np0005486759.novalocal systemd[1]: Starting Time & Date Service...
Oct 14 06:40:57 np0005486759.novalocal systemd[1]: Started Time & Date Service.
Oct 14 06:40:57 np0005486759.novalocal systemd-timedated[5267]: Changed time zone to 'UTC' (UTC).
Oct 14 06:40:57 np0005486759.novalocal sudo[5263]: pam_unix(sudo:session): session closed for user root
Oct 14 06:40:57 np0005486759.novalocal sudo[5284]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpgnsdgpbgxfwxfjmrgsggbnymonlpaw ; /usr/bin/python3
Oct 14 06:40:57 np0005486759.novalocal sudo[5284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:40:57 np0005486759.novalocal python3[5286]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:40:57 np0005486759.novalocal sudo[5284]: pam_unix(sudo:session): session closed for user root
Oct 14 06:40:58 np0005486759.novalocal python3[5332]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:40:58 np0005486759.novalocal python3[5373]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1760424058.3655627-349-10310971737391/source _original_basename=tmp6qw74fyr follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:41:00 np0005486759.novalocal python3[5433]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:41:00 np0005486759.novalocal python3[5474]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760424059.7996488-419-234738602246239/source _original_basename=tmpp_qui_5d follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:41:01 np0005486759.novalocal sudo[5534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpsxzuxeqrxucmdfjqoqkriziuagnfnq ; /usr/bin/python3
Oct 14 06:41:01 np0005486759.novalocal sudo[5534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:41:01 np0005486759.novalocal python3[5536]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:41:01 np0005486759.novalocal sudo[5534]: pam_unix(sudo:session): session closed for user root
Oct 14 06:41:02 np0005486759.novalocal sudo[5577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yztpncmsifkcbovhjgkhdrbswosbbncm ; /usr/bin/python3
Oct 14 06:41:02 np0005486759.novalocal sudo[5577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:41:02 np0005486759.novalocal python3[5579]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760424061.6281996-531-129584775580507/source _original_basename=tmphcdwsxle follow=False checksum=c5dc93499645bc89ff1f2366754d95ffd71986c3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:41:02 np0005486759.novalocal sudo[5577]: pam_unix(sudo:session): session closed for user root
Oct 14 06:41:03 np0005486759.novalocal python3[5607]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:41:03 np0005486759.novalocal python3[5623]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:41:04 np0005486759.novalocal sudo[5671]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgusuyaucujchravzhldfbisspygqlek ; /usr/bin/python3
Oct 14 06:41:04 np0005486759.novalocal sudo[5671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:41:04 np0005486759.novalocal python3[5673]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:41:04 np0005486759.novalocal sudo[5671]: pam_unix(sudo:session): session closed for user root
Oct 14 06:41:04 np0005486759.novalocal sudo[5714]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irjyjdllkrvoafedbvhbjpykwbestteh ; /usr/bin/python3
Oct 14 06:41:04 np0005486759.novalocal sudo[5714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:41:05 np0005486759.novalocal python3[5716]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1760424064.4347982-629-227234392552477/source _original_basename=tmpy317f_cx follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:41:05 np0005486759.novalocal sudo[5714]: pam_unix(sudo:session): session closed for user root
Oct 14 06:41:06 np0005486759.novalocal sudo[5746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqyzgmkocorchisqfkhwvnourwrdspsk ; /usr/bin/python3
Oct 14 06:41:06 np0005486759.novalocal sudo[5746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:41:06 np0005486759.novalocal python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-9572-4a8b-000000000021-1-cell1compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:41:06 np0005486759.novalocal sudo[5746]: pam_unix(sudo:session): session closed for user root
Oct 14 06:41:07 np0005486759.novalocal python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-9572-4a8b-000000000022-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 14 06:41:09 np0005486759.novalocal python3[5784]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:41:27 np0005486759.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 14 06:41:27 np0005486759.novalocal sudo[5800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpdnqfunxptsdijobsfvnwdkapbltykb ; /usr/bin/python3
Oct 14 06:41:27 np0005486759.novalocal sudo[5800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:41:27 np0005486759.novalocal python3[5802]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:41:27 np0005486759.novalocal sudo[5800]: pam_unix(sudo:session): session closed for user root
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Oct 14 06:42:12 np0005486759.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Oct 14 06:42:12 np0005486759.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6257] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 14 06:42:12 np0005486759.novalocal systemd-udevd[5805]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6383] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Oct 14 06:42:12 np0005486759.novalocal systemd[4177]: Starting Mark boot as successful...
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6418] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6424] device (eth1): carrier: link connected
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6427] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6434] policy: auto-activating connection 'Wired connection 1' (cb03212c-1a47-35fa-a5d1-0c508f511e8f)
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6441] device (eth1): Activation: starting connection 'Wired connection 1' (cb03212c-1a47-35fa-a5d1-0c508f511e8f)
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6443] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6448] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6455] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Oct 14 06:42:12 np0005486759.novalocal NetworkManager[788]: <info>  [1760424132.6460] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 14 06:42:12 np0005486759.novalocal systemd[4177]: Finished Mark boot as successful.
Oct 14 06:42:13 np0005486759.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Oct 14 06:42:13 np0005486759.novalocal python3[5821]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-af62-f6de-000000000154-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:42:26 np0005486759.novalocal sudo[5870]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzlpzeyabgvjweyaeqyfqhlwtntgrsaq ; OS_CLOUD=vexxhost /usr/bin/python3
Oct 14 06:42:26 np0005486759.novalocal sudo[5870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:42:26 np0005486759.novalocal python3[5872]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:42:26 np0005486759.novalocal sudo[5870]: pam_unix(sudo:session): session closed for user root
Oct 14 06:42:27 np0005486759.novalocal sudo[5913]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvrkqihrotfvyjhwchajnitpociutuxg ; OS_CLOUD=vexxhost /usr/bin/python3
Oct 14 06:42:27 np0005486759.novalocal sudo[5913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:42:27 np0005486759.novalocal python3[5915]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760424146.647488-106-135904707429022/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=e921dbe09f8bb7926410da84d6d0427d42e8370b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:42:27 np0005486759.novalocal sudo[5913]: pam_unix(sudo:session): session closed for user root
Oct 14 06:42:27 np0005486759.novalocal sudo[5943]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vblpugudryrkxyvwindddhismvbvkpzz ; OS_CLOUD=vexxhost /usr/bin/python3
Oct 14 06:42:27 np0005486759.novalocal sudo[5943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:42:28 np0005486759.novalocal python3[5945]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Stopped Network Manager Wait Online.
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Stopping Network Manager Wait Online...
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Stopping Network Manager...
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1031] caught SIGTERM, shutting down normally.
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1130] dhcp4 (eth0): canceled DHCP transaction
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1130] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1130] dhcp4 (eth0): state changed no lease
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1134] manager: NetworkManager state is now CONNECTING
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1229] dhcp4 (eth1): canceled DHCP transaction
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1230] dhcp4 (eth1): state changed no lease
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[788]: <info>  [1760424149.1322] exiting (success)
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Stopped Network Manager.
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Starting Network Manager...
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.1917] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:ada6beb4-a7f5-4749-918d-0eab64543903)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.1920] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Started Network Manager.
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.1946] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.1979] manager[0x5579b5386090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Starting Hostname Service...
Oct 14 06:42:29 np0005486759.novalocal sudo[5943]: pam_unix(sudo:session): session closed for user root
Oct 14 06:42:29 np0005486759.novalocal systemd[1]: Started Hostname Service.
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2863] hostname: hostname: using hostnamed
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2864] hostname: static hostname changed from (none) to "np0005486759.novalocal"
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2868] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2872] manager[0x5579b5386090]: rfkill: Wi-Fi hardware radio set enabled
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2872] manager[0x5579b5386090]: rfkill: WWAN hardware radio set enabled
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2898] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2899] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2899] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2900] manager: Networking is enabled by state file
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2905] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2905] settings: Loaded settings plugin: keyfile (internal)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2933] dhcp: init: Using DHCP client 'internal'
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2938] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2943] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2947] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2954] device (lo): Activation: starting connection 'lo' (c18decb1-3b25-4285-a29a-63d4d1c4034a)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2959] device (eth0): carrier: link connected
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2962] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2966] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2967] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2971] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2976] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2981] device (eth1): carrier: link connected
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2984] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2988] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (cb03212c-1a47-35fa-a5d1-0c508f511e8f) (indicated)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2989] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2993] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.2998] device (eth1): Activation: starting connection 'Wired connection 1' (cb03212c-1a47-35fa-a5d1-0c508f511e8f)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3013] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3015] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3017] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3019] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3022] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3023] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3025] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3027] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3031] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3033] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3040] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3042] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3072] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3076] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3079] device (lo): Activation: successful, device activated.
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3113] dhcp4 (eth0): state changed new lease, address=38.102.83.246
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3118] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3186] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3202] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3205] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3209] manager: NetworkManager state is now CONNECTED_SITE
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3212] device (eth0): Activation: successful, device activated.
Oct 14 06:42:29 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424149.3221] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 14 06:42:29 np0005486759.novalocal python3[6016]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-af62-f6de-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:42:39 np0005486759.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 06:42:59 np0005486759.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 14 06:43:14 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424194.8054] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Oct 14 06:43:14 np0005486759.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 06:43:14 np0005486759.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 06:43:14 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424194.8292] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Oct 14 06:43:14 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424194.8297] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Oct 14 06:43:14 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424194.8308] device (eth1): Activation: successful, device activated.
Oct 14 06:43:14 np0005486759.novalocal NetworkManager[5960]: <info>  [1760424194.8317] manager: startup complete
Oct 14 06:43:14 np0005486759.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 14 06:43:24 np0005486759.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 06:43:29 np0005486759.novalocal sshd[4186]: Received disconnect from 38.102.83.114 port 38354:11: disconnected by user
Oct 14 06:43:29 np0005486759.novalocal sshd[4186]: Disconnected from user zuul 38.102.83.114 port 38354
Oct 14 06:43:29 np0005486759.novalocal sshd[4173]: pam_unix(sshd:session): session closed for user zuul
Oct 14 06:43:29 np0005486759.novalocal systemd-logind[759]: Session 1 logged out. Waiting for processes to exit.
Oct 14 06:45:07 np0005486759.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Oct 14 06:45:07 np0005486759.novalocal systemd[1]: efi.mount: Deactivated successfully.
Oct 14 06:45:07 np0005486759.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Oct 14 06:45:07 np0005486759.novalocal systemd[4177]: Created slice User Background Tasks Slice.
Oct 14 06:45:07 np0005486759.novalocal systemd[4177]: Starting Cleanup of User's Temporary Files and Directories...
Oct 14 06:45:07 np0005486759.novalocal systemd[4177]: Finished Cleanup of User's Temporary Files and Directories.
Oct 14 06:46:08 np0005486759.novalocal sshd[6050]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:46:08 np0005486759.novalocal sshd[6050]: Accepted publickey for zuul from 38.102.83.114 port 58368 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 06:46:08 np0005486759.novalocal systemd-logind[759]: New session 3 of user zuul.
Oct 14 06:46:08 np0005486759.novalocal systemd[1]: Started Session 3 of User zuul.
Oct 14 06:46:08 np0005486759.novalocal sshd[6050]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:46:09 np0005486759.novalocal sudo[6099]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oomokmosxgizprrcwwyxiywzntlgerrb ; OS_CLOUD=vexxhost /usr/bin/python3
Oct 14 06:46:09 np0005486759.novalocal sudo[6099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:46:09 np0005486759.novalocal python3[6101]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:46:09 np0005486759.novalocal sudo[6099]: pam_unix(sudo:session): session closed for user root
Oct 14 06:46:09 np0005486759.novalocal sudo[6142]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhzuonyyvcuerfclfbimibmcfwfvpzgu ; OS_CLOUD=vexxhost /usr/bin/python3
Oct 14 06:46:09 np0005486759.novalocal sudo[6142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:46:09 np0005486759.novalocal python3[6144]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760424368.9389796-519-144525378323310/source _original_basename=tmpzoqwcm9w follow=False checksum=d56738cadd6f2b0e2c4bccb70ee9d0b698856e39 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:46:09 np0005486759.novalocal sudo[6142]: pam_unix(sudo:session): session closed for user root
Oct 14 06:46:15 np0005486759.novalocal sshd[6050]: pam_unix(sshd:session): session closed for user zuul
Oct 14 06:46:15 np0005486759.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Oct 14 06:46:15 np0005486759.novalocal systemd-logind[759]: Session 3 logged out. Waiting for processes to exit.
Oct 14 06:46:15 np0005486759.novalocal systemd-logind[759]: Removed session 3.
Oct 14 06:48:17 np0005486759.novalocal sshd[6161]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:48:19 np0005486759.novalocal sshd[6161]: Connection closed by authenticating user root 45.10.175.77 port 50536 [preauth]
Oct 14 06:52:13 np0005486759.novalocal sshd[6165]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:52:13 np0005486759.novalocal sshd[6165]: Accepted publickey for zuul from 38.102.83.114 port 49994 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 06:52:13 np0005486759.novalocal systemd-logind[759]: New session 4 of user zuul.
Oct 14 06:52:13 np0005486759.novalocal systemd[1]: Started Session 4 of User zuul.
Oct 14 06:52:13 np0005486759.novalocal sshd[6165]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:52:13 np0005486759.novalocal sudo[6182]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scpgfeqwwbzdpbfymnvubfmrrbuqfgut ; /usr/bin/python3
Oct 14 06:52:13 np0005486759.novalocal sudo[6182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:13 np0005486759.novalocal python3[6184]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-43b0-b13c-000000001d12-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:52:13 np0005486759.novalocal sudo[6182]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:14 np0005486759.novalocal sudo[6201]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhxmhnxynhbpikweuajvallfkqsgenjv ; /usr/bin/python3
Oct 14 06:52:14 np0005486759.novalocal sudo[6201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:14 np0005486759.novalocal python3[6203]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:52:14 np0005486759.novalocal sudo[6201]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:15 np0005486759.novalocal sudo[6217]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olsjczcsaayraksrfvlyfedshzlemjah ; /usr/bin/python3
Oct 14 06:52:15 np0005486759.novalocal sudo[6217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:15 np0005486759.novalocal python3[6219]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:52:15 np0005486759.novalocal sudo[6217]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:15 np0005486759.novalocal sudo[6233]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dinpfeiuelpsytbjubmbzzbueczvkygn ; /usr/bin/python3
Oct 14 06:52:15 np0005486759.novalocal sudo[6233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:15 np0005486759.novalocal python3[6235]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:52:15 np0005486759.novalocal sudo[6233]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:15 np0005486759.novalocal sudo[6249]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uenqerdbxitdhiqllfcunhytsfvqcjbj ; /usr/bin/python3
Oct 14 06:52:15 np0005486759.novalocal sudo[6249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:15 np0005486759.novalocal python3[6251]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:52:15 np0005486759.novalocal sudo[6249]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:16 np0005486759.novalocal sudo[6265]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lldxlnoqotzcvmogwxcbwridejnclxjl ; /usr/bin/python3
Oct 14 06:52:16 np0005486759.novalocal sudo[6265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:16 np0005486759.novalocal python3[6267]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:52:16 np0005486759.novalocal python3[6267]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 14 06:52:16 np0005486759.novalocal sudo[6265]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:17 np0005486759.novalocal sudo[6281]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qosmcvlytdpszgqrytqfbereahpwbdbx ; /usr/bin/python3
Oct 14 06:52:17 np0005486759.novalocal sudo[6281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:18 np0005486759.novalocal python3[6283]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 06:52:18 np0005486759.novalocal systemd[1]: Reloading.
Oct 14 06:52:18 np0005486759.novalocal systemd-rc-local-generator[6301]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 06:52:18 np0005486759.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 06:52:18 np0005486759.novalocal sudo[6281]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:19 np0005486759.novalocal sudo[6329]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qisdctumnfeqndximzejyzaexgyjxkhq ; /usr/bin/python3
Oct 14 06:52:19 np0005486759.novalocal sudo[6329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:19 np0005486759.novalocal python3[6331]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct 14 06:52:19 np0005486759.novalocal sudo[6329]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:20 np0005486759.novalocal sudo[6345]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iswazujllanxxgngmjhwouvtpxtiwcsi ; /usr/bin/python3
Oct 14 06:52:20 np0005486759.novalocal sudo[6345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:20 np0005486759.novalocal python3[6347]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:52:20 np0005486759.novalocal sudo[6345]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:20 np0005486759.novalocal sudo[6363]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roxvtonriqyhduthzzospazzxyxmkrva ; /usr/bin/python3
Oct 14 06:52:20 np0005486759.novalocal sudo[6363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:20 np0005486759.novalocal python3[6365]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:52:20 np0005486759.novalocal sudo[6363]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:20 np0005486759.novalocal sudo[6381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vovrvkmepjnvbagnyybfzxvmvspstydk ; /usr/bin/python3
Oct 14 06:52:20 np0005486759.novalocal sudo[6381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:20 np0005486759.novalocal python3[6383]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:52:20 np0005486759.novalocal sudo[6381]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:21 np0005486759.novalocal sudo[6399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffbgavmfhbwtckrrcimkqqfgfmhqbofx ; /usr/bin/python3
Oct 14 06:52:21 np0005486759.novalocal sudo[6399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:52:21 np0005486759.novalocal python3[6401]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:52:21 np0005486759.novalocal sudo[6399]: pam_unix(sudo:session): session closed for user root
Oct 14 06:52:22 np0005486759.novalocal python3[6418]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ef9-e89a-43b0-b13c-000000001d18-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:52:23 np0005486759.novalocal python3[6438]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 06:52:27 np0005486759.novalocal sshd[6165]: pam_unix(sshd:session): session closed for user zuul
Oct 14 06:52:27 np0005486759.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Oct 14 06:52:27 np0005486759.novalocal systemd[1]: session-4.scope: Consumed 3.510s CPU time.
Oct 14 06:52:27 np0005486759.novalocal systemd-logind[759]: Session 4 logged out. Waiting for processes to exit.
Oct 14 06:52:27 np0005486759.novalocal systemd-logind[759]: Removed session 4.
Oct 14 06:53:55 np0005486759.novalocal sshd[6445]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:53:56 np0005486759.novalocal sshd[6445]: Accepted publickey for zuul from 38.102.83.114 port 56480 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 06:53:56 np0005486759.novalocal systemd-logind[759]: New session 5 of user zuul.
Oct 14 06:53:56 np0005486759.novalocal systemd[1]: Started Session 5 of User zuul.
Oct 14 06:53:56 np0005486759.novalocal sshd[6445]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:53:56 np0005486759.novalocal sudo[6462]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yibyztzloexgftqcyymsgpyaadfmjgzu ; /usr/bin/python3
Oct 14 06:53:56 np0005486759.novalocal sudo[6462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:53:56 np0005486759.novalocal systemd[1]: Starting RHSM dbus service...
Oct 14 06:53:57 np0005486759.novalocal systemd[1]: Started RHSM dbus service.
Oct 14 06:53:57 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Oct 14 06:53:57 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Oct 14 06:53:57 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Oct 14 06:53:57 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Oct 14 06:53:58 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005486759.novalocal (00d3e29c-79e6-406a-a1db-b33eef9df3e4)
Oct 14 06:53:58 np0005486759.novalocal subscription-manager[6469]: Registered system with identity: 00d3e29c-79e6-406a-a1db-b33eef9df3e4
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.entcertlib:131] certs updated:
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]: Total updates: 1
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]: Found (local) serial# []
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]: Expected (UEP) serial# [5374646008344713687]
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]: Added (new)
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]:   [sn:5374646008344713687 ( Content Access,) @ /etc/pki/entitlement/5374646008344713687.pem]
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]: Deleted (rogue):
Oct 14 06:53:59 np0005486759.novalocal rhsm-service[6469]:   <NONE>
Oct 14 06:53:59 np0005486759.novalocal subscription-manager[6469]: Added subscription for 'Content Access' contract 'None'
Oct 14 06:53:59 np0005486759.novalocal subscription-manager[6469]: Added subscription for product ' Content Access'
Oct 14 06:54:00 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Oct 14 06:54:00 np0005486759.novalocal rhsm-service[6469]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Oct 14 06:54:00 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 06:54:00 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 06:54:00 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 06:54:00 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 06:54:00 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 06:54:01 np0005486759.novalocal sudo[6462]: pam_unix(sudo:session): session closed for user root
Oct 14 06:54:03 np0005486759.novalocal python3[6560]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-6628-7956-00000000000b-1-cell1compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 06:54:04 np0005486759.novalocal sudo[6577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnfjghcpyggkbhrggmmjcetqxnclstkg ; /usr/bin/python3
Oct 14 06:54:04 np0005486759.novalocal sudo[6577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:54:04 np0005486759.novalocal python3[6579]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 06:54:35 np0005486759.novalocal setsebool[6654]: The virt_use_nfs policy boolean was changed to 1 by root
Oct 14 06:54:35 np0005486759.novalocal setsebool[6654]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  Converting 407 SID table entries...
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 06:54:43 np0005486759.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 06:54:44 np0005486759.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Oct 14 06:54:44 np0005486759.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Oct 14 06:54:44 np0005486759.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 14 06:54:44 np0005486759.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Oct 14 06:54:44 np0005486759.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 14 06:54:56 np0005486759.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 06:54:56 np0005486759.novalocal systemd[1]: Starting man-db-cache-update.service...
Oct 14 06:54:56 np0005486759.novalocal systemd[1]: Reloading.
Oct 14 06:54:56 np0005486759.novalocal systemd-rc-local-generator[7459]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 06:54:56 np0005486759.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 06:54:56 np0005486759.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 06:54:58 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 06:54:58 np0005486759.novalocal sudo[6577]: pam_unix(sudo:session): session closed for user root
Oct 14 06:54:58 np0005486759.novalocal sudo[10078]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdkgwvadzjefxnhiuevcivuuulmlawkx ; /usr/bin/python3
Oct 14 06:54:58 np0005486759.novalocal sudo[10078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:54:59 np0005486759.novalocal podman[10366]: 2025-10-14 06:54:59.202885526 +0000 UTC m=+0.105513068 system refresh
Oct 14 06:54:59 np0005486759.novalocal sudo[10078]: pam_unix(sudo:session): session closed for user root
Oct 14 06:54:59 np0005486759.novalocal systemd[4177]: Starting D-Bus User Message Bus...
Oct 14 06:54:59 np0005486759.novalocal dbus-broker-launch[11988]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 14 06:54:59 np0005486759.novalocal dbus-broker-launch[11988]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 14 06:54:59 np0005486759.novalocal systemd[4177]: Started D-Bus User Message Bus.
Oct 14 06:54:59 np0005486759.novalocal dbus-broker-lau[11988]: Ready
Oct 14 06:54:59 np0005486759.novalocal systemd[4177]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Oct 14 06:54:59 np0005486759.novalocal systemd[4177]: Created slice Slice /user.
Oct 14 06:54:59 np0005486759.novalocal systemd[4177]: podman-11863.scope: unit configures an IP firewall, but not running as root.
Oct 14 06:54:59 np0005486759.novalocal systemd[4177]: (This warning is only shown for the first unit using IP firewalling.)
Oct 14 06:54:59 np0005486759.novalocal systemd[4177]: Started podman-11863.scope.
Oct 14 06:55:00 np0005486759.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 06:55:00 np0005486759.novalocal systemd[4177]: Started podman-pause-6a31d212.scope.
Oct 14 06:55:00 np0005486759.novalocal sshd[6445]: pam_unix(sshd:session): session closed for user zuul
Oct 14 06:55:00 np0005486759.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Oct 14 06:55:00 np0005486759.novalocal systemd[1]: session-5.scope: Consumed 50.630s CPU time.
Oct 14 06:55:00 np0005486759.novalocal systemd-logind[759]: Session 5 logged out. Waiting for processes to exit.
Oct 14 06:55:00 np0005486759.novalocal systemd-logind[759]: Removed session 5.
Oct 14 06:55:05 np0005486759.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 06:55:05 np0005486759.novalocal systemd[1]: Finished man-db-cache-update.service.
Oct 14 06:55:05 np0005486759.novalocal systemd[1]: man-db-cache-update.service: Consumed 9.236s CPU time.
Oct 14 06:55:05 np0005486759.novalocal systemd[1]: run-rfe6ef73fb17e45488d3ebeeae0fe11ba.service: Deactivated successfully.
Oct 14 06:55:15 np0005486759.novalocal sshd[18317]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:55:15 np0005486759.novalocal sshd[18315]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:55:15 np0005486759.novalocal sshd[18314]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:55:15 np0005486759.novalocal sshd[18313]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:55:15 np0005486759.novalocal sshd[18317]: Unable to negotiate with 38.102.83.196 port 42200: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 14 06:55:15 np0005486759.novalocal sshd[18316]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:55:15 np0005486759.novalocal sshd[18315]: Connection closed by 38.102.83.196 port 42180 [preauth]
Oct 14 06:55:15 np0005486759.novalocal sshd[18316]: Connection closed by 38.102.83.196 port 42188 [preauth]
Oct 14 06:55:15 np0005486759.novalocal sshd[18314]: Unable to negotiate with 38.102.83.196 port 42208: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 14 06:55:15 np0005486759.novalocal sshd[18313]: Unable to negotiate with 38.102.83.196 port 42222: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 14 06:55:21 np0005486759.novalocal sshd[18323]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:55:21 np0005486759.novalocal sshd[18323]: Accepted publickey for zuul from 38.102.83.114 port 49102 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 06:55:21 np0005486759.novalocal systemd-logind[759]: New session 6 of user zuul.
Oct 14 06:55:21 np0005486759.novalocal systemd[1]: Started Session 6 of User zuul.
Oct 14 06:55:21 np0005486759.novalocal sshd[18323]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:55:21 np0005486759.novalocal python3[18340]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLOJAHCC2rB40dN0U1Oe26dRdlR0vv8g7YmzXLmc0SKu4ZxQ2+RjeVKEf9uD/V8BCctANgwCUY2f29gb92VSZEc= zuul@np0005486753.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:55:22 np0005486759.novalocal sudo[18354]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znvobnqzzkwpdhcmvnslojiexeuszqzp ; /usr/bin/python3
Oct 14 06:55:22 np0005486759.novalocal sudo[18354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:55:22 np0005486759.novalocal python3[18356]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLOJAHCC2rB40dN0U1Oe26dRdlR0vv8g7YmzXLmc0SKu4ZxQ2+RjeVKEf9uD/V8BCctANgwCUY2f29gb92VSZEc= zuul@np0005486753.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:55:22 np0005486759.novalocal sudo[18354]: pam_unix(sudo:session): session closed for user root
Oct 14 06:55:24 np0005486759.novalocal sshd[18323]: pam_unix(sshd:session): session closed for user zuul
Oct 14 06:55:24 np0005486759.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Oct 14 06:55:24 np0005486759.novalocal systemd-logind[759]: Session 6 logged out. Waiting for processes to exit.
Oct 14 06:55:24 np0005486759.novalocal systemd-logind[759]: Removed session 6.
Oct 14 06:56:53 np0005486759.novalocal sshd[18359]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:56:53 np0005486759.novalocal sshd[18359]: Accepted publickey for zuul from 38.102.83.114 port 35504 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 06:56:53 np0005486759.novalocal systemd-logind[759]: New session 7 of user zuul.
Oct 14 06:56:53 np0005486759.novalocal systemd[1]: Started Session 7 of User zuul.
Oct 14 06:56:53 np0005486759.novalocal sshd[18359]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:56:53 np0005486759.novalocal sudo[18376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzlwlhuzbgbbdvavpvkbhuzohvobbhtu ; /usr/bin/python3
Oct 14 06:56:53 np0005486759.novalocal sudo[18376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:56:53 np0005486759.novalocal python3[18378]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpCkKHtdAJvqUoWfry6wT9BiEt8oegJcZFI/9galMv8ZYmo/NBcS3vjEuF9385qAETdPLU+rGztzEvbgOXTGalOiMOoN+F7ELwARQwPYS2b6JDoalDqgTJD2+XWrLKXsBBc4d7YOy0D+cJQ+YvlxXj73YP/7+B/cwxaWftnlTUXfyLIH79jw7oqPg1EpUSVIbSmItL2s/1CNxeNHq6AeV04V+vyKgfzdbglEGmnDHnNMnJYbkoYZs0GcsOCkKZV5fht0OYKRAfYo2a/CuQrfpt2iBcPznSWUllp59WlSF3mtiL9taksr5HpRpvMv9e5Rg1dYebt+6vi2OPhqCD/rqcYfmfhceMZ9qMpS6ffDt5NpHT7rvn0vBtHqb6PxQng5BvynCqAE8WGLej9EhoXfu7xiTuOWvdrrSynaQIM4JhvTCCBJmWHCoHV+70bsoqNNEd3ciEKNYqLWuCMksS9F9LTSoOpBhX4gYl+VaFGdH/WTKe0Ae2uUq0Cz/GmuiFVtE= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 06:56:53 np0005486759.novalocal sudo[18376]: pam_unix(sudo:session): session closed for user root
Oct 14 06:56:54 np0005486759.novalocal sudo[18392]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yaboiziazglskgfnpktgngsrbnmxhcwt ; /usr/bin/python3
Oct 14 06:56:54 np0005486759.novalocal sudo[18392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:56:54 np0005486759.novalocal python3[18394]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486759.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 14 06:56:54 np0005486759.novalocal sudo[18392]: pam_unix(sudo:session): session closed for user root
Oct 14 06:56:55 np0005486759.novalocal sudo[18442]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtlyramflfeqagavzwqtmgrizbjwrgrt ; /usr/bin/python3
Oct 14 06:56:55 np0005486759.novalocal sudo[18442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:56:55 np0005486759.novalocal python3[18444]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:56:55 np0005486759.novalocal sudo[18442]: pam_unix(sudo:session): session closed for user root
Oct 14 06:56:56 np0005486759.novalocal sudo[18485]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvjtkaiopirrddtwheuspewmgzygzqyz ; /usr/bin/python3
Oct 14 06:56:56 np0005486759.novalocal sudo[18485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:56:56 np0005486759.novalocal python3[18487]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760425015.3641808-69-175380278459657/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=a74a7f2bbf8048dabf25acf29852887c_id_rsa follow=False checksum=9d8b8aca6ea7b20d043575148e7d25c9f08a33d8 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:56:56 np0005486759.novalocal sudo[18485]: pam_unix(sudo:session): session closed for user root
Oct 14 06:56:57 np0005486759.novalocal sudo[18547]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjgwqhdylcvtivrnihhlpoalsqdvsmtx ; /usr/bin/python3
Oct 14 06:56:57 np0005486759.novalocal sudo[18547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:56:57 np0005486759.novalocal python3[18549]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:56:57 np0005486759.novalocal sudo[18547]: pam_unix(sudo:session): session closed for user root
Oct 14 06:56:57 np0005486759.novalocal sudo[18590]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqrdpslsutusobcpvwjhnymsssuntvyg ; /usr/bin/python3
Oct 14 06:56:57 np0005486759.novalocal sudo[18590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:56:57 np0005486759.novalocal python3[18592]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760425017.172527-139-22108606227188/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=a74a7f2bbf8048dabf25acf29852887c_id_rsa.pub follow=False checksum=e1ee6df319d3dc955b7c91f7210e7850de247fd5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:56:57 np0005486759.novalocal sudo[18590]: pam_unix(sudo:session): session closed for user root
Oct 14 06:56:59 np0005486759.novalocal sudo[18620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpcxanppagyyetftgmqabzrlcndpuavm ; /usr/bin/python3
Oct 14 06:56:59 np0005486759.novalocal sudo[18620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 06:57:00 np0005486759.novalocal python3[18622]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:57:00 np0005486759.novalocal sudo[18620]: pam_unix(sudo:session): session closed for user root
Oct 14 06:57:00 np0005486759.novalocal python3[18668]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:57:00 np0005486759.novalocal python3[18684]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmptx9wu1jz recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:57:01 np0005486759.novalocal python3[18744]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:57:02 np0005486759.novalocal python3[18760]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpdymq3702 recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:57:03 np0005486759.novalocal python3[18820]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 06:57:03 np0005486759.novalocal python3[18836]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmp_w5f6tm_ recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 06:57:04 np0005486759.novalocal sshd[18359]: pam_unix(sshd:session): session closed for user zuul
Oct 14 06:57:04 np0005486759.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Oct 14 06:57:04 np0005486759.novalocal systemd[1]: session-7.scope: Consumed 3.304s CPU time.
Oct 14 06:57:04 np0005486759.novalocal systemd-logind[759]: Session 7 logged out. Waiting for processes to exit.
Oct 14 06:57:04 np0005486759.novalocal systemd-logind[759]: Removed session 7.
Oct 14 06:59:08 np0005486759.novalocal sshd[18851]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 06:59:09 np0005486759.novalocal sshd[18851]: Accepted publickey for zuul from 38.102.83.196 port 32894 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 06:59:09 np0005486759.novalocal systemd-logind[759]: New session 8 of user zuul.
Oct 14 06:59:09 np0005486759.novalocal systemd[1]: Started Session 8 of User zuul.
Oct 14 06:59:09 np0005486759.novalocal sshd[18851]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 06:59:09 np0005486759.novalocal python3[18897]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:01:01 np0005486759.novalocal CROND[18900]: (root) CMD (run-parts /etc/cron.hourly)
Oct 14 07:01:01 np0005486759.novalocal run-parts[18903]: (/etc/cron.hourly) starting 0anacron
Oct 14 07:01:01 np0005486759.novalocal anacron[18911]: Anacron started on 2025-10-14
Oct 14 07:01:01 np0005486759.novalocal anacron[18911]: Will run job `cron.daily' in 10 min.
Oct 14 07:01:01 np0005486759.novalocal anacron[18911]: Will run job `cron.weekly' in 30 min.
Oct 14 07:01:01 np0005486759.novalocal anacron[18911]: Will run job `cron.monthly' in 50 min.
Oct 14 07:01:01 np0005486759.novalocal anacron[18911]: Jobs will be executed sequentially
Oct 14 07:01:01 np0005486759.novalocal run-parts[18913]: (/etc/cron.hourly) finished 0anacron
Oct 14 07:01:01 np0005486759.novalocal CROND[18899]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 14 07:04:09 np0005486759.novalocal sshd[18854]: Received disconnect from 38.102.83.196 port 32894:11: disconnected by user
Oct 14 07:04:09 np0005486759.novalocal sshd[18854]: Disconnected from user zuul 38.102.83.196 port 32894
Oct 14 07:04:09 np0005486759.novalocal sshd[18851]: pam_unix(sshd:session): session closed for user zuul
Oct 14 07:04:09 np0005486759.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Oct 14 07:04:09 np0005486759.novalocal systemd-logind[759]: Session 8 logged out. Waiting for processes to exit.
Oct 14 07:04:09 np0005486759.novalocal systemd-logind[759]: Removed session 8.
Oct 14 07:10:47 np0005486759.novalocal sshd[18919]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:10:47 np0005486759.novalocal sshd[18919]: Accepted publickey for zuul from 38.102.83.114 port 57998 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 07:10:47 np0005486759.novalocal systemd-logind[759]: New session 9 of user zuul.
Oct 14 07:10:47 np0005486759.novalocal systemd[1]: Started Session 9 of User zuul.
Oct 14 07:10:47 np0005486759.novalocal sshd[18919]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 07:10:47 np0005486759.novalocal python3[18936]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-8ed4-2004-00000000000a-1-cell1compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:10:48 np0005486759.novalocal sudo[18954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uikxljogtfvhbvyozmvbaxuwkthworgn ; /usr/bin/python3
Oct 14 07:10:48 np0005486759.novalocal sudo[18954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:10:48 np0005486759.novalocal python3[18956]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-8ed4-2004-00000000000b-1-cell1compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:10:50 np0005486759.novalocal sudo[18954]: pam_unix(sudo:session): session closed for user root
Oct 14 07:10:51 np0005486759.novalocal sudo[18973]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctwnzjdkgkxchybpkelawzokynyygepx ; /usr/bin/python3
Oct 14 07:10:51 np0005486759.novalocal sudo[18973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:10:51 np0005486759.novalocal python3[18975]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Oct 14 07:10:54 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:01 np0005486759.novalocal anacron[18911]: Job `cron.daily' started
Oct 14 07:11:01 np0005486759.novalocal anacron[18911]: Job `cron.daily' terminated
Oct 14 07:11:19 np0005486759.novalocal sudo[18973]: pam_unix(sudo:session): session closed for user root
Oct 14 07:11:20 np0005486759.novalocal sudo[19133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avhhdplcroahotghcozytyrwiienpqhf ; /usr/bin/python3
Oct 14 07:11:20 np0005486759.novalocal sudo[19133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:11:20 np0005486759.novalocal python3[19135]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Oct 14 07:11:23 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:26 np0005486759.novalocal sudo[19133]: pam_unix(sudo:session): session closed for user root
Oct 14 07:11:26 np0005486759.novalocal sudo[19273]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qulesvyniqgijlprtqwvszfabxszbfwx ; /usr/bin/python3
Oct 14 07:11:26 np0005486759.novalocal sudo[19273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:11:26 np0005486759.novalocal python3[19275]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Oct 14 07:11:29 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:29 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:34 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:34 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:42 np0005486759.novalocal sudo[19273]: pam_unix(sudo:session): session closed for user root
Oct 14 07:11:43 np0005486759.novalocal sudo[19549]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhhrqsvlfwctetmweeauannwspihbwip ; /usr/bin/python3
Oct 14 07:11:43 np0005486759.novalocal sudo[19549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:11:43 np0005486759.novalocal python3[19551]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Oct 14 07:11:46 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:46 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:51 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:51 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:11:58 np0005486759.novalocal sudo[19549]: pam_unix(sudo:session): session closed for user root
Oct 14 07:11:58 np0005486759.novalocal sudo[19885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kclerrutfgdkonnmbsneddlwjfritkwr ; /usr/bin/python3
Oct 14 07:11:58 np0005486759.novalocal sudo[19885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:11:59 np0005486759.novalocal python3[19887]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Oct 14 07:12:01 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:12:07 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:12:07 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:12:14 np0005486759.novalocal sudo[19885]: pam_unix(sudo:session): session closed for user root
Oct 14 07:12:15 np0005486759.novalocal sudo[20163]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrwpcwiohtvrkskuxbzliazvjmqrxpsv ; /usr/bin/python3
Oct 14 07:12:15 np0005486759.novalocal sudo[20163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:12:16 np0005486759.novalocal python3[20165]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled _uses_shell=True zuul_log_id=fa163ef9-e89a-8ed4-2004-000000000011-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:12:18 np0005486759.novalocal sudo[20163]: pam_unix(sudo:session): session closed for user root
Oct 14 07:12:18 np0005486759.novalocal sudo[20182]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqodpubmhqvxllvxxehhviylrdkahwzg ; /usr/bin/python3
Oct 14 07:12:18 np0005486759.novalocal sudo[20182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:12:18 np0005486759.novalocal python3[20184]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  Converting 489 SID table entries...
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 07:12:38 np0005486759.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 07:12:38 np0005486759.novalocal groupadd[20283]: group added to /etc/group: name=unbound, GID=987
Oct 14 07:12:38 np0005486759.novalocal groupadd[20283]: group added to /etc/gshadow: name=unbound
Oct 14 07:12:38 np0005486759.novalocal groupadd[20283]: new group: name=unbound, GID=987
Oct 14 07:12:38 np0005486759.novalocal useradd[20290]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Oct 14 07:12:38 np0005486759.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Oct 14 07:12:38 np0005486759.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 14 07:12:38 np0005486759.novalocal groupadd[20303]: group added to /etc/group: name=openvswitch, GID=986
Oct 14 07:12:38 np0005486759.novalocal groupadd[20303]: group added to /etc/gshadow: name=openvswitch
Oct 14 07:12:38 np0005486759.novalocal groupadd[20303]: new group: name=openvswitch, GID=986
Oct 14 07:12:38 np0005486759.novalocal useradd[20310]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Oct 14 07:12:38 np0005486759.novalocal groupadd[20318]: group added to /etc/group: name=hugetlbfs, GID=985
Oct 14 07:12:38 np0005486759.novalocal groupadd[20318]: group added to /etc/gshadow: name=hugetlbfs
Oct 14 07:12:39 np0005486759.novalocal groupadd[20318]: new group: name=hugetlbfs, GID=985
Oct 14 07:12:39 np0005486759.novalocal usermod[20326]: add 'openvswitch' to group 'hugetlbfs'
Oct 14 07:12:39 np0005486759.novalocal usermod[20326]: add 'openvswitch' to shadow group 'hugetlbfs'
Oct 14 07:12:42 np0005486759.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 07:12:42 np0005486759.novalocal systemd[1]: Starting man-db-cache-update.service...
Oct 14 07:12:42 np0005486759.novalocal systemd[1]: Reloading.
Oct 14 07:12:42 np0005486759.novalocal systemd-sysv-generator[20857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 07:12:42 np0005486759.novalocal systemd-rc-local-generator[20851]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 07:12:42 np0005486759.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 07:12:42 np0005486759.novalocal systemd[1]: Starting dnf makecache...
Oct 14 07:12:42 np0005486759.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 07:12:42 np0005486759.novalocal dnf[21100]: Updating Subscription Management repositories.
Oct 14 07:12:43 np0005486759.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 07:12:43 np0005486759.novalocal systemd[1]: Finished man-db-cache-update.service.
Oct 14 07:12:43 np0005486759.novalocal systemd[1]: run-r1c3af740dc294784937ac717b4102134.service: Deactivated successfully.
Oct 14 07:12:44 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:12:44 np0005486759.novalocal sudo[20182]: pam_unix(sudo:session): session closed for user root
Oct 14 07:12:44 np0005486759.novalocal dnf[21100]: Failed determining last makecache time.
Oct 14 07:12:44 np0005486759.novalocal rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 07:12:44 np0005486759.novalocal dnf[21100]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   45 kB/s | 4.1 kB     00:00
Oct 14 07:12:44 np0005486759.novalocal dnf[21100]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_  54 kB/s | 4.0 kB     00:00
Oct 14 07:12:44 np0005486759.novalocal dnf[21100]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  53 kB/s | 4.5 kB     00:00
Oct 14 07:12:45 np0005486759.novalocal dnf[21100]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  54 kB/s | 4.5 kB     00:00
Oct 14 07:12:45 np0005486759.novalocal sudo[21493]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkexoxcholqlapofmauwamibwsjrlbas ; /usr/bin/python3
Oct 14 07:12:45 np0005486759.novalocal sudo[21493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:12:45 np0005486759.novalocal dnf[21100]: Red Hat Enterprise Linux 9 for x86_64 - High Av  55 kB/s | 4.0 kB     00:00
Oct 14 07:12:45 np0005486759.novalocal python3[21495]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix _uses_shell=True zuul_log_id=fa163ef9-e89a-8ed4-2004-000000000013-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:12:45 np0005486759.novalocal dnf[21100]: Fast Datapath for RHEL 9 x86_64 (RPMs)           23 kB/s | 4.0 kB     00:00
Oct 14 07:12:45 np0005486759.novalocal dnf[21100]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   36 kB/s | 4.1 kB     00:00
Oct 14 07:12:45 np0005486759.novalocal dnf[21100]: Metadata cache created.
Oct 14 07:12:45 np0005486759.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 14 07:12:45 np0005486759.novalocal systemd[1]: Finished dnf makecache.
Oct 14 07:12:45 np0005486759.novalocal systemd[1]: dnf-makecache.service: Consumed 2.710s CPU time.
Oct 14 07:12:57 np0005486759.novalocal sudo[21493]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:01 np0005486759.novalocal sudo[21516]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnooudsehliiyjalehztgpzsbtteufel ; /usr/bin/python3
Oct 14 07:13:01 np0005486759.novalocal sudo[21516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:01 np0005486759.novalocal python3[21518]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 07:13:01 np0005486759.novalocal sudo[21516]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:02 np0005486759.novalocal sudo[21564]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgfwqmovtuxagulchtvuekpiejrqlpfn ; /usr/bin/python3
Oct 14 07:13:02 np0005486759.novalocal sudo[21564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:02 np0005486759.novalocal python3[21566]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 07:13:02 np0005486759.novalocal sudo[21564]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:02 np0005486759.novalocal sudo[21607]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sugwwezvnvwwthczombgzzbtfxmlmobp ; /usr/bin/python3
Oct 14 07:13:02 np0005486759.novalocal sudo[21607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:02 np0005486759.novalocal python3[21609]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760425981.7954009-206-85742657667808/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=9333f42ac4b9baf349a5c32f7bcba3335b5912e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 07:13:02 np0005486759.novalocal sudo[21607]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:02 np0005486759.novalocal sudo[21637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggyfzbpauwfjtmltolqjustwjabfgoxc ; /usr/bin/python3
Oct 14 07:13:02 np0005486759.novalocal sudo[21637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:03 np0005486759.novalocal python3[21639]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 07:13:03 np0005486759.novalocal sudo[21637]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:03 np0005486759.novalocal systemd-journald[618]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 91.6 (305 of 333 items), suggesting rotation.
Oct 14 07:13:03 np0005486759.novalocal systemd-journald[618]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 07:13:03 np0005486759.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 07:13:03 np0005486759.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 07:13:03 np0005486759.novalocal sudo[21658]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efyfgomdyakcvwofrrccxymfyimkpysq ; /usr/bin/python3
Oct 14 07:13:03 np0005486759.novalocal sudo[21658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:03 np0005486759.novalocal python3[21660]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 07:13:03 np0005486759.novalocal sudo[21658]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:03 np0005486759.novalocal sudo[21678]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfgytsruzimqybvkxnejstddkztdcibv ; /usr/bin/python3
Oct 14 07:13:03 np0005486759.novalocal sudo[21678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:03 np0005486759.novalocal python3[21680]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 07:13:03 np0005486759.novalocal sudo[21678]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:03 np0005486759.novalocal sudo[21698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkmrgnptfgcjwzoeavyaeucnuvvdtvxh ; /usr/bin/python3
Oct 14 07:13:03 np0005486759.novalocal sudo[21698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:03 np0005486759.novalocal python3[21700]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 07:13:04 np0005486759.novalocal sudo[21698]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:04 np0005486759.novalocal sudo[21718]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cocqivtpwgexiyjhzuadenzghkudxwik ; /usr/bin/python3
Oct 14 07:13:04 np0005486759.novalocal sudo[21718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:04 np0005486759.novalocal python3[21720]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Oct 14 07:13:04 np0005486759.novalocal sudo[21718]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:05 np0005486759.novalocal sudo[21738]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebppxmnqyzejzvhmjypvaeygtvtlpqtl ; /usr/bin/python3
Oct 14 07:13:05 np0005486759.novalocal sudo[21738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:06 np0005486759.novalocal python3[21740]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 07:13:06 np0005486759.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Oct 14 07:13:06 np0005486759.novalocal network[21743]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:06 np0005486759.novalocal network[21754]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:06 np0005486759.novalocal network[21743]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:06 np0005486759.novalocal network[21755]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:06 np0005486759.novalocal network[21743]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 07:13:06 np0005486759.novalocal network[21756]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 07:13:06 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425986.2516] audit: op="connections-reload" pid=21784 uid=0 result="success"
Oct 14 07:13:06 np0005486759.novalocal network[21743]: Bringing up loopback interface:  [  OK  ]
Oct 14 07:13:06 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425986.4257] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=21872 uid=0 result="success"
Oct 14 07:13:06 np0005486759.novalocal network[21743]: Bringing up interface eth0:  [  OK  ]
Oct 14 07:13:06 np0005486759.novalocal systemd[1]: Started LSB: Bring up/down networking.
Oct 14 07:13:06 np0005486759.novalocal sudo[21738]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:06 np0005486759.novalocal sudo[21911]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aelgmgsyucxjvlldwmjimpfzgpevucwa ; /usr/bin/python3
Oct 14 07:13:06 np0005486759.novalocal sudo[21911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:06 np0005486759.novalocal python3[21913]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 07:13:06 np0005486759.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Oct 14 07:13:06 np0005486759.novalocal chown[21917]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 14 07:13:06 np0005486759.novalocal ovs-ctl[21922]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct 14 07:13:06 np0005486759.novalocal ovs-ctl[21922]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct 14 07:13:07 np0005486759.novalocal ovs-ctl[21922]: Starting ovsdb-server [  OK  ]
Oct 14 07:13:07 np0005486759.novalocal ovs-vsctl[21971]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 14 07:13:07 np0005486759.novalocal ovs-vsctl[21991]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-110.el9fdp "external-ids:system-id=\"93d451ec-9a31-4880-9638-030ff3f86e88\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Oct 14 07:13:07 np0005486759.novalocal ovs-ctl[21922]: Configuring Open vSwitch system IDs [  OK  ]
Oct 14 07:13:07 np0005486759.novalocal ovs-ctl[21922]: Enabling remote OVSDB managers [  OK  ]
Oct 14 07:13:07 np0005486759.novalocal systemd[1]: Started Open vSwitch Database Unit.
Oct 14 07:13:07 np0005486759.novalocal ovs-vsctl[21997]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005486759.novalocal
Oct 14 07:13:07 np0005486759.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 14 07:13:07 np0005486759.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 14 07:13:07 np0005486759.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 14 07:13:07 np0005486759.novalocal kernel: openvswitch: Open vSwitch switching datapath
Oct 14 07:13:07 np0005486759.novalocal ovs-ctl[22041]: Inserting openvswitch module [  OK  ]
Oct 14 07:13:07 np0005486759.novalocal ovs-ctl[22010]: Starting ovs-vswitchd [  OK  ]
Oct 14 07:13:07 np0005486759.novalocal ovs-vsctl[22059]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005486759.novalocal
Oct 14 07:13:07 np0005486759.novalocal ovs-ctl[22010]: Enabling remote OVSDB managers [  OK  ]
Oct 14 07:13:07 np0005486759.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 14 07:13:07 np0005486759.novalocal systemd[1]: Starting Open vSwitch...
Oct 14 07:13:07 np0005486759.novalocal systemd[1]: Finished Open vSwitch.
Oct 14 07:13:07 np0005486759.novalocal sudo[21911]: pam_unix(sudo:session): session closed for user root
Oct 14 07:13:08 np0005486759.novalocal sudo[22075]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlekcdbbtkvvjbtlgdqbqjaxryymufxl ; /usr/bin/python3
Oct 14 07:13:08 np0005486759.novalocal sudo[22075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:13:09 np0005486759.novalocal python3[22077]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml _uses_shell=True zuul_log_id=fa163ef9-e89a-8ed4-2004-000000000018-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:13:09 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425989.9324] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22235 uid=0 result="success"
Oct 14 07:13:09 np0005486759.novalocal ifup[22236]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:09 np0005486759.novalocal ifup[22237]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:09 np0005486759.novalocal ifup[22238]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:09 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425989.9615] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22244 uid=0 result="success"
Oct 14 07:13:09 np0005486759.novalocal ovs-vsctl[22246]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:21:a1:c6 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Oct 14 07:13:09 np0005486759.novalocal kernel: device ovs-system entered promiscuous mode
Oct 14 07:13:09 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425989.9823] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Oct 14 07:13:09 np0005486759.novalocal systemd-udevd[21984]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 07:13:09 np0005486759.novalocal kernel: Timeout policy base is empty
Oct 14 07:13:09 np0005486759.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Oct 14 07:13:10 np0005486759.novalocal kernel: device br-ex entered promiscuous mode
Oct 14 07:13:10 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425990.0333] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Oct 14 07:13:10 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425990.0595] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22271 uid=0 result="success"
Oct 14 07:13:10 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425990.0804] device (br-ex): carrier: link connected
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.1452] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22300 uid=0 result="success"
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.1900] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22315 uid=0 result="success"
Oct 14 07:13:13 np0005486759.novalocal NET[22340]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.2822] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.2900] dhcp4 (eth1): canceled DHCP transaction
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.2900] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.2900] dhcp4 (eth1): state changed no lease
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.2965] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22349 uid=0 result="success"
Oct 14 07:13:13 np0005486759.novalocal ifup[22350]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:13 np0005486759.novalocal ifup[22351]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:13 np0005486759.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 07:13:13 np0005486759.novalocal ifup[22353]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:13 np0005486759.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.3314] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22365 uid=0 result="success"
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.3764] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22377 uid=0 result="success"
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.3835] device (eth1): carrier: link connected
Oct 14 07:13:13 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425993.4096] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22386 uid=0 result="success"
Oct 14 07:13:13 np0005486759.novalocal ipv6_wait_tentative[22398]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Oct 14 07:13:14 np0005486759.novalocal ipv6_wait_tentative[22403]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.4757] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22412 uid=0 result="success"
Oct 14 07:13:15 np0005486759.novalocal ovs-vsctl[22427]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Oct 14 07:13:15 np0005486759.novalocal kernel: device eth1 entered promiscuous mode
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.5487] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22435 uid=0 result="success"
Oct 14 07:13:15 np0005486759.novalocal ifup[22436]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:15 np0005486759.novalocal ifup[22437]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:15 np0005486759.novalocal ifup[22438]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.5796] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22444 uid=0 result="success"
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.6222] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22454 uid=0 result="success"
Oct 14 07:13:15 np0005486759.novalocal ifup[22455]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:15 np0005486759.novalocal ifup[22456]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:15 np0005486759.novalocal ifup[22457]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.6538] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22463 uid=0 result="success"
Oct 14 07:13:15 np0005486759.novalocal ovs-vsctl[22466]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Oct 14 07:13:15 np0005486759.novalocal kernel: device vlan23 entered promiscuous mode
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.6896] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Oct 14 07:13:15 np0005486759.novalocal systemd-udevd[22468]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.7134] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22477 uid=0 result="success"
Oct 14 07:13:15 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425995.7334] device (vlan23): carrier: link connected
Oct 14 07:13:18 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425998.7847] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22507 uid=0 result="success"
Oct 14 07:13:18 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425998.8303] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22522 uid=0 result="success"
Oct 14 07:13:18 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425998.8935] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22543 uid=0 result="success"
Oct 14 07:13:18 np0005486759.novalocal ifup[22544]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:18 np0005486759.novalocal ifup[22545]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:18 np0005486759.novalocal ifup[22546]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:18 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425998.9288] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22552 uid=0 result="success"
Oct 14 07:13:18 np0005486759.novalocal ovs-vsctl[22555]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Oct 14 07:13:19 np0005486759.novalocal kernel: device vlan20 entered promiscuous mode
Oct 14 07:13:19 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425999.0021] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Oct 14 07:13:19 np0005486759.novalocal systemd-udevd[22557]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 07:13:19 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425999.0316] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22567 uid=0 result="success"
Oct 14 07:13:19 np0005486759.novalocal NetworkManager[5960]: <info>  [1760425999.0564] device (vlan20): carrier: link connected
Oct 14 07:13:22 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426002.1047] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22598 uid=0 result="success"
Oct 14 07:13:22 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426002.1512] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22613 uid=0 result="success"
Oct 14 07:13:22 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426002.2063] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22634 uid=0 result="success"
Oct 14 07:13:22 np0005486759.novalocal ifup[22635]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:22 np0005486759.novalocal ifup[22636]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:22 np0005486759.novalocal ifup[22637]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:22 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426002.2385] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22643 uid=0 result="success"
Oct 14 07:13:22 np0005486759.novalocal ovs-vsctl[22646]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Oct 14 07:13:22 np0005486759.novalocal kernel: device vlan22 entered promiscuous mode
Oct 14 07:13:22 np0005486759.novalocal systemd-udevd[22648]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 07:13:22 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426002.2735] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Oct 14 07:13:22 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426002.3017] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22658 uid=0 result="success"
Oct 14 07:13:22 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426002.3216] device (vlan22): carrier: link connected
Oct 14 07:13:23 np0005486759.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 07:13:25 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426005.3710] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22688 uid=0 result="success"
Oct 14 07:13:25 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426005.4173] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22703 uid=0 result="success"
Oct 14 07:13:25 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426005.4738] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22724 uid=0 result="success"
Oct 14 07:13:25 np0005486759.novalocal ifup[22725]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:25 np0005486759.novalocal ifup[22726]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:25 np0005486759.novalocal ifup[22727]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:25 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426005.5029] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22733 uid=0 result="success"
Oct 14 07:13:25 np0005486759.novalocal ovs-vsctl[22736]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Oct 14 07:13:25 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426005.5436] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Oct 14 07:13:25 np0005486759.novalocal kernel: device vlan44 entered promiscuous mode
Oct 14 07:13:25 np0005486759.novalocal systemd-udevd[22738]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 07:13:25 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426005.5683] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22748 uid=0 result="success"
Oct 14 07:13:25 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426005.5901] device (vlan44): carrier: link connected
Oct 14 07:13:28 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426008.6414] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22778 uid=0 result="success"
Oct 14 07:13:28 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426008.6924] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22793 uid=0 result="success"
Oct 14 07:13:28 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426008.7563] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22814 uid=0 result="success"
Oct 14 07:13:28 np0005486759.novalocal ifup[22815]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:28 np0005486759.novalocal ifup[22816]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:28 np0005486759.novalocal ifup[22817]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:28 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426008.7934] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22823 uid=0 result="success"
Oct 14 07:13:28 np0005486759.novalocal ovs-vsctl[22826]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Oct 14 07:13:28 np0005486759.novalocal kernel: device vlan21 entered promiscuous mode
Oct 14 07:13:28 np0005486759.novalocal systemd-udevd[22828]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 07:13:28 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426008.8386] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Oct 14 07:13:28 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426008.8703] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22838 uid=0 result="success"
Oct 14 07:13:28 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426008.8932] device (vlan21): carrier: link connected
Oct 14 07:13:31 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426011.9534] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22868 uid=0 result="success"
Oct 14 07:13:32 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426012.0020] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22883 uid=0 result="success"
Oct 14 07:13:32 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426012.0634] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22904 uid=0 result="success"
Oct 14 07:13:32 np0005486759.novalocal ifup[22905]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:32 np0005486759.novalocal ifup[22906]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:32 np0005486759.novalocal ifup[22907]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:32 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426012.1023] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22913 uid=0 result="success"
Oct 14 07:13:32 np0005486759.novalocal ovs-vsctl[22916]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Oct 14 07:13:32 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426012.1640] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22923 uid=0 result="success"
Oct 14 07:13:33 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426013.2270] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22950 uid=0 result="success"
Oct 14 07:13:33 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426013.2771] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22965 uid=0 result="success"
Oct 14 07:13:33 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426013.3397] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22986 uid=0 result="success"
Oct 14 07:13:33 np0005486759.novalocal ifup[22987]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:33 np0005486759.novalocal ifup[22988]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:33 np0005486759.novalocal ifup[22989]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:33 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426013.3722] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22995 uid=0 result="success"
Oct 14 07:13:33 np0005486759.novalocal ovs-vsctl[22998]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Oct 14 07:13:33 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426013.4372] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23005 uid=0 result="success"
Oct 14 07:13:34 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426014.4946] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23033 uid=0 result="success"
Oct 14 07:13:34 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426014.5330] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23048 uid=0 result="success"
Oct 14 07:13:34 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426014.5900] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23069 uid=0 result="success"
Oct 14 07:13:34 np0005486759.novalocal ifup[23070]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:34 np0005486759.novalocal ifup[23071]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:34 np0005486759.novalocal ifup[23072]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:34 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426014.6208] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23078 uid=0 result="success"
Oct 14 07:13:34 np0005486759.novalocal ovs-vsctl[23081]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Oct 14 07:13:34 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426014.6739] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23088 uid=0 result="success"
Oct 14 07:13:35 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426015.7363] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23116 uid=0 result="success"
Oct 14 07:13:35 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426015.7825] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23131 uid=0 result="success"
Oct 14 07:13:35 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426015.8397] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23152 uid=0 result="success"
Oct 14 07:13:35 np0005486759.novalocal ifup[23153]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:35 np0005486759.novalocal ifup[23154]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:35 np0005486759.novalocal ifup[23155]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:35 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426015.8738] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23161 uid=0 result="success"
Oct 14 07:13:35 np0005486759.novalocal ovs-vsctl[23164]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Oct 14 07:13:35 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426015.9303] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23171 uid=0 result="success"
Oct 14 07:13:36 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426016.9871] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23199 uid=0 result="success"
Oct 14 07:13:37 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426017.0323] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23214 uid=0 result="success"
Oct 14 07:13:37 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426017.0919] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23235 uid=0 result="success"
Oct 14 07:13:37 np0005486759.novalocal ifup[23236]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Oct 14 07:13:37 np0005486759.novalocal ifup[23237]: 'network-scripts' will be removed from distribution in near future.
Oct 14 07:13:37 np0005486759.novalocal ifup[23238]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Oct 14 07:13:37 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426017.1219] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23244 uid=0 result="success"
Oct 14 07:13:37 np0005486759.novalocal ovs-vsctl[23247]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Oct 14 07:13:37 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426017.1766] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23254 uid=0 result="success"
Oct 14 07:13:38 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426018.2321] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23282 uid=0 result="success"
Oct 14 07:13:38 np0005486759.novalocal NetworkManager[5960]: <info>  [1760426018.2799] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23297 uid=0 result="success"
Oct 14 07:13:38 np0005486759.novalocal sudo[22075]: pam_unix(sudo:session): session closed for user root
Oct 14 07:14:01 np0005486759.novalocal python3[23329]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-8ed4-2004-000000000019-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:14:04 np0005486759.novalocal python3[23348]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpCkKHtdAJvqUoWfry6wT9BiEt8oegJcZFI/9galMv8ZYmo/NBcS3vjEuF9385qAETdPLU+rGztzEvbgOXTGalOiMOoN+F7ELwARQwPYS2b6JDoalDqgTJD2+XWrLKXsBBc4d7YOy0D+cJQ+YvlxXj73YP/7+B/cwxaWftnlTUXfyLIH79jw7oqPg1EpUSVIbSmItL2s/1CNxeNHq6AeV04V+vyKgfzdbglEGmnDHnNMnJYbkoYZs0GcsOCkKZV5fht0OYKRAfYo2a/CuQrfpt2iBcPznSWUllp59WlSF3mtiL9taksr5HpRpvMv9e5Rg1dYebt+6vi2OPhqCD/rqcYfmfhceMZ9qMpS6ffDt5NpHT7rvn0vBtHqb6PxQng5BvynCqAE8WGLej9EhoXfu7xiTuOWvdrrSynaQIM4JhvTCCBJmWHCoHV+70bsoqNNEd3ciEKNYqLWuCMksS9F9LTSoOpBhX4gYl+VaFGdH/WTKe0Ae2uUq0Cz/GmuiFVtE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 07:14:04 np0005486759.novalocal sudo[23362]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvchvqmdwzmenwyoaigcxwfqjszbagbz ; /usr/bin/python3
Oct 14 07:14:04 np0005486759.novalocal sudo[23362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:14:05 np0005486759.novalocal python3[23364]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpCkKHtdAJvqUoWfry6wT9BiEt8oegJcZFI/9galMv8ZYmo/NBcS3vjEuF9385qAETdPLU+rGztzEvbgOXTGalOiMOoN+F7ELwARQwPYS2b6JDoalDqgTJD2+XWrLKXsBBc4d7YOy0D+cJQ+YvlxXj73YP/7+B/cwxaWftnlTUXfyLIH79jw7oqPg1EpUSVIbSmItL2s/1CNxeNHq6AeV04V+vyKgfzdbglEGmnDHnNMnJYbkoYZs0GcsOCkKZV5fht0OYKRAfYo2a/CuQrfpt2iBcPznSWUllp59WlSF3mtiL9taksr5HpRpvMv9e5Rg1dYebt+6vi2OPhqCD/rqcYfmfhceMZ9qMpS6ffDt5NpHT7rvn0vBtHqb6PxQng5BvynCqAE8WGLej9EhoXfu7xiTuOWvdrrSynaQIM4JhvTCCBJmWHCoHV+70bsoqNNEd3ciEKNYqLWuCMksS9F9LTSoOpBhX4gYl+VaFGdH/WTKe0Ae2uUq0Cz/GmuiFVtE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 07:14:05 np0005486759.novalocal sudo[23362]: pam_unix(sudo:session): session closed for user root
Oct 14 07:14:05 np0005486759.novalocal python3[23378]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpCkKHtdAJvqUoWfry6wT9BiEt8oegJcZFI/9galMv8ZYmo/NBcS3vjEuF9385qAETdPLU+rGztzEvbgOXTGalOiMOoN+F7ELwARQwPYS2b6JDoalDqgTJD2+XWrLKXsBBc4d7YOy0D+cJQ+YvlxXj73YP/7+B/cwxaWftnlTUXfyLIH79jw7oqPg1EpUSVIbSmItL2s/1CNxeNHq6AeV04V+vyKgfzdbglEGmnDHnNMnJYbkoYZs0GcsOCkKZV5fht0OYKRAfYo2a/CuQrfpt2iBcPznSWUllp59WlSF3mtiL9taksr5HpRpvMv9e5Rg1dYebt+6vi2OPhqCD/rqcYfmfhceMZ9qMpS6ffDt5NpHT7rvn0vBtHqb6PxQng5BvynCqAE8WGLej9EhoXfu7xiTuOWvdrrSynaQIM4JhvTCCBJmWHCoHV+70bsoqNNEd3ciEKNYqLWuCMksS9F9LTSoOpBhX4gYl+VaFGdH/WTKe0Ae2uUq0Cz/GmuiFVtE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 07:14:06 np0005486759.novalocal sudo[23392]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-immhvqkihfwqhinmcftjmodtbgrgxrmj ; /usr/bin/python3
Oct 14 07:14:06 np0005486759.novalocal sudo[23392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:14:06 np0005486759.novalocal python3[23394]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpCkKHtdAJvqUoWfry6wT9BiEt8oegJcZFI/9galMv8ZYmo/NBcS3vjEuF9385qAETdPLU+rGztzEvbgOXTGalOiMOoN+F7ELwARQwPYS2b6JDoalDqgTJD2+XWrLKXsBBc4d7YOy0D+cJQ+YvlxXj73YP/7+B/cwxaWftnlTUXfyLIH79jw7oqPg1EpUSVIbSmItL2s/1CNxeNHq6AeV04V+vyKgfzdbglEGmnDHnNMnJYbkoYZs0GcsOCkKZV5fht0OYKRAfYo2a/CuQrfpt2iBcPznSWUllp59WlSF3mtiL9taksr5HpRpvMv9e5Rg1dYebt+6vi2OPhqCD/rqcYfmfhceMZ9qMpS6ffDt5NpHT7rvn0vBtHqb6PxQng5BvynCqAE8WGLej9EhoXfu7xiTuOWvdrrSynaQIM4JhvTCCBJmWHCoHV+70bsoqNNEd3ciEKNYqLWuCMksS9F9LTSoOpBhX4gYl+VaFGdH/WTKe0Ae2uUq0Cz/GmuiFVtE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 14 07:14:06 np0005486759.novalocal sudo[23392]: pam_unix(sudo:session): session closed for user root
Oct 14 07:14:06 np0005486759.novalocal python3[23408]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Oct 14 07:14:07 np0005486759.novalocal python3[23423]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005486759.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-8ed4-2004-000000000020-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:14:07 np0005486759.novalocal sudo[23441]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sszegglhtaezcfcyyckvfneottsmbweu ; /usr/bin/python3
Oct 14 07:14:07 np0005486759.novalocal sudo[23441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 07:14:07 np0005486759.novalocal python3[23443]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.ooo.test"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-8ed4-2004-000000000021-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 07:14:07 np0005486759.novalocal systemd[1]: Starting Hostname Service...
Oct 14 07:14:07 np0005486759.novalocal systemd[1]: Started Hostname Service.
Oct 14 07:14:07 np0005486759.ooo.test systemd-hostnamed[23447]: Hostname set to <np0005486759.ooo.test> (static)
Oct 14 07:14:07 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760426047.8264] hostname: static hostname changed from "np0005486759.novalocal" to "np0005486759.ooo.test"
Oct 14 07:14:07 np0005486759.ooo.test systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 14 07:14:07 np0005486759.ooo.test systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 14 07:14:07 np0005486759.ooo.test sudo[23441]: pam_unix(sudo:session): session closed for user root
Oct 14 07:14:08 np0005486759.ooo.test sshd[18919]: pam_unix(sshd:session): session closed for user zuul
Oct 14 07:14:08 np0005486759.ooo.test systemd[1]: session-9.scope: Deactivated successfully.
Oct 14 07:14:08 np0005486759.ooo.test systemd[1]: session-9.scope: Consumed 1min 43.630s CPU time.
Oct 14 07:14:08 np0005486759.ooo.test systemd-logind[759]: Session 9 logged out. Waiting for processes to exit.
Oct 14 07:14:08 np0005486759.ooo.test systemd-logind[759]: Removed session 9.
Oct 14 07:14:10 np0005486759.ooo.test sshd[23458]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:14:10 np0005486759.ooo.test sshd[23458]: Accepted publickey for zuul from 38.102.83.114 port 60204 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 07:14:10 np0005486759.ooo.test systemd-logind[759]: New session 10 of user zuul.
Oct 14 07:14:10 np0005486759.ooo.test systemd[1]: Started Session 10 of User zuul.
Oct 14 07:14:10 np0005486759.ooo.test sshd[23458]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 07:14:10 np0005486759.ooo.test python3[23475]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Oct 14 07:14:12 np0005486759.ooo.test sshd[23458]: pam_unix(sshd:session): session closed for user zuul
Oct 14 07:14:12 np0005486759.ooo.test systemd[1]: session-10.scope: Deactivated successfully.
Oct 14 07:14:12 np0005486759.ooo.test systemd-logind[759]: Session 10 logged out. Waiting for processes to exit.
Oct 14 07:14:12 np0005486759.ooo.test systemd-logind[759]: Removed session 10.
Oct 14 07:14:17 np0005486759.ooo.test systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 14 07:14:37 np0005486759.ooo.test systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 14 07:23:21 np0005486759.ooo.test sshd[23481]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:23:21 np0005486759.ooo.test sshd[23481]: error: kex_exchange_identification: banner line contains invalid characters
Oct 14 07:23:21 np0005486759.ooo.test sshd[23481]: banner exchange: Connection from 93.123.109.214 port 54740: invalid format
Oct 14 07:23:21 np0005486759.ooo.test sshd[23482]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:23:21 np0005486759.ooo.test sshd[23482]: error: kex_exchange_identification: client sent invalid protocol identifier "GET / HTTP/1.1"
Oct 14 07:23:21 np0005486759.ooo.test sshd[23482]: banner exchange: Connection from 93.123.109.214 port 54750: invalid format
Oct 14 07:31:01 np0005486759.ooo.test anacron[18911]: Job `cron.weekly' started
Oct 14 07:31:01 np0005486759.ooo.test anacron[18911]: Job `cron.weekly' terminated
Oct 14 07:31:41 np0005486759.ooo.test sshd[23488]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:31:41 np0005486759.ooo.test sshd[23488]: error: kex_exchange_identification: client sent invalid protocol identifier ""
Oct 14 07:31:41 np0005486759.ooo.test sshd[23488]: banner exchange: Connection from 3.134.148.59 port 37744: invalid format
Oct 14 07:31:42 np0005486759.ooo.test sshd[23489]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:31:42 np0005486759.ooo.test sshd[23489]: error: kex_exchange_identification: Connection closed by remote host
Oct 14 07:31:42 np0005486759.ooo.test sshd[23489]: Connection closed by 3.134.148.59 port 37784
Oct 14 07:33:56 np0005486759.ooo.test sshd[23490]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:33:57 np0005486759.ooo.test sshd[23490]: error: kex_exchange_identification: Connection closed by remote host
Oct 14 07:33:57 np0005486759.ooo.test sshd[23490]: Connection closed by 3.134.148.59 port 50690
Oct 14 07:35:54 np0005486759.ooo.test sshd[23492]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:35:54 np0005486759.ooo.test sshd[23492]: error: kex_exchange_identification: banner line contains invalid characters
Oct 14 07:35:54 np0005486759.ooo.test sshd[23492]: banner exchange: Connection from 3.134.148.59 port 60022: invalid format
Oct 14 07:37:11 np0005486759.ooo.test sshd[23493]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:37:21 np0005486759.ooo.test sshd[23493]: Connection closed by 3.134.148.59 port 46570 [preauth]
Oct 14 07:38:41 np0005486759.ooo.test sshd[23496]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:38:41 np0005486759.ooo.test sshd[23496]: error: kex_exchange_identification: banner line contains invalid characters
Oct 14 07:38:41 np0005486759.ooo.test sshd[23496]: error: send_error: write: Broken pipe
Oct 14 07:38:41 np0005486759.ooo.test sshd[23496]: banner exchange: Connection from 3.134.148.59 port 50876: invalid format
Oct 14 07:42:18 np0005486759.ooo.test sshd[23498]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:42:18 np0005486759.ooo.test sshd[23499]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:42:18 np0005486759.ooo.test sshd[23499]: error: kex_exchange_identification: Connection closed by remote host
Oct 14 07:42:18 np0005486759.ooo.test sshd[23499]: Connection closed by 165.227.171.84 port 38670
Oct 14 07:42:18 np0005486759.ooo.test sshd[23498]: error: kex_exchange_identification: Connection closed by remote host
Oct 14 07:42:18 np0005486759.ooo.test sshd[23498]: Connection closed by 165.227.171.84 port 38668
Oct 14 07:42:52 np0005486759.ooo.test sshd[23500]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:42:52 np0005486759.ooo.test sshd[23500]: error: kex_exchange_identification: banner line contains invalid characters
Oct 14 07:42:52 np0005486759.ooo.test sshd[23500]: banner exchange: Connection from 65.49.1.232 port 60978: invalid format
Oct 14 07:43:09 np0005486759.ooo.test sshd[23501]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:10 np0005486759.ooo.test sshd[23501]: Connection closed by authenticating user root 165.227.171.84 port 54572 [preauth]
Oct 14 07:43:10 np0005486759.ooo.test sshd[23503]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:11 np0005486759.ooo.test sshd[23503]: Invalid user admin from 165.227.171.84 port 36720
Oct 14 07:43:11 np0005486759.ooo.test sshd[23503]: Connection closed by invalid user admin 165.227.171.84 port 36720 [preauth]
Oct 14 07:43:14 np0005486759.ooo.test sshd[23505]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:14 np0005486759.ooo.test sshd[23505]: Invalid user test from 165.227.171.84 port 36722
Oct 14 07:43:15 np0005486759.ooo.test sshd[23505]: Connection closed by invalid user test 165.227.171.84 port 36722 [preauth]
Oct 14 07:43:16 np0005486759.ooo.test sshd[23507]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:16 np0005486759.ooo.test sshd[23507]: Invalid user postgres from 165.227.171.84 port 36724
Oct 14 07:43:17 np0005486759.ooo.test sshd[23507]: Connection closed by invalid user postgres 165.227.171.84 port 36724 [preauth]
Oct 14 07:43:19 np0005486759.ooo.test sshd[23509]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:19 np0005486759.ooo.test sshd[23509]: Invalid user guest from 165.227.171.84 port 36740
Oct 14 07:43:19 np0005486759.ooo.test sshd[23509]: Connection closed by invalid user guest 165.227.171.84 port 36740 [preauth]
Oct 14 07:43:31 np0005486759.ooo.test sshd[23511]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:31 np0005486759.ooo.test sshd[23511]: Connection closed by authenticating user root 165.227.171.84 port 59772 [preauth]
Oct 14 07:43:31 np0005486759.ooo.test sshd[23513]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:32 np0005486759.ooo.test sshd[23513]: Invalid user odoo from 165.227.171.84 port 54184
Oct 14 07:43:32 np0005486759.ooo.test sshd[23513]: Connection closed by invalid user odoo 165.227.171.84 port 54184 [preauth]
Oct 14 07:43:32 np0005486759.ooo.test sshd[23515]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:32 np0005486759.ooo.test sshd[23515]: Invalid user elastic from 165.227.171.84 port 54198
Oct 14 07:43:33 np0005486759.ooo.test sshd[23515]: Connection closed by invalid user elastic 165.227.171.84 port 54198 [preauth]
Oct 14 07:43:33 np0005486759.ooo.test sshd[23517]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:33 np0005486759.ooo.test sshd[23517]: Invalid user vyos from 165.227.171.84 port 54208
Oct 14 07:43:33 np0005486759.ooo.test sshd[23517]: Connection closed by invalid user vyos 165.227.171.84 port 54208 [preauth]
Oct 14 07:43:40 np0005486759.ooo.test sshd[23520]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:41 np0005486759.ooo.test sshd[23520]: Connection closed by authenticating user root 165.227.171.84 port 54214 [preauth]
Oct 14 07:43:41 np0005486759.ooo.test sshd[23522]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:42 np0005486759.ooo.test sshd[23522]: Invalid user postgres from 165.227.171.84 port 42936
Oct 14 07:43:42 np0005486759.ooo.test sshd[23522]: Connection closed by invalid user postgres 165.227.171.84 port 42936 [preauth]
Oct 14 07:43:46 np0005486759.ooo.test sshd[23524]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:47 np0005486759.ooo.test sshd[23524]: Connection closed by authenticating user root 165.227.171.84 port 42940 [preauth]
Oct 14 07:43:49 np0005486759.ooo.test sshd[23526]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:50 np0005486759.ooo.test sshd[23526]: Connection closed by authenticating user root 165.227.171.84 port 42942 [preauth]
Oct 14 07:43:52 np0005486759.ooo.test sshd[23528]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:43:52 np0005486759.ooo.test sshd[23528]: Invalid user test from 165.227.171.84 port 33160
Oct 14 07:43:53 np0005486759.ooo.test sshd[23528]: Connection closed by invalid user test 165.227.171.84 port 33160 [preauth]
Oct 14 07:44:00 np0005486759.ooo.test sshd[23530]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:01 np0005486759.ooo.test sshd[23530]: Invalid user odroid from 165.227.171.84 port 33162
Oct 14 07:44:01 np0005486759.ooo.test sshd[23530]: Connection closed by invalid user odroid 165.227.171.84 port 33162 [preauth]
Oct 14 07:44:03 np0005486759.ooo.test sshd[23532]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:03 np0005486759.ooo.test sshd[23532]: Invalid user user from 165.227.171.84 port 45936
Oct 14 07:44:04 np0005486759.ooo.test sshd[23532]: Connection closed by invalid user user 165.227.171.84 port 45936 [preauth]
Oct 14 07:44:05 np0005486759.ooo.test sshd[23534]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:05 np0005486759.ooo.test sshd[23534]: Invalid user admin from 165.227.171.84 port 45942
Oct 14 07:44:05 np0005486759.ooo.test sshd[23534]: Connection closed by invalid user admin 165.227.171.84 port 45942 [preauth]
Oct 14 07:44:07 np0005486759.ooo.test sshd[23536]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:08 np0005486759.ooo.test sshd[23536]: Connection closed by authenticating user root 165.227.171.84 port 45956 [preauth]
Oct 14 07:44:11 np0005486759.ooo.test sshd[23538]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:12 np0005486759.ooo.test sshd[23538]: Invalid user ubuntu from 165.227.171.84 port 45958
Oct 14 07:44:12 np0005486759.ooo.test sshd[23538]: Connection closed by invalid user ubuntu 165.227.171.84 port 45958 [preauth]
Oct 14 07:44:14 np0005486759.ooo.test sshd[23540]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:15 np0005486759.ooo.test sshd[23540]: Connection closed by authenticating user root 165.227.171.84 port 49006 [preauth]
Oct 14 07:44:18 np0005486759.ooo.test sshd[23542]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:19 np0005486759.ooo.test sshd[23542]: Invalid user minecraft from 165.227.171.84 port 49022
Oct 14 07:44:19 np0005486759.ooo.test sshd[23542]: Connection closed by invalid user minecraft 165.227.171.84 port 49022 [preauth]
Oct 14 07:44:19 np0005486759.ooo.test sshd[23544]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:19 np0005486759.ooo.test sshd[23544]: Invalid user jenkins from 165.227.171.84 port 49038
Oct 14 07:44:19 np0005486759.ooo.test sshd[23544]: Connection closed by invalid user jenkins 165.227.171.84 port 49038 [preauth]
Oct 14 07:44:24 np0005486759.ooo.test sshd[23546]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:24 np0005486759.ooo.test sshd[23546]: Connection closed by authenticating user root 165.227.171.84 port 55670 [preauth]
Oct 14 07:44:24 np0005486759.ooo.test sshd[23548]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:25 np0005486759.ooo.test sshd[23548]: Invalid user testuser from 165.227.171.84 port 55680
Oct 14 07:44:25 np0005486759.ooo.test sshd[23548]: Connection closed by invalid user testuser 165.227.171.84 port 55680 [preauth]
Oct 14 07:44:28 np0005486759.ooo.test sshd[23550]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:29 np0005486759.ooo.test sshd[23550]: Invalid user steam from 165.227.171.84 port 55684
Oct 14 07:44:29 np0005486759.ooo.test sshd[23550]: Connection closed by invalid user steam 165.227.171.84 port 55684 [preauth]
Oct 14 07:44:29 np0005486759.ooo.test sshd[23552]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:30 np0005486759.ooo.test sshd[23552]: Invalid user linaro from 165.227.171.84 port 55688
Oct 14 07:44:30 np0005486759.ooo.test sshd[23552]: Connection closed by invalid user linaro 165.227.171.84 port 55688 [preauth]
Oct 14 07:44:30 np0005486759.ooo.test sshd[23554]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:31 np0005486759.ooo.test sshd[23554]: Connection closed by authenticating user root 165.227.171.84 port 35730 [preauth]
Oct 14 07:44:33 np0005486759.ooo.test sshd[23556]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:34 np0005486759.ooo.test sshd[23556]: Invalid user debian from 165.227.171.84 port 35744
Oct 14 07:44:34 np0005486759.ooo.test sshd[23556]: Connection closed by invalid user debian 165.227.171.84 port 35744 [preauth]
Oct 14 07:44:35 np0005486759.ooo.test sshd[23558]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:35 np0005486759.ooo.test sshd[23558]: Invalid user mysql from 165.227.171.84 port 35756
Oct 14 07:44:35 np0005486759.ooo.test sshd[23558]: Connection closed by invalid user mysql 165.227.171.84 port 35756 [preauth]
Oct 14 07:44:37 np0005486759.ooo.test sshd[23560]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:38 np0005486759.ooo.test sshd[23560]: Invalid user oracle from 165.227.171.84 port 35760
Oct 14 07:44:38 np0005486759.ooo.test sshd[23560]: Connection closed by invalid user oracle 165.227.171.84 port 35760 [preauth]
Oct 14 07:44:38 np0005486759.ooo.test sshd[23562]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:38 np0005486759.ooo.test sshd[23562]: Invalid user db2inst1 from 165.227.171.84 port 35762
Oct 14 07:44:38 np0005486759.ooo.test sshd[23562]: Connection closed by invalid user db2inst1 165.227.171.84 port 35762 [preauth]
Oct 14 07:44:39 np0005486759.ooo.test sshd[23564]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:39 np0005486759.ooo.test sshd[23564]: Invalid user admin from 165.227.171.84 port 35768
Oct 14 07:44:39 np0005486759.ooo.test sshd[23564]: Connection closed by invalid user admin 165.227.171.84 port 35768 [preauth]
Oct 14 07:44:42 np0005486759.ooo.test sshd[23567]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:42 np0005486759.ooo.test sshd[23567]: Invalid user ubuntu from 165.227.171.84 port 37672
Oct 14 07:44:42 np0005486759.ooo.test sshd[23567]: Connection closed by invalid user ubuntu 165.227.171.84 port 37672 [preauth]
Oct 14 07:44:42 np0005486759.ooo.test sshd[23569]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:43 np0005486759.ooo.test sshd[23569]: Invalid user git from 165.227.171.84 port 37682
Oct 14 07:44:43 np0005486759.ooo.test sshd[23569]: Connection closed by invalid user git 165.227.171.84 port 37682 [preauth]
Oct 14 07:44:43 np0005486759.ooo.test sshd[23571]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:44 np0005486759.ooo.test sshd[23571]: Connection closed by authenticating user root 165.227.171.84 port 37684 [preauth]
Oct 14 07:44:44 np0005486759.ooo.test sshd[23573]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:44 np0005486759.ooo.test sshd[23573]: Invalid user user from 165.227.171.84 port 37686
Oct 14 07:44:44 np0005486759.ooo.test sshd[23573]: Connection closed by invalid user user 165.227.171.84 port 37686 [preauth]
Oct 14 07:44:44 np0005486759.ooo.test sshd[23575]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:45 np0005486759.ooo.test sshd[23575]: Connection closed by authenticating user root 165.227.171.84 port 37700 [preauth]
Oct 14 07:44:46 np0005486759.ooo.test sshd[23577]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:47 np0005486759.ooo.test sshd[23577]: Invalid user pi from 165.227.171.84 port 37712
Oct 14 07:44:47 np0005486759.ooo.test sshd[23577]: Connection closed by invalid user pi 165.227.171.84 port 37712 [preauth]
Oct 14 07:44:49 np0005486759.ooo.test sshd[23579]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:49 np0005486759.ooo.test sshd[23579]: Invalid user ansible from 165.227.171.84 port 37722
Oct 14 07:44:49 np0005486759.ooo.test sshd[23579]: Connection closed by invalid user ansible 165.227.171.84 port 37722 [preauth]
Oct 14 07:44:50 np0005486759.ooo.test sshd[23581]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:51 np0005486759.ooo.test sshd[23581]: Invalid user ftpuser from 165.227.171.84 port 59012
Oct 14 07:44:51 np0005486759.ooo.test sshd[23581]: Connection closed by invalid user ftpuser 165.227.171.84 port 59012 [preauth]
Oct 14 07:44:55 np0005486759.ooo.test sshd[23583]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:56 np0005486759.ooo.test sshd[23583]: Invalid user ts3 from 165.227.171.84 port 59026
Oct 14 07:44:56 np0005486759.ooo.test sshd[23583]: Connection closed by invalid user ts3 165.227.171.84 port 59026 [preauth]
Oct 14 07:44:57 np0005486759.ooo.test sshd[23585]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:58 np0005486759.ooo.test sshd[23585]: Invalid user deployer from 165.227.171.84 port 59036
Oct 14 07:44:58 np0005486759.ooo.test sshd[23585]: Connection closed by invalid user deployer 165.227.171.84 port 59036 [preauth]
Oct 14 07:44:58 np0005486759.ooo.test sshd[23587]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:44:58 np0005486759.ooo.test sshd[23587]: Connection closed by authenticating user root 165.227.171.84 port 59048 [preauth]
Oct 14 07:45:00 np0005486759.ooo.test sshd[23589]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:00 np0005486759.ooo.test sshd[23589]: Invalid user hadoop from 165.227.171.84 port 59058
Oct 14 07:45:01 np0005486759.ooo.test sshd[23589]: Connection closed by invalid user hadoop 165.227.171.84 port 59058 [preauth]
Oct 14 07:45:02 np0005486759.ooo.test sshd[23591]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:02 np0005486759.ooo.test sshd[23591]: Invalid user dspace from 165.227.171.84 port 44114
Oct 14 07:45:02 np0005486759.ooo.test sshd[23591]: Connection closed by invalid user dspace 165.227.171.84 port 44114 [preauth]
Oct 14 07:45:07 np0005486759.ooo.test sshd[23593]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:07 np0005486759.ooo.test sshd[23593]: Invalid user fa from 165.227.171.84 port 44116
Oct 14 07:45:07 np0005486759.ooo.test sshd[23593]: Connection closed by invalid user fa 165.227.171.84 port 44116 [preauth]
Oct 14 07:45:09 np0005486759.ooo.test sshd[23595]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:11 np0005486759.ooo.test sshd[23595]: Invalid user deploy from 165.227.171.84 port 44120
Oct 14 07:45:11 np0005486759.ooo.test sshd[23595]: Connection closed by invalid user deploy 165.227.171.84 port 44120 [preauth]
Oct 14 07:45:15 np0005486759.ooo.test sshd[23597]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:16 np0005486759.ooo.test sshd[23597]: Invalid user user from 165.227.171.84 port 41330
Oct 14 07:45:16 np0005486759.ooo.test sshd[23597]: Connection closed by invalid user user 165.227.171.84 port 41330 [preauth]
Oct 14 07:45:21 np0005486759.ooo.test sshd[23600]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:22 np0005486759.ooo.test sshd[23600]: Connection closed by authenticating user root 165.227.171.84 port 41342 [preauth]
Oct 14 07:45:22 np0005486759.ooo.test sshd[23602]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:22 np0005486759.ooo.test sshd[23602]: Invalid user es from 165.227.171.84 port 38960
Oct 14 07:45:23 np0005486759.ooo.test sshd[23602]: Connection closed by invalid user es 165.227.171.84 port 38960 [preauth]
Oct 14 07:45:24 np0005486759.ooo.test sshd[23604]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:24 np0005486759.ooo.test sshd[23604]: Invalid user vagrant from 165.227.171.84 port 38976
Oct 14 07:45:24 np0005486759.ooo.test sshd[23604]: Connection closed by invalid user vagrant 165.227.171.84 port 38976 [preauth]
Oct 14 07:45:28 np0005486759.ooo.test sshd[23606]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:28 np0005486759.ooo.test sshd[23606]: Connection closed by authenticating user root 165.227.171.84 port 38992 [preauth]
Oct 14 07:45:41 np0005486759.ooo.test sshd[23608]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:42 np0005486759.ooo.test sshd[23608]: Connection closed by authenticating user root 165.227.171.84 port 48032 [preauth]
Oct 14 07:45:51 np0005486759.ooo.test sshd[23610]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:51 np0005486759.ooo.test sshd[23610]: Invalid user odroid from 165.227.171.84 port 42510
Oct 14 07:45:52 np0005486759.ooo.test sshd[23610]: Connection closed by invalid user odroid 165.227.171.84 port 42510 [preauth]
Oct 14 07:45:52 np0005486759.ooo.test sshd[23612]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:52 np0005486759.ooo.test sshd[23612]: Invalid user vpn from 165.227.171.84 port 48840
Oct 14 07:45:53 np0005486759.ooo.test sshd[23612]: Connection closed by invalid user vpn 165.227.171.84 port 48840 [preauth]
Oct 14 07:45:53 np0005486759.ooo.test sshd[23614]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:45:53 np0005486759.ooo.test sshd[23614]: Invalid user test from 165.227.171.84 port 48850
Oct 14 07:45:53 np0005486759.ooo.test sshd[23614]: Connection closed by invalid user test 165.227.171.84 port 48850 [preauth]
Oct 14 07:46:15 np0005486759.ooo.test sshd[23617]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:16 np0005486759.ooo.test sshd[23617]: Invalid user hadoop from 165.227.171.84 port 33298
Oct 14 07:46:17 np0005486759.ooo.test sshd[23617]: Connection closed by invalid user hadoop 165.227.171.84 port 33298 [preauth]
Oct 14 07:46:18 np0005486759.ooo.test sshd[23619]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:18 np0005486759.ooo.test sshd[23619]: Connection closed by authenticating user root 165.227.171.84 port 33306 [preauth]
Oct 14 07:46:18 np0005486759.ooo.test sshd[23621]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:19 np0005486759.ooo.test sshd[23621]: Invalid user pi from 165.227.171.84 port 33316
Oct 14 07:46:19 np0005486759.ooo.test sshd[23621]: Connection closed by invalid user pi 165.227.171.84 port 33316 [preauth]
Oct 14 07:46:19 np0005486759.ooo.test sshd[23623]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:20 np0005486759.ooo.test sshd[23623]: Connection closed by authenticating user root 165.227.171.84 port 38962 [preauth]
Oct 14 07:46:20 np0005486759.ooo.test sshd[23625]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:21 np0005486759.ooo.test sshd[23625]: Invalid user postgres from 165.227.171.84 port 38968
Oct 14 07:46:21 np0005486759.ooo.test sshd[23625]: Connection closed by invalid user postgres 165.227.171.84 port 38968 [preauth]
Oct 14 07:46:25 np0005486759.ooo.test sshd[23627]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:25 np0005486759.ooo.test sshd[23627]: Invalid user mysql from 165.227.171.84 port 38970
Oct 14 07:46:26 np0005486759.ooo.test sshd[23627]: Connection closed by invalid user mysql 165.227.171.84 port 38970 [preauth]
Oct 14 07:46:28 np0005486759.ooo.test sshd[23629]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:28 np0005486759.ooo.test sshd[23629]: Invalid user odoo from 165.227.171.84 port 38978
Oct 14 07:46:28 np0005486759.ooo.test sshd[23629]: Connection closed by invalid user odoo 165.227.171.84 port 38978 [preauth]
Oct 14 07:46:30 np0005486759.ooo.test sshd[23631]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:31 np0005486759.ooo.test sshd[23631]: Invalid user oracle from 165.227.171.84 port 38988
Oct 14 07:46:31 np0005486759.ooo.test sshd[23631]: Connection closed by invalid user oracle 165.227.171.84 port 38988 [preauth]
Oct 14 07:46:51 np0005486759.ooo.test sshd[23633]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:51 np0005486759.ooo.test sshd[23633]: Invalid user user from 165.227.171.84 port 50154
Oct 14 07:46:52 np0005486759.ooo.test sshd[23633]: Connection closed by invalid user user 165.227.171.84 port 50154 [preauth]
Oct 14 07:46:52 np0005486759.ooo.test sshd[23635]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:52 np0005486759.ooo.test sshd[23635]: Invalid user db2inst1 from 165.227.171.84 port 38976
Oct 14 07:46:52 np0005486759.ooo.test sshd[23635]: Connection closed by invalid user db2inst1 165.227.171.84 port 38976 [preauth]
Oct 14 07:46:53 np0005486759.ooo.test sshd[23637]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:54 np0005486759.ooo.test sshd[23637]: Invalid user deploy from 165.227.171.84 port 38984
Oct 14 07:46:54 np0005486759.ooo.test sshd[23637]: Connection closed by invalid user deploy 165.227.171.84 port 38984 [preauth]
Oct 14 07:46:54 np0005486759.ooo.test sshd[23639]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:54 np0005486759.ooo.test sshd[23639]: Invalid user ubnt from 165.227.171.84 port 38990
Oct 14 07:46:55 np0005486759.ooo.test sshd[23639]: Connection closed by invalid user ubnt 165.227.171.84 port 38990 [preauth]
Oct 14 07:46:59 np0005486759.ooo.test sshd[23641]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:46:59 np0005486759.ooo.test sshd[23641]: Invalid user guest from 165.227.171.84 port 39000
Oct 14 07:47:00 np0005486759.ooo.test sshd[23641]: Connection closed by invalid user guest 165.227.171.84 port 39000 [preauth]
Oct 14 07:47:00 np0005486759.ooo.test sshd[23643]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:00 np0005486759.ooo.test sshd[23643]: Invalid user odoo18 from 165.227.171.84 port 48030
Oct 14 07:47:00 np0005486759.ooo.test sshd[23643]: Connection closed by invalid user odoo18 165.227.171.84 port 48030 [preauth]
Oct 14 07:47:08 np0005486759.ooo.test sshd[23645]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:08 np0005486759.ooo.test sshd[23645]: Connection closed by authenticating user root 165.227.171.84 port 48036 [preauth]
Oct 14 07:47:11 np0005486759.ooo.test sshd[23647]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:12 np0005486759.ooo.test sshd[23647]: Invalid user devopsuser from 165.227.171.84 port 48044
Oct 14 07:47:12 np0005486759.ooo.test sshd[23647]: Connection closed by invalid user devopsuser 165.227.171.84 port 48044 [preauth]
Oct 14 07:47:31 np0005486759.ooo.test sshd[23649]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:32 np0005486759.ooo.test sshd[23649]: Connection closed by authenticating user root 165.227.171.84 port 46920 [preauth]
Oct 14 07:47:34 np0005486759.ooo.test sshd[23651]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:34 np0005486759.ooo.test sshd[23651]: Invalid user es from 165.227.171.84 port 46440
Oct 14 07:47:34 np0005486759.ooo.test sshd[23651]: Connection closed by invalid user es 165.227.171.84 port 46440 [preauth]
Oct 14 07:47:34 np0005486759.ooo.test sshd[23653]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:34 np0005486759.ooo.test sshd[23653]: Invalid user ts3 from 165.227.171.84 port 46452
Oct 14 07:47:35 np0005486759.ooo.test sshd[23653]: Connection closed by invalid user ts3 165.227.171.84 port 46452 [preauth]
Oct 14 07:47:37 np0005486759.ooo.test sshd[23655]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:37 np0005486759.ooo.test sshd[23655]: Connection closed by authenticating user root 165.227.171.84 port 46458 [preauth]
Oct 14 07:47:37 np0005486759.ooo.test sshd[23657]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:38 np0005486759.ooo.test sshd[23657]: Connection closed by authenticating user root 165.227.171.84 port 46474 [preauth]
Oct 14 07:47:38 np0005486759.ooo.test sshd[23659]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:38 np0005486759.ooo.test sshd[23659]: Invalid user orangepi from 165.227.171.84 port 46488
Oct 14 07:47:39 np0005486759.ooo.test sshd[23659]: Connection closed by invalid user orangepi 165.227.171.84 port 46488 [preauth]
Oct 14 07:47:39 np0005486759.ooo.test sshd[23661]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:39 np0005486759.ooo.test sshd[23661]: Invalid user ubuntu from 165.227.171.84 port 46502
Oct 14 07:47:39 np0005486759.ooo.test sshd[23661]: Connection closed by invalid user ubuntu 165.227.171.84 port 46502 [preauth]
Oct 14 07:47:44 np0005486759.ooo.test sshd[23663]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:45 np0005486759.ooo.test sshd[23663]: Invalid user postgres from 165.227.171.84 port 40384
Oct 14 07:47:45 np0005486759.ooo.test sshd[23663]: Connection closed by invalid user postgres 165.227.171.84 port 40384 [preauth]
Oct 14 07:47:45 np0005486759.ooo.test sshd[23665]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:46 np0005486759.ooo.test sshd[23665]: Invalid user git from 165.227.171.84 port 40392
Oct 14 07:47:46 np0005486759.ooo.test sshd[23665]: Connection closed by invalid user git 165.227.171.84 port 40392 [preauth]
Oct 14 07:47:46 np0005486759.ooo.test sshd[23667]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:46 np0005486759.ooo.test sshd[23667]: Invalid user linaro from 165.227.171.84 port 40400
Oct 14 07:47:46 np0005486759.ooo.test sshd[23667]: Connection closed by invalid user linaro 165.227.171.84 port 40400 [preauth]
Oct 14 07:47:46 np0005486759.ooo.test sshd[23669]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:47 np0005486759.ooo.test sshd[23669]: Invalid user admin from 165.227.171.84 port 40406
Oct 14 07:47:47 np0005486759.ooo.test sshd[23669]: Connection closed by invalid user admin 165.227.171.84 port 40406 [preauth]
Oct 14 07:47:47 np0005486759.ooo.test sshd[23671]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:47 np0005486759.ooo.test sshd[23671]: Invalid user ftpuser from 165.227.171.84 port 40418
Oct 14 07:47:48 np0005486759.ooo.test sshd[23671]: Connection closed by invalid user ftpuser 165.227.171.84 port 40418 [preauth]
Oct 14 07:47:48 np0005486759.ooo.test sshd[23673]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:48 np0005486759.ooo.test sshd[23673]: Invalid user deploy from 165.227.171.84 port 40420
Oct 14 07:47:48 np0005486759.ooo.test sshd[23673]: Connection closed by invalid user deploy 165.227.171.84 port 40420 [preauth]
Oct 14 07:47:51 np0005486759.ooo.test sshd[23675]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:52 np0005486759.ooo.test sshd[23675]: Connection closed by authenticating user root 165.227.171.84 port 40426 [preauth]
Oct 14 07:47:52 np0005486759.ooo.test sshd[23677]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:53 np0005486759.ooo.test sshd[23677]: Connection closed by authenticating user root 165.227.171.84 port 59494 [preauth]
Oct 14 07:47:53 np0005486759.ooo.test sshd[23679]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:47:53 np0005486759.ooo.test sshd[23679]: Invalid user oracle from 165.227.171.84 port 59500
Oct 14 07:47:54 np0005486759.ooo.test sshd[23679]: Connection closed by invalid user oracle 165.227.171.84 port 59500 [preauth]
Oct 14 07:48:01 np0005486759.ooo.test sshd[23681]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:02 np0005486759.ooo.test sshd[23681]: Connection closed by authenticating user root 165.227.171.84 port 59502 [preauth]
Oct 14 07:48:07 np0005486759.ooo.test sshd[23683]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:07 np0005486759.ooo.test sshd[23683]: Invalid user jenkins from 165.227.171.84 port 51024
Oct 14 07:48:07 np0005486759.ooo.test sshd[23683]: Connection closed by invalid user jenkins 165.227.171.84 port 51024 [preauth]
Oct 14 07:48:07 np0005486759.ooo.test sshd[23685]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:08 np0005486759.ooo.test sshd[23685]: Invalid user devops from 165.227.171.84 port 51032
Oct 14 07:48:08 np0005486759.ooo.test sshd[23685]: Connection closed by invalid user devops 165.227.171.84 port 51032 [preauth]
Oct 14 07:48:15 np0005486759.ooo.test sshd[23687]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:16 np0005486759.ooo.test sshd[23687]: Connection closed by authenticating user root 165.227.171.84 port 51044 [preauth]
Oct 14 07:48:19 np0005486759.ooo.test sshd[23689]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:20 np0005486759.ooo.test sshd[23689]: Connection closed by authenticating user root 165.227.171.84 port 49806 [preauth]
Oct 14 07:48:27 np0005486759.ooo.test sshd[23691]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:28 np0005486759.ooo.test sshd[23691]: Connection closed by authenticating user root 165.227.171.84 port 43040 [preauth]
Oct 14 07:48:28 np0005486759.ooo.test sshd[23693]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:28 np0005486759.ooo.test sshd[23693]: Invalid user postgres from 165.227.171.84 port 43048
Oct 14 07:48:28 np0005486759.ooo.test sshd[23693]: Connection closed by invalid user postgres 165.227.171.84 port 43048 [preauth]
Oct 14 07:48:28 np0005486759.ooo.test sshd[23695]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:29 np0005486759.ooo.test sshd[23695]: Invalid user debian from 165.227.171.84 port 43050
Oct 14 07:48:29 np0005486759.ooo.test sshd[23695]: Connection closed by invalid user debian 165.227.171.84 port 43050 [preauth]
Oct 14 07:48:29 np0005486759.ooo.test sshd[23697]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:29 np0005486759.ooo.test sshd[23697]: Connection closed by authenticating user root 165.227.171.84 port 43058 [preauth]
Oct 14 07:48:31 np0005486759.ooo.test sshd[23699]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:31 np0005486759.ooo.test sshd[23699]: Invalid user user from 165.227.171.84 port 51156
Oct 14 07:48:31 np0005486759.ooo.test sshd[23699]: Connection closed by invalid user user 165.227.171.84 port 51156 [preauth]
Oct 14 07:48:34 np0005486759.ooo.test sshd[23701]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:35 np0005486759.ooo.test sshd[23701]: Invalid user steam from 165.227.171.84 port 51172
Oct 14 07:48:35 np0005486759.ooo.test sshd[23701]: Connection closed by invalid user steam 165.227.171.84 port 51172 [preauth]
Oct 14 07:48:35 np0005486759.ooo.test sshd[23703]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:35 np0005486759.ooo.test sshd[23703]: Invalid user dspace from 165.227.171.84 port 51174
Oct 14 07:48:35 np0005486759.ooo.test sshd[23703]: Connection closed by invalid user dspace 165.227.171.84 port 51174 [preauth]
Oct 14 07:48:36 np0005486759.ooo.test sshd[23705]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:36 np0005486759.ooo.test sshd[23705]: Invalid user user from 165.227.171.84 port 51188
Oct 14 07:48:36 np0005486759.ooo.test sshd[23705]: Connection closed by invalid user user 165.227.171.84 port 51188 [preauth]
Oct 14 07:48:44 np0005486759.ooo.test sshd[23707]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:48:44 np0005486759.ooo.test sshd[23707]: Connection closed by authenticating user root 165.227.171.84 port 51196 [preauth]
Oct 14 07:51:01 np0005486759.ooo.test anacron[18911]: Job `cron.monthly' started
Oct 14 07:51:01 np0005486759.ooo.test anacron[18911]: Job `cron.monthly' terminated
Oct 14 07:51:01 np0005486759.ooo.test anacron[18911]: Normal exit (3 jobs run)
Oct 14 07:51:29 np0005486759.ooo.test sshd[23712]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:51:33 np0005486759.ooo.test sshd[23712]: error: maximum authentication attempts exceeded for root from 49.164.171.171 port 44335 ssh2 [preauth]
Oct 14 07:51:33 np0005486759.ooo.test sshd[23712]: Disconnecting authenticating user root 49.164.171.171 port 44335: Too many authentication failures [preauth]
Oct 14 07:51:34 np0005486759.ooo.test sshd[23714]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:51:38 np0005486759.ooo.test sshd[23714]: error: maximum authentication attempts exceeded for root from 49.164.171.171 port 44537 ssh2 [preauth]
Oct 14 07:51:38 np0005486759.ooo.test sshd[23714]: Disconnecting authenticating user root 49.164.171.171 port 44537: Too many authentication failures [preauth]
Oct 14 07:51:39 np0005486759.ooo.test sshd[23716]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:51:43 np0005486759.ooo.test sshd[23716]: error: maximum authentication attempts exceeded for root from 49.164.171.171 port 44730 ssh2 [preauth]
Oct 14 07:51:43 np0005486759.ooo.test sshd[23716]: Disconnecting authenticating user root 49.164.171.171 port 44730: Too many authentication failures [preauth]
Oct 14 07:51:43 np0005486759.ooo.test sshd[23719]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:51:47 np0005486759.ooo.test sshd[23719]: Received disconnect from 49.164.171.171 port 44931:11: disconnected by user [preauth]
Oct 14 07:51:47 np0005486759.ooo.test sshd[23719]: Disconnected from authenticating user root 49.164.171.171 port 44931 [preauth]
Oct 14 07:51:48 np0005486759.ooo.test sshd[23721]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:51:52 np0005486759.ooo.test sshd[23721]: Invalid user admin from 49.164.171.171 port 45109
Oct 14 07:51:53 np0005486759.ooo.test sshd[23721]: error: maximum authentication attempts exceeded for invalid user admin from 49.164.171.171 port 45109 ssh2 [preauth]
Oct 14 07:51:53 np0005486759.ooo.test sshd[23721]: Disconnecting invalid user admin 49.164.171.171 port 45109: Too many authentication failures [preauth]
Oct 14 07:51:54 np0005486759.ooo.test sshd[23723]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:51:59 np0005486759.ooo.test sshd[23723]: Invalid user admin from 49.164.171.171 port 45333
Oct 14 07:52:00 np0005486759.ooo.test sshd[23723]: error: maximum authentication attempts exceeded for invalid user admin from 49.164.171.171 port 45333 ssh2 [preauth]
Oct 14 07:52:00 np0005486759.ooo.test sshd[23723]: Disconnecting invalid user admin 49.164.171.171 port 45333: Too many authentication failures [preauth]
Oct 14 07:52:01 np0005486759.ooo.test sshd[23725]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:07 np0005486759.ooo.test sshd[23725]: Invalid user admin from 49.164.171.171 port 45661
Oct 14 07:52:08 np0005486759.ooo.test sshd[23725]: Received disconnect from 49.164.171.171 port 45661:11: disconnected by user [preauth]
Oct 14 07:52:08 np0005486759.ooo.test sshd[23725]: Disconnected from invalid user admin 49.164.171.171 port 45661 [preauth]
Oct 14 07:52:08 np0005486759.ooo.test sshd[23727]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:14 np0005486759.ooo.test sshd[23727]: Invalid user oracle from 49.164.171.171 port 45916
Oct 14 07:52:15 np0005486759.ooo.test sshd[23727]: error: maximum authentication attempts exceeded for invalid user oracle from 49.164.171.171 port 45916 ssh2 [preauth]
Oct 14 07:52:15 np0005486759.ooo.test sshd[23727]: Disconnecting invalid user oracle 49.164.171.171 port 45916: Too many authentication failures [preauth]
Oct 14 07:52:16 np0005486759.ooo.test sshd[23729]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:20 np0005486759.ooo.test sshd[23729]: Invalid user oracle from 49.164.171.171 port 46229
Oct 14 07:52:21 np0005486759.ooo.test sshd[23729]: error: maximum authentication attempts exceeded for invalid user oracle from 49.164.171.171 port 46229 ssh2 [preauth]
Oct 14 07:52:21 np0005486759.ooo.test sshd[23729]: Disconnecting invalid user oracle 49.164.171.171 port 46229: Too many authentication failures [preauth]
Oct 14 07:52:22 np0005486759.ooo.test sshd[23731]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:25 np0005486759.ooo.test sshd[23731]: Invalid user oracle from 49.164.171.171 port 46455
Oct 14 07:52:26 np0005486759.ooo.test sshd[23731]: Received disconnect from 49.164.171.171 port 46455:11: disconnected by user [preauth]
Oct 14 07:52:26 np0005486759.ooo.test sshd[23731]: Disconnected from invalid user oracle 49.164.171.171 port 46455 [preauth]
Oct 14 07:52:26 np0005486759.ooo.test sshd[23733]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:31 np0005486759.ooo.test sshd[23733]: Invalid user usuario from 49.164.171.171 port 46645
Oct 14 07:52:32 np0005486759.ooo.test sshd[23733]: error: maximum authentication attempts exceeded for invalid user usuario from 49.164.171.171 port 46645 ssh2 [preauth]
Oct 14 07:52:32 np0005486759.ooo.test sshd[23733]: Disconnecting invalid user usuario 49.164.171.171 port 46645: Too many authentication failures [preauth]
Oct 14 07:52:32 np0005486759.ooo.test sshd[23735]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:38 np0005486759.ooo.test sshd[23735]: Invalid user usuario from 49.164.171.171 port 46900
Oct 14 07:52:39 np0005486759.ooo.test sshd[23735]: error: maximum authentication attempts exceeded for invalid user usuario from 49.164.171.171 port 46900 ssh2 [preauth]
Oct 14 07:52:39 np0005486759.ooo.test sshd[23735]: Disconnecting invalid user usuario 49.164.171.171 port 46900: Too many authentication failures [preauth]
Oct 14 07:52:39 np0005486759.ooo.test sshd[23737]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:46 np0005486759.ooo.test sshd[23737]: Invalid user usuario from 49.164.171.171 port 47190
Oct 14 07:52:46 np0005486759.ooo.test sshd[23737]: Received disconnect from 49.164.171.171 port 47190:11: disconnected by user [preauth]
Oct 14 07:52:46 np0005486759.ooo.test sshd[23737]: Disconnected from invalid user usuario 49.164.171.171 port 47190 [preauth]
Oct 14 07:52:47 np0005486759.ooo.test sshd[23739]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:51 np0005486759.ooo.test sshd[23739]: Invalid user test from 49.164.171.171 port 47480
Oct 14 07:52:52 np0005486759.ooo.test sshd[23739]: error: maximum authentication attempts exceeded for invalid user test from 49.164.171.171 port 47480 ssh2 [preauth]
Oct 14 07:52:52 np0005486759.ooo.test sshd[23739]: Disconnecting invalid user test 49.164.171.171 port 47480: Too many authentication failures [preauth]
Oct 14 07:52:52 np0005486759.ooo.test sshd[23741]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:52:58 np0005486759.ooo.test sshd[23741]: Invalid user test from 49.164.171.171 port 47701
Oct 14 07:52:59 np0005486759.ooo.test sshd[23741]: error: maximum authentication attempts exceeded for invalid user test from 49.164.171.171 port 47701 ssh2 [preauth]
Oct 14 07:52:59 np0005486759.ooo.test sshd[23741]: Disconnecting invalid user test 49.164.171.171 port 47701: Too many authentication failures [preauth]
Oct 14 07:52:59 np0005486759.ooo.test sshd[23743]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:04 np0005486759.ooo.test sshd[23743]: Invalid user test from 49.164.171.171 port 48003
Oct 14 07:53:04 np0005486759.ooo.test sshd[23743]: Received disconnect from 49.164.171.171 port 48003:11: disconnected by user [preauth]
Oct 14 07:53:04 np0005486759.ooo.test sshd[23743]: Disconnected from invalid user test 49.164.171.171 port 48003 [preauth]
Oct 14 07:53:05 np0005486759.ooo.test sshd[23745]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:09 np0005486759.ooo.test sshd[23745]: Invalid user user from 49.164.171.171 port 48201
Oct 14 07:53:10 np0005486759.ooo.test sshd[23745]: error: maximum authentication attempts exceeded for invalid user user from 49.164.171.171 port 48201 ssh2 [preauth]
Oct 14 07:53:10 np0005486759.ooo.test sshd[23745]: Disconnecting invalid user user 49.164.171.171 port 48201: Too many authentication failures [preauth]
Oct 14 07:53:11 np0005486759.ooo.test sshd[23747]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:15 np0005486759.ooo.test sshd[23747]: Invalid user user from 49.164.171.171 port 48461
Oct 14 07:53:16 np0005486759.ooo.test sshd[23747]: error: maximum authentication attempts exceeded for invalid user user from 49.164.171.171 port 48461 ssh2 [preauth]
Oct 14 07:53:16 np0005486759.ooo.test sshd[23747]: Disconnecting invalid user user 49.164.171.171 port 48461: Too many authentication failures [preauth]
Oct 14 07:53:17 np0005486759.ooo.test sshd[23749]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:21 np0005486759.ooo.test sshd[23749]: Invalid user user from 49.164.171.171 port 48699
Oct 14 07:53:22 np0005486759.ooo.test sshd[23749]: Received disconnect from 49.164.171.171 port 48699:11: disconnected by user [preauth]
Oct 14 07:53:22 np0005486759.ooo.test sshd[23749]: Disconnected from invalid user user 49.164.171.171 port 48699 [preauth]
Oct 14 07:53:22 np0005486759.ooo.test sshd[23751]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:26 np0005486759.ooo.test sshd[23751]: Invalid user ftpuser from 49.164.171.171 port 48899
Oct 14 07:53:27 np0005486759.ooo.test sshd[23751]: error: maximum authentication attempts exceeded for invalid user ftpuser from 49.164.171.171 port 48899 ssh2 [preauth]
Oct 14 07:53:27 np0005486759.ooo.test sshd[23751]: Disconnecting invalid user ftpuser 49.164.171.171 port 48899: Too many authentication failures [preauth]
Oct 14 07:53:27 np0005486759.ooo.test sshd[23753]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:31 np0005486759.ooo.test sshd[23753]: Invalid user ftpuser from 49.164.171.171 port 49117
Oct 14 07:53:32 np0005486759.ooo.test sshd[23753]: error: maximum authentication attempts exceeded for invalid user ftpuser from 49.164.171.171 port 49117 ssh2 [preauth]
Oct 14 07:53:32 np0005486759.ooo.test sshd[23753]: Disconnecting invalid user ftpuser 49.164.171.171 port 49117: Too many authentication failures [preauth]
Oct 14 07:53:33 np0005486759.ooo.test sshd[23755]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:37 np0005486759.ooo.test sshd[23755]: Invalid user ftpuser from 49.164.171.171 port 49360
Oct 14 07:53:38 np0005486759.ooo.test sshd[23755]: Received disconnect from 49.164.171.171 port 49360:11: disconnected by user [preauth]
Oct 14 07:53:38 np0005486759.ooo.test sshd[23755]: Disconnected from invalid user ftpuser 49.164.171.171 port 49360 [preauth]
Oct 14 07:53:38 np0005486759.ooo.test sshd[23757]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:43 np0005486759.ooo.test sshd[23757]: Invalid user test1 from 49.164.171.171 port 49545
Oct 14 07:53:44 np0005486759.ooo.test sshd[23757]: error: maximum authentication attempts exceeded for invalid user test1 from 49.164.171.171 port 49545 ssh2 [preauth]
Oct 14 07:53:44 np0005486759.ooo.test sshd[23757]: Disconnecting invalid user test1 49.164.171.171 port 49545: Too many authentication failures [preauth]
Oct 14 07:53:45 np0005486759.ooo.test sshd[23759]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:49 np0005486759.ooo.test sshd[23759]: Invalid user test1 from 49.164.171.171 port 49802
Oct 14 07:53:50 np0005486759.ooo.test sshd[23759]: error: maximum authentication attempts exceeded for invalid user test1 from 49.164.171.171 port 49802 ssh2 [preauth]
Oct 14 07:53:50 np0005486759.ooo.test sshd[23759]: Disconnecting invalid user test1 49.164.171.171 port 49802: Too many authentication failures [preauth]
Oct 14 07:53:51 np0005486759.ooo.test sshd[23762]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:53:57 np0005486759.ooo.test sshd[23762]: Invalid user test1 from 49.164.171.171 port 50045
Oct 14 07:53:57 np0005486759.ooo.test sshd[23762]: Received disconnect from 49.164.171.171 port 50045:11: disconnected by user [preauth]
Oct 14 07:53:57 np0005486759.ooo.test sshd[23762]: Disconnected from invalid user test1 49.164.171.171 port 50045 [preauth]
Oct 14 07:53:58 np0005486759.ooo.test sshd[23764]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:02 np0005486759.ooo.test sshd[23764]: Invalid user test2 from 49.164.171.171 port 50337
Oct 14 07:54:03 np0005486759.ooo.test sshd[23764]: error: maximum authentication attempts exceeded for invalid user test2 from 49.164.171.171 port 50337 ssh2 [preauth]
Oct 14 07:54:03 np0005486759.ooo.test sshd[23764]: Disconnecting invalid user test2 49.164.171.171 port 50337: Too many authentication failures [preauth]
Oct 14 07:54:03 np0005486759.ooo.test sshd[23766]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:07 np0005486759.ooo.test sshd[23766]: Invalid user test2 from 49.164.171.171 port 50542
Oct 14 07:54:08 np0005486759.ooo.test sshd[23766]: error: maximum authentication attempts exceeded for invalid user test2 from 49.164.171.171 port 50542 ssh2 [preauth]
Oct 14 07:54:08 np0005486759.ooo.test sshd[23766]: Disconnecting invalid user test2 49.164.171.171 port 50542: Too many authentication failures [preauth]
Oct 14 07:54:09 np0005486759.ooo.test sshd[23768]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:13 np0005486759.ooo.test sshd[23768]: Invalid user test2 from 49.164.171.171 port 50792
Oct 14 07:54:14 np0005486759.ooo.test sshd[23768]: Received disconnect from 49.164.171.171 port 50792:11: disconnected by user [preauth]
Oct 14 07:54:14 np0005486759.ooo.test sshd[23768]: Disconnected from invalid user test2 49.164.171.171 port 50792 [preauth]
Oct 14 07:54:14 np0005486759.ooo.test sshd[23770]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:20 np0005486759.ooo.test sshd[23770]: Invalid user ubuntu from 49.164.171.171 port 50990
Oct 14 07:54:21 np0005486759.ooo.test sshd[23770]: error: maximum authentication attempts exceeded for invalid user ubuntu from 49.164.171.171 port 50990 ssh2 [preauth]
Oct 14 07:54:21 np0005486759.ooo.test sshd[23770]: Disconnecting invalid user ubuntu 49.164.171.171 port 50990: Too many authentication failures [preauth]
Oct 14 07:54:21 np0005486759.ooo.test sshd[23772]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:27 np0005486759.ooo.test sshd[23772]: Invalid user ubuntu from 49.164.171.171 port 51301
Oct 14 07:54:28 np0005486759.ooo.test sshd[23772]: error: maximum authentication attempts exceeded for invalid user ubuntu from 49.164.171.171 port 51301 ssh2 [preauth]
Oct 14 07:54:28 np0005486759.ooo.test sshd[23772]: Disconnecting invalid user ubuntu 49.164.171.171 port 51301: Too many authentication failures [preauth]
Oct 14 07:54:29 np0005486759.ooo.test sshd[23774]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:34 np0005486759.ooo.test sshd[23774]: Invalid user ubuntu from 49.164.171.171 port 51575
Oct 14 07:54:35 np0005486759.ooo.test sshd[23774]: Received disconnect from 49.164.171.171 port 51575:11: disconnected by user [preauth]
Oct 14 07:54:35 np0005486759.ooo.test sshd[23774]: Disconnected from invalid user ubuntu 49.164.171.171 port 51575 [preauth]
Oct 14 07:54:35 np0005486759.ooo.test sshd[23776]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:41 np0005486759.ooo.test sshd[23776]: Invalid user pi from 49.164.171.171 port 51824
Oct 14 07:54:42 np0005486759.ooo.test sshd[23776]: Received disconnect from 49.164.171.171 port 51824:11: disconnected by user [preauth]
Oct 14 07:54:42 np0005486759.ooo.test sshd[23776]: Disconnected from invalid user pi 49.164.171.171 port 51824 [preauth]
Oct 14 07:54:42 np0005486759.ooo.test sshd[23778]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 07:54:46 np0005486759.ooo.test sshd[23778]: Invalid user baikal from 49.164.171.171 port 52117
Oct 14 07:54:46 np0005486759.ooo.test sshd[23778]: Received disconnect from 49.164.171.171 port 52117:11: disconnected by user [preauth]
Oct 14 07:54:46 np0005486759.ooo.test sshd[23778]: Disconnected from invalid user baikal 49.164.171.171 port 52117 [preauth]
Oct 14 08:01:01 np0005486759.ooo.test CROND[23783]: (root) CMD (run-parts /etc/cron.hourly)
Oct 14 08:01:01 np0005486759.ooo.test run-parts[23786]: (/etc/cron.hourly) starting 0anacron
Oct 14 08:01:01 np0005486759.ooo.test run-parts[23792]: (/etc/cron.hourly) finished 0anacron
Oct 14 08:01:01 np0005486759.ooo.test CROND[23782]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 14 08:04:46 np0005486759.ooo.test sshd[23794]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 08:04:47 np0005486759.ooo.test sshd[23794]: Accepted publickey for zuul from 192.168.122.100 port 53598 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 08:04:47 np0005486759.ooo.test systemd-logind[759]: New session 11 of user zuul.
Oct 14 08:04:47 np0005486759.ooo.test systemd[1]: Started Session 11 of User zuul.
Oct 14 08:04:47 np0005486759.ooo.test sshd[23794]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 08:04:47 np0005486759.ooo.test sudo[23840]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqusenpuainbqcrnwcqatqwtvychmqse ; /usr/bin/python3
Oct 14 08:04:47 np0005486759.ooo.test sudo[23840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:47 np0005486759.ooo.test python3[23842]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 14 08:04:47 np0005486759.ooo.test sudo[23840]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:48 np0005486759.ooo.test sudo[23885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umkumomkicxfhucbbmyvfafovwmusyoq ; /usr/bin/python3
Oct 14 08:04:48 np0005486759.ooo.test sudo[23885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:48 np0005486759.ooo.test python3[23887]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 08:04:48 np0005486759.ooo.test sudo[23885]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:48 np0005486759.ooo.test sudo[23905]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjovxrqsntisqvmxcizjdeilbfbmxeaa ; /usr/bin/python3
Oct 14 08:04:48 np0005486759.ooo.test sudo[23905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:48 np0005486759.ooo.test python3[23907]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486759.ooo.test update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 14 08:04:48 np0005486759.ooo.test useradd[23909]: new group: name=tripleo-admin, GID=1002
Oct 14 08:04:48 np0005486759.ooo.test useradd[23909]: new user: name=tripleo-admin, UID=1002, GID=1002, home=/home/tripleo-admin, shell=/bin/bash, from=none
Oct 14 08:04:48 np0005486759.ooo.test sudo[23905]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:49 np0005486759.ooo.test sudo[23961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkecsyzojakyohjqyftoizoiuepzvudh ; /usr/bin/python3
Oct 14 08:04:49 np0005486759.ooo.test sudo[23961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:49 np0005486759.ooo.test python3[23963]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:04:49 np0005486759.ooo.test sudo[23961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:49 np0005486759.ooo.test sudo[24004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cunmnfqjnmibcsqmdbuqunsknhjuryyo ; /usr/bin/python3
Oct 14 08:04:49 np0005486759.ooo.test sudo[24004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:49 np0005486759.ooo.test python3[24006]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760429088.9879215-94220-249871567505956/source _original_basename=tmpdi_8_h12 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:04:49 np0005486759.ooo.test sudo[24004]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:49 np0005486759.ooo.test sudo[24034]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpiwgpppwbaewsmsvdiboinbkingsphn ; /usr/bin/python3
Oct 14 08:04:49 np0005486759.ooo.test sudo[24034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:50 np0005486759.ooo.test python3[24036]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:04:50 np0005486759.ooo.test sudo[24034]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:50 np0005486759.ooo.test sudo[24050]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-endxjemwgbijwfbdurhfaescqmwrrrbb ; /usr/bin/python3
Oct 14 08:04:50 np0005486759.ooo.test sudo[24050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:50 np0005486759.ooo.test python3[24052]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:04:50 np0005486759.ooo.test sudo[24050]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:50 np0005486759.ooo.test sudo[24066]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swcfheezvvnkgxtjixgxzffiinvsvage ; /usr/bin/python3
Oct 14 08:04:50 np0005486759.ooo.test sudo[24066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:50 np0005486759.ooo.test python3[24068]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:04:50 np0005486759.ooo.test sudo[24066]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:51 np0005486759.ooo.test sudo[24082]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nayrmcdfemitlwyqlhjwwsgcnvedcarl ; /usr/bin/python3
Oct 14 08:04:51 np0005486759.ooo.test sudo[24082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:04:51 np0005486759.ooo.test python3[24084]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpCkKHtdAJvqUoWfry6wT9BiEt8oegJcZFI/9galMv8ZYmo/NBcS3vjEuF9385qAETdPLU+rGztzEvbgOXTGalOiMOoN+F7ELwARQwPYS2b6JDoalDqgTJD2+XWrLKXsBBc4d7YOy0D+cJQ+YvlxXj73YP/7+B/cwxaWftnlTUXfyLIH79jw7oqPg1EpUSVIbSmItL2s/1CNxeNHq6AeV04V+vyKgfzdbglEGmnDHnNMnJYbkoYZs0GcsOCkKZV5fht0OYKRAfYo2a/CuQrfpt2iBcPznSWUllp59WlSF3mtiL9taksr5HpRpvMv9e5Rg1dYebt+6vi2OPhqCD/rqcYfmfhceMZ9qMpS6ffDt5NpHT7rvn0vBtHqb6PxQng5BvynCqAE8WGLej9EhoXfu7xiTuOWvdrrSynaQIM4JhvTCCBJmWHCoHV+70bsoqNNEd3ciEKNYqLWuCMksS9F9LTSoOpBhX4gYl+VaFGdH/WTKe0Ae2uUq0Cz/GmuiFVtE= zuul-build-sshkey
                                                       regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:04:51 np0005486759.ooo.test sudo[24082]: pam_unix(sudo:session): session closed for user root
Oct 14 08:04:51 np0005486759.ooo.test python3[24098]: ansible-ping Invoked with data=pong
Oct 14 08:04:59 np0005486759.ooo.test sshd[24100]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 08:04:59 np0005486759.ooo.test sshd[24100]: Accepted publickey for tripleo-admin from 192.168.122.100 port 39458 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 08:04:59 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 1002.
Oct 14 08:04:59 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/1002...
Oct 14 08:04:59 np0005486759.ooo.test systemd-logind[759]: New session 12 of user tripleo-admin.
Oct 14 08:04:59 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/1002.
Oct 14 08:04:59 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 1002...
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1002) by (uid=0)
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Queued start job for default target Main User Target.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Created slice User Application Slice.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Reached target Paths.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Reached target Timers.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Starting D-Bus User Message Bus Socket...
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Starting Create User's Volatile Files and Directories...
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Finished Create User's Volatile Files and Directories.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Listening on D-Bus User Message Bus Socket.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Reached target Sockets.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Reached target Basic System.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Reached target Main User Target.
Oct 14 08:04:59 np0005486759.ooo.test systemd[24104]: Startup finished in 85ms.
Oct 14 08:04:59 np0005486759.ooo.test systemd[1]: Started User Manager for UID 1002.
Oct 14 08:04:59 np0005486759.ooo.test systemd[1]: Started Session 12 of User tripleo-admin.
Oct 14 08:04:59 np0005486759.ooo.test sshd[24100]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1002) by (uid=0)
Oct 14 08:05:00 np0005486759.ooo.test sudo[24163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jttiyuoddkegnywluhsdeofsawikmnyn ; /usr/bin/python3
Oct 14 08:05:00 np0005486759.ooo.test sudo[24163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:00 np0005486759.ooo.test python3[24165]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 08:05:00 np0005486759.ooo.test sudo[24163]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:02 np0005486759.ooo.test sudo[24183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdzkgojbqpcwfvjeiimupyehigtuxejz ; /usr/bin/python3
Oct 14 08:05:02 np0005486759.ooo.test sudo[24183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:02 np0005486759.ooo.test python3[24185]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Oct 14 08:05:02 np0005486759.ooo.test sudo[24183]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:02 np0005486759.ooo.test sudo[24199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qufkaoxgenazusthbuuuvalezkxoraws ; /usr/bin/python3
Oct 14 08:05:02 np0005486759.ooo.test sudo[24199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:02 np0005486759.ooo.test python3[24201]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Oct 14 08:05:02 np0005486759.ooo.test sudo[24199]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:03 np0005486759.ooo.test sudo[24247]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgexpymlolbybxvvncrbxaewkmgczbvx ; /usr/bin/python3
Oct 14 08:05:03 np0005486759.ooo.test sudo[24247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:03 np0005486759.ooo.test python3[24249]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.oqcgz8r5tmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:05:03 np0005486759.ooo.test sudo[24247]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:03 np0005486759.ooo.test sudo[24277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tznromtvllwohkgclzwqveqsmjwjkwwg ; /usr/bin/python3
Oct 14 08:05:03 np0005486759.ooo.test sudo[24277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:03 np0005486759.ooo.test python3[24279]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.oqcgz8r5tmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:05:03 np0005486759.ooo.test sudo[24277]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:04 np0005486759.ooo.test sudo[24293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwhtcukcdrmxblisfoipalterffnvoco ; /usr/bin/python3
Oct 14 08:05:04 np0005486759.ooo.test sudo[24293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:04 np0005486759.ooo.test python3[24295]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.oqcgz8r5tmphosts insertbefore=BOF block=172.17.0.106 np0005486757.ooo.test np0005486757
                                                      172.18.0.106 np0005486757.storage.ooo.test np0005486757.storage
                                                      172.20.0.106 np0005486757.storagemgmt.ooo.test np0005486757.storagemgmt
                                                      172.17.0.106 np0005486757.internalapi.ooo.test np0005486757.internalapi
                                                      172.19.0.106 np0005486757.tenant.ooo.test np0005486757.tenant
                                                      192.168.122.106 np0005486757.ctlplane.ooo.test np0005486757.ctlplane
                                                      172.17.0.107 np0005486759.ooo.test np0005486759
                                                      172.18.0.107 np0005486759.storage.ooo.test np0005486759.storage
                                                      172.17.0.107 np0005486759.internalapi.ooo.test np0005486759.internalapi
                                                      172.19.0.107 np0005486759.tenant.ooo.test np0005486759.tenant
                                                      192.168.122.107 np0005486759.ctlplane.ooo.test np0005486759.ctlplane
                                                      
                                                      192.168.122.100 undercloud.ctlplane.ooo.test undercloud.ctlplane
                                                      192.168.122.93  multicell.ctlplane.localdomain
                                                      172.18.0.143  multicell.storage.localdomain
                                                      172.20.0.153  multicell.storagemgmt.localdomain
                                                      172.17.0.133  multicell.internalapi.localdomain
                                                      172.21.0.83  multicell.ooo.test
                                                      192.168.122.100 undercloud.ctlplane.ooo.test undercloud.ctlplane
                                                      192.168.122.90  multicell.ctlplane.localdomain
                                                      172.18.0.140  multicell.storage.localdomain
                                                      172.20.0.150  multicell.storagemgmt.localdomain
                                                      172.17.0.130  multicell.internalapi.localdomain
                                                      172.21.0.80  multicell.ooo.test
                                                      172.17.0.103 np0005486756.ooo.test np0005486756
                                                      172.18.0.103 np0005486756.storage.ooo.test np0005486756.storage
                                                      172.20.0.103 np0005486756.storagemgmt.ooo.test np0005486756.storagemgmt
                                                      172.17.0.103 np0005486756.internalapi.ooo.test np0005486756.internalapi
                                                      172.19.0.103 np0005486756.tenant.ooo.test np0005486756.tenant
                                                      192.168.122.103 np0005486756.ctlplane.ooo.test np0005486756.ctlplane
                                                       marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: cell1 marker_end=END_HOST_ENTRIES_FOR_STACK: cell1 state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:05:04 np0005486759.ooo.test sudo[24293]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:04 np0005486759.ooo.test sudo[24309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awqvcvdkxxmeeanwwccqkujavsyoysom ; /usr/bin/python3
Oct 14 08:05:04 np0005486759.ooo.test sudo[24309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:04 np0005486759.ooo.test python3[24311]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.oqcgz8r5tmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:05:04 np0005486759.ooo.test sudo[24309]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:05 np0005486759.ooo.test sudo[24326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiyphstesygjkxpypyztnmpvkzsnetyx ; /usr/bin/python3
Oct 14 08:05:05 np0005486759.ooo.test sudo[24326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:05 np0005486759.ooo.test python3[24328]: ansible-file Invoked with path=/tmp/ansible.oqcgz8r5tmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:05:05 np0005486759.ooo.test sudo[24326]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:05 np0005486759.ooo.test sudo[24342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zblfxnjoltffkjjhmswvcbxykkutmeev ; /usr/bin/python3
Oct 14 08:05:05 np0005486759.ooo.test sudo[24342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:06 np0005486759.ooo.test python3[24344]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:05:06 np0005486759.ooo.test sudo[24342]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:06 np0005486759.ooo.test sudo[24359]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egwubsdpbjnoqvgezhfgqphcfhxpokzv ; /usr/bin/python3
Oct 14 08:05:06 np0005486759.ooo.test sudo[24359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:06 np0005486759.ooo.test python3[24361]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:05:10 np0005486759.ooo.test sudo[24359]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:10 np0005486759.ooo.test systemd-journald[618]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 80.2 (267 of 333 items), suggesting rotation.
Oct 14 08:05:10 np0005486759.ooo.test systemd-journald[618]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 08:05:10 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 08:05:10 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 08:05:10 np0005486759.ooo.test sudo[24380]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttxntznsbdhcnbzvczhnorgslzgkqwqr ; /usr/bin/python3
Oct 14 08:05:10 np0005486759.ooo.test sudo[24380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:11 np0005486759.ooo.test python3[24382]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:05:11 np0005486759.ooo.test sudo[24380]: pam_unix(sudo:session): session closed for user root
Oct 14 08:05:11 np0005486759.ooo.test sudo[24397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llgojqczqarbjuztwezshcgnslpepblo ; /usr/bin/python3
Oct 14 08:05:11 np0005486759.ooo.test sudo[24397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:05:11 np0005486759.ooo.test python3[24399]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:05:26 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:05:26 np0005486759.ooo.test systemd-rc-local-generator[24600]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:05:26 np0005486759.ooo.test systemd-sysv-generator[24603]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:05:26 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:05:26 np0005486759.ooo.test systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 14 08:05:27 np0005486759.ooo.test groupadd[24618]: group added to /etc/group: name=puppet, GID=52
Oct 14 08:05:27 np0005486759.ooo.test groupadd[24618]: group added to /etc/gshadow: name=puppet
Oct 14 08:05:27 np0005486759.ooo.test groupadd[24618]: new group: name=puppet, GID=52
Oct 14 08:05:27 np0005486759.ooo.test useradd[24625]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Oct 14 08:05:34 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:05:34 np0005486759.ooo.test systemd-sysv-generator[24674]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:05:34 np0005486759.ooo.test systemd-rc-local-generator[24671]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:05:34 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:05:34 np0005486759.ooo.test systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 14 08:05:34 np0005486759.ooo.test systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 14 08:05:34 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:05:35 np0005486759.ooo.test systemd-rc-local-generator[24713]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:05:35 np0005486759.ooo.test systemd-sysv-generator[24717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:05:35 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:05:35 np0005486759.ooo.test systemd[1]: Listening on LVM2 poll daemon socket.
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  Converting 2687 SID table entries...
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 08:06:22 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 08:06:22 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 14 08:06:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:06:23 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 08:06:23 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:06:23 np0005486759.ooo.test systemd-rc-local-generator[25610]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:06:23 np0005486759.ooo.test systemd-sysv-generator[25614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:06:23 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:06:23 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 08:06:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:06:24 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 08:06:24 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 08:06:24 np0005486759.ooo.test systemd[1]: run-r7c0d14b917d549888639ce3b9a47882f.service: Deactivated successfully.
Oct 14 08:06:24 np0005486759.ooo.test systemd[1]: run-r22dd3fb636a346ed9ffefd4d7e2eb283.service: Deactivated successfully.
Oct 14 08:06:25 np0005486759.ooo.test sudo[24397]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:25 np0005486759.ooo.test sudo[26426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfgzzyyqotisdlvsceqwtmizmvytdxzt ; /usr/bin/python3
Oct 14 08:06:25 np0005486759.ooo.test sudo[26426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:25 np0005486759.ooo.test python3[26428]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:26 np0005486759.ooo.test sudo[26426]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:26 np0005486759.ooo.test sudo[26565]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwnjhaobhfmzdjacpergznwapeilmoxv ; /usr/bin/python3
Oct 14 08:06:26 np0005486759.ooo.test sudo[26565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:26 np0005486759.ooo.test python3[26567]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:06:26 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:06:27 np0005486759.ooo.test systemd-rc-local-generator[26594]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:06:27 np0005486759.ooo.test systemd-sysv-generator[26599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:06:27 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:06:27 np0005486759.ooo.test sudo[26565]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:27 np0005486759.ooo.test sudo[26619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udkakggpaveojlikyzrhqbyeoqnfygce ; /usr/bin/python3
Oct 14 08:06:27 np0005486759.ooo.test sudo[26619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:27 np0005486759.ooo.test python3[26621]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:27 np0005486759.ooo.test sudo[26619]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:27 np0005486759.ooo.test sudo[26635]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aehtlgfeibzihkqkzphtuteifjaxpnco ; /usr/bin/python3
Oct 14 08:06:27 np0005486759.ooo.test sudo[26635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:27 np0005486759.ooo.test python3[26637]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:27 np0005486759.ooo.test sudo[26635]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:28 np0005486759.ooo.test sudo[26652]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnxcjtbfmepaxtykeukzsqvtvzrbqmeb ; /usr/bin/python3
Oct 14 08:06:28 np0005486759.ooo.test sudo[26652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:28 np0005486759.ooo.test python3[26654]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 14 08:06:28 np0005486759.ooo.test sudo[26652]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:28 np0005486759.ooo.test sudo[26670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzhjrmjxjjfirxwcxosaelqvtgznbjot ; /usr/bin/python3
Oct 14 08:06:28 np0005486759.ooo.test sudo[26670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:28 np0005486759.ooo.test python3[26672]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:28 np0005486759.ooo.test sudo[26670]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:29 np0005486759.ooo.test sudo[26688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clfwnxplabopkjiaqwcisvtirkfbvspa ; /usr/bin/python3
Oct 14 08:06:29 np0005486759.ooo.test sudo[26688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:29 np0005486759.ooo.test python3[26690]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:29 np0005486759.ooo.test sudo[26688]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:29 np0005486759.ooo.test sudo[26706]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuxbvqaitiesajbjgsptghmvdmramttf ; /usr/bin/python3
Oct 14 08:06:29 np0005486759.ooo.test sudo[26706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:29 np0005486759.ooo.test python3[26708]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 08:06:29 np0005486759.ooo.test systemd[1]: Reloading Network Manager...
Oct 14 08:06:29 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760429189.7055] audit: op="reload" arg="0" pid=26711 uid=0 result="success"
Oct 14 08:06:29 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760429189.7069] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Oct 14 08:06:29 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760429189.7070] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Oct 14 08:06:29 np0005486759.ooo.test systemd[1]: Reloaded Network Manager.
Oct 14 08:06:29 np0005486759.ooo.test sudo[26706]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:29 np0005486759.ooo.test sudo[26725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nttfroytlkosbkhaqahyxjdchlxoeary ; /usr/bin/python3
Oct 14 08:06:29 np0005486759.ooo.test sudo[26725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:30 np0005486759.ooo.test python3[26727]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:30 np0005486759.ooo.test sudo[26725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:30 np0005486759.ooo.test sudo[26742]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqylqkueiejqktdslrfvanrfuzwaxkbf ; /usr/bin/python3
Oct 14 08:06:30 np0005486759.ooo.test sudo[26742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:30 np0005486759.ooo.test python3[26744]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:06:30 np0005486759.ooo.test sudo[26742]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:30 np0005486759.ooo.test sudo[26760]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vopwidyjjtqbfuohgfubuushsszjjgzi ; /usr/bin/python3
Oct 14 08:06:30 np0005486759.ooo.test sudo[26760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:30 np0005486759.ooo.test python3[26762]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:06:30 np0005486759.ooo.test sudo[26760]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:31 np0005486759.ooo.test sudo[26776]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffphlnbekeljfnhdwzehxfjgywegutde ; /usr/bin/python3
Oct 14 08:06:31 np0005486759.ooo.test sudo[26776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:31 np0005486759.ooo.test python3[26778]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:31 np0005486759.ooo.test sudo[26776]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:31 np0005486759.ooo.test sudo[26792]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yayuduzuobukcfjtoxjngqzanruvopfd ; /usr/bin/python3
Oct 14 08:06:31 np0005486759.ooo.test sudo[26792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:31 np0005486759.ooo.test python3[26794]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 14 08:06:31 np0005486759.ooo.test sudo[26792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:31 np0005486759.ooo.test sudo[26808]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgjlhzbkclghnzudyljwjtnzlqqcmoge ; /usr/bin/python3
Oct 14 08:06:31 np0005486759.ooo.test sudo[26808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:31 np0005486759.ooo.test python3[26810]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:06:31 np0005486759.ooo.test sudo[26808]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:32 np0005486759.ooo.test sudo[26824]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eekujnbkzjvkwolwwfxlurrdorkngxaa ; /usr/bin/python3
Oct 14 08:06:32 np0005486759.ooo.test sudo[26824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:32 np0005486759.ooo.test python3[26826]: ansible-blockinfile Invoked with path=/tmp/ansible.s24lsb7h block=[192.168.122.106]*,[np0005486757.ctlplane.ooo.test]*,[172.17.0.106]*,[np0005486757.internalapi.ooo.test]*,[172.18.0.106]*,[np0005486757.storage.ooo.test]*,[172.20.0.106]*,[np0005486757.storagemgmt.ooo.test]*,[172.19.0.106]*,[np0005486757.tenant.ooo.test]*,[np0005486757.ooo.test]*,[np0005486757]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0VhGV8FoQnbaKI/176M5x3j2rijfSthIuvXgIZQyN+bO8xdtIwHJxFYchLfi7VKvQlt2hlpDNQqmrkkBGFMdh+mYrfLux2+1ISx/5YmSxgKENJIYvwa5dwPlHNCXkyMnkztMY8SDWEebZVcpfgbJlWNxHm08y9FYJVazc/OxyIe6g8gLyO3e6z6kDzQ8lm9zr3i9/IGOjXU1t1ms88x+6tcXGpZ3zLxo4T1lr6rNHWv2sEL2p1acghyoohfN0ybbxAINeDtMP4MhnUB64jubYXR5wMMaVP4iMSYKviz4cjUVIE64OkXOUQYJU1w4Nq8BH5p7TQ9EXkwOvYMPptbuDUqpTTkHgOpGgg/DERVTeBuU20ogCm05L0XXK7Kzg77FHeFaLvZnwyVm5BiYAV8Qz061X378BiX7oBlTojMQXebFj/BhcXG1350YzF1+kJnBHRAmCPnLZA6/KMSG39qfrwH+wEyKMRmA8Y5fzl94xz6fHcR9GRzzpbab3mKSVP0M=
                                                      [192.168.122.107]*,[np0005486759.ctlplane.ooo.test]*,[172.17.0.107]*,[np0005486759.internalapi.ooo.test]*,[172.18.0.107]*,[np0005486759.storage.ooo.test]*,[172.19.0.107]*,[np0005486759.tenant.ooo.test]*,[np0005486759.ooo.test]*,[np0005486759]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDa7N/yf5kblytISC8YE3UN49GJulR+55hCVck3l5AqdE+beyJ+w8p1C78UnecqfQMxqRm33gN6DojHe1jClFu5yXaX4PkXntTMh9OVTmHf4h+I1VDHv24Pk9IVHv/+p005cD/6p0aUXc3UlKcftzByKVCQz0hQ8VWKbVAutMFA0CybLnUKZD6ev92/TcYBkjFVAGBFdYisqLFLXLZAhKw/Vi30rEZYweRPLcWAs1HsEM3B0H8fejbp0qbBeYxafRFhfnNgGhtfYu/qAj4DjOmpAwiVKEiaaCH39yKCMuGFhU+FNoKpxDsgv+pvy4XMhOrkv8r+dAydiNrrunuHMXW9w+x5ifTxJCnpbjXSsksH6btN3AnB0QnRZ3e+Go0fnivXr/F0oOZDUcziGRnyJAik4Ycd2T/Wy0SegD/VQJ6RTln2lEYQU5N6lDsWh/fs/Fo3/Xg4/g8TKAsjYuZZzPQ4IbpfE+oyhRaz6qpi2a98pmsVsbJhuFsddikOYK9BTV8=
                                                       create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:32 np0005486759.ooo.test sudo[26824]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:32 np0005486759.ooo.test sudo[26840]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omgzufsmurcedtustixrjqdtnwtlulhe ; /usr/bin/python3
Oct 14 08:06:32 np0005486759.ooo.test sudo[26840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:32 np0005486759.ooo.test python3[26842]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.s24lsb7h' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:32 np0005486759.ooo.test sudo[26840]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:33 np0005486759.ooo.test sudo[26858]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrpnsixkzelwdfbmsffvtdjrujbifjrt ; /usr/bin/python3
Oct 14 08:06:33 np0005486759.ooo.test sudo[26858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:33 np0005486759.ooo.test python3[26860]: ansible-file Invoked with path=/tmp/ansible.s24lsb7h state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:33 np0005486759.ooo.test sudo[26858]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:33 np0005486759.ooo.test sudo[26874]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hykmztpnmebujekdgjjlbpejbmaqjows ; /usr/bin/python3
Oct 14 08:06:33 np0005486759.ooo.test sudo[26874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:33 np0005486759.ooo.test python3[26876]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:06:33 np0005486759.ooo.test sudo[26874]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:33 np0005486759.ooo.test sudo[26890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlrzxbiznkwfrmofeitqvctlduykjeyi ; /usr/bin/python3
Oct 14 08:06:33 np0005486759.ooo.test sudo[26890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:34 np0005486759.ooo.test python3[26892]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:34 np0005486759.ooo.test sudo[26890]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:34 np0005486759.ooo.test sudo[26908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksplxldusaypybnwsxzxntyvnbwfdyqp ; /usr/bin/python3
Oct 14 08:06:34 np0005486759.ooo.test sudo[26908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:34 np0005486759.ooo.test python3[26910]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:34 np0005486759.ooo.test sudo[26908]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:34 np0005486759.ooo.test sudo[26927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgmyyfdubjjgiuyakcxtwacoabkrmhbm ; /usr/bin/python3
Oct 14 08:06:34 np0005486759.ooo.test sudo[26927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:34 np0005486759.ooo.test python3[26929]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Oct 14 08:06:34 np0005486759.ooo.test sudo[26927]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:34 np0005486759.ooo.test sudo[26943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swvcgfzogemjqtutbudoagcrgmizuucn ; /usr/bin/python3
Oct 14 08:06:34 np0005486759.ooo.test sudo[26943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:35 np0005486759.ooo.test sudo[26943]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:35 np0005486759.ooo.test sudo[26991]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwaybqjyhtunzozutylzgfsiszpbajwk ; /usr/bin/python3
Oct 14 08:06:35 np0005486759.ooo.test sudo[26991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:35 np0005486759.ooo.test sudo[26991]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:35 np0005486759.ooo.test sudo[27034]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pazpeooptudugufqpjcmvuvhdxaoknjb ; /usr/bin/python3
Oct 14 08:06:35 np0005486759.ooo.test sudo[27034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:35 np0005486759.ooo.test sudo[27034]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:36 np0005486759.ooo.test sudo[27064]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwtrcjhkbnjvgrqoigxsicmsoljriftl ; /usr/bin/python3
Oct 14 08:06:36 np0005486759.ooo.test sudo[27064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:36 np0005486759.ooo.test python3[27066]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:36 np0005486759.ooo.test sudo[27064]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:36 np0005486759.ooo.test sudo[27081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khggvaposmfozmnvruiujvtxtixvnfix ; /usr/bin/python3
Oct 14 08:06:36 np0005486759.ooo.test sudo[27081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:37 np0005486759.ooo.test python3[27083]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:06:40 np0005486759.ooo.test dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Oct 14 08:06:40 np0005486759.ooo.test dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:06:40 np0005486759.ooo.test systemd-sysv-generator[27131]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:06:40 np0005486759.ooo.test systemd-rc-local-generator[27127]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: tuned.service: Deactivated successfully.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: tuned.service: Consumed 1.957s CPU time.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 08:06:40 np0005486759.ooo.test systemd[1]: run-rcdb959866b4249dc9558e20ac83791e4.service: Deactivated successfully.
Oct 14 08:06:42 np0005486759.ooo.test systemd[1]: Started Dynamic System Tuning Daemon.
Oct 14 08:06:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:06:42 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 08:06:42 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 08:06:42 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 08:06:42 np0005486759.ooo.test systemd[1]: run-r30311fb4832744deb87a4b263d624548.service: Deactivated successfully.
Oct 14 08:06:42 np0005486759.ooo.test sudo[27081]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:43 np0005486759.ooo.test sudo[27518]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbvqzokfushiioymrsbqijluemwxjeux ; /usr/bin/python3
Oct 14 08:06:43 np0005486759.ooo.test sudo[27518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:43 np0005486759.ooo.test python3[27520]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:06:43 np0005486759.ooo.test systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 14 08:06:43 np0005486759.ooo.test systemd[1]: tuned.service: Deactivated successfully.
Oct 14 08:06:43 np0005486759.ooo.test systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 14 08:06:43 np0005486759.ooo.test systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 14 08:06:44 np0005486759.ooo.test systemd[1]: Started Dynamic System Tuning Daemon.
Oct 14 08:06:44 np0005486759.ooo.test sudo[27518]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:44 np0005486759.ooo.test sudo[27713]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uotoljtgkyeeufccxspcujtiwgwdvrqm ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Oct 14 08:06:44 np0005486759.ooo.test sudo[27713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:44 np0005486759.ooo.test python3[27715]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:44 np0005486759.ooo.test sudo[27713]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:45 np0005486759.ooo.test sudo[27730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wewiywyawtuyragamjkqccemtwywhrgo ; /usr/bin/python3
Oct 14 08:06:45 np0005486759.ooo.test sudo[27730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:45 np0005486759.ooo.test python3[27732]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Oct 14 08:06:45 np0005486759.ooo.test sudo[27730]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:45 np0005486759.ooo.test sudo[27746]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iylbyrsyuykxrvorwuyxpxrjdfpfhvqf ; /usr/bin/python3
Oct 14 08:06:45 np0005486759.ooo.test sudo[27746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:45 np0005486759.ooo.test python3[27748]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:06:46 np0005486759.ooo.test sudo[27746]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:46 np0005486759.ooo.test sudo[27762]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imjujkykiehylhezoadpinsnmyunipzq ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Oct 14 08:06:46 np0005486759.ooo.test sudo[27762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:46 np0005486759.ooo.test python3[27764]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:47 np0005486759.ooo.test sudo[27762]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:48 np0005486759.ooo.test sudo[27782]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlrodpidfzbcjaqftlnhtyfvysooxpox ; /usr/bin/python3
Oct 14 08:06:48 np0005486759.ooo.test sudo[27782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:48 np0005486759.ooo.test python3[27784]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:06:48 np0005486759.ooo.test sudo[27782]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:48 np0005486759.ooo.test sudo[27799]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrvpbdtphelagvrildqlgxilytmniueo ; /usr/bin/python3
Oct 14 08:06:48 np0005486759.ooo.test sudo[27799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:48 np0005486759.ooo.test python3[27801]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:06:48 np0005486759.ooo.test sudo[27799]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:49 np0005486759.ooo.test sudo[27815]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oblsgefqbmjtsuxogdaemfzpmxvqujsr ; /usr/bin/python3
Oct 14 08:06:49 np0005486759.ooo.test sudo[27815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:49 np0005486759.ooo.test python3[27817]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:49 np0005486759.ooo.test sudo[27815]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:51 np0005486759.ooo.test sudo[27831]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unriifcytmvwuvvguckcwraavhbcduni ; /usr/bin/python3
Oct 14 08:06:51 np0005486759.ooo.test sudo[27831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:51 np0005486759.ooo.test python3[27833]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:51 np0005486759.ooo.test sudo[27831]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:52 np0005486759.ooo.test sudo[27879]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfoqjwshwrzsavrjjdhskgxdxdpvosbx ; /usr/bin/python3
Oct 14 08:06:52 np0005486759.ooo.test sudo[27879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:52 np0005486759.ooo.test python3[27881]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:52 np0005486759.ooo.test sudo[27879]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:52 np0005486759.ooo.test sudo[27924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhdkrfccdrlrvyszziibavfgsefcnutd ; /usr/bin/python3
Oct 14 08:06:52 np0005486759.ooo.test sudo[27924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:52 np0005486759.ooo.test python3[27926]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429211.9860237-97061-91800119323063/source _original_basename=tmpqgq95bjf follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:52 np0005486759.ooo.test sudo[27924]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:52 np0005486759.ooo.test sudo[27954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjanmyeupidswlinmnxmgcsktyfftncz ; /usr/bin/python3
Oct 14 08:06:52 np0005486759.ooo.test sudo[27954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:52 np0005486759.ooo.test python3[27956]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:52 np0005486759.ooo.test sudo[27954]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:53 np0005486759.ooo.test sudo[28002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqrgvwpmaoonvtlxigsybfptpynbjgyj ; /usr/bin/python3
Oct 14 08:06:53 np0005486759.ooo.test sudo[28002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:53 np0005486759.ooo.test python3[28004]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:53 np0005486759.ooo.test sudo[28002]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:53 np0005486759.ooo.test sudo[28045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulebqjgctcedyzmctatwqeeupcovollc ; /usr/bin/python3
Oct 14 08:06:53 np0005486759.ooo.test sudo[28045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:53 np0005486759.ooo.test python3[28047]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429213.2395267-97092-1348442864451/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=e475244bbda3d4ce35023a2f55cbf871a7a022b2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:53 np0005486759.ooo.test sudo[28045]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:54 np0005486759.ooo.test sudo[28107]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzlppykqmrxxjbrwmkxzrhvhmfydqxlt ; /usr/bin/python3
Oct 14 08:06:54 np0005486759.ooo.test sudo[28107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:54 np0005486759.ooo.test python3[28109]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:54 np0005486759.ooo.test sudo[28107]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:54 np0005486759.ooo.test sudo[28150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqtpnndlvkdimbannvnaswoofznguxsd ; /usr/bin/python3
Oct 14 08:06:54 np0005486759.ooo.test sudo[28150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:54 np0005486759.ooo.test python3[28152]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429214.0631895-97112-210740647127318/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=b2bce0fe6be086f06de600d39b1d5d6aa52394ac backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:54 np0005486759.ooo.test sudo[28150]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:54 np0005486759.ooo.test sudo[28212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgicfxzaniyrnbespndlsioqlzsdyvum ; /usr/bin/python3
Oct 14 08:06:54 np0005486759.ooo.test sudo[28212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:55 np0005486759.ooo.test python3[28214]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:55 np0005486759.ooo.test sudo[28212]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:55 np0005486759.ooo.test sudo[28255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alcaupjvlzwkereblsuhvxlbgrutfmna ; /usr/bin/python3
Oct 14 08:06:55 np0005486759.ooo.test sudo[28255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:55 np0005486759.ooo.test python3[28257]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429214.7901402-97112-109649186337053/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=25760107445b8a62fac92e15b3e778990291aae9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:55 np0005486759.ooo.test sudo[28255]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:55 np0005486759.ooo.test sudo[28317]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndjnjnlobpfnvyvgvlddzxdsghjuhpvm ; /usr/bin/python3
Oct 14 08:06:55 np0005486759.ooo.test sudo[28317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:56 np0005486759.ooo.test python3[28319]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:56 np0005486759.ooo.test sudo[28317]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:56 np0005486759.ooo.test sudo[28360]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hibvmitqvassbmvqxucybmfxmzddvfpi ; /usr/bin/python3
Oct 14 08:06:56 np0005486759.ooo.test sudo[28360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:56 np0005486759.ooo.test python3[28362]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429215.571599-97112-219190546046111/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=f4bda8bac75a360674eb550524a2cc6c36b23141 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:56 np0005486759.ooo.test sudo[28360]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:56 np0005486759.ooo.test sudo[28422]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swuvfijboejpoaabcksxileyhtnwssuu ; /usr/bin/python3
Oct 14 08:06:56 np0005486759.ooo.test sudo[28422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:56 np0005486759.ooo.test python3[28424]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:56 np0005486759.ooo.test sudo[28422]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:57 np0005486759.ooo.test sudo[28465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-badblgvagcuynlpanqszcdtngmcmuzms ; /usr/bin/python3
Oct 14 08:06:57 np0005486759.ooo.test sudo[28465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:57 np0005486759.ooo.test python3[28467]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429216.5277853-97112-79079234419460/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=64f5f309f5137b9e0913cbf22857157ecfa0f1f1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:57 np0005486759.ooo.test sudo[28465]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:57 np0005486759.ooo.test sudo[28527]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-noejxrldtibaehjgqqlakfktiltilrvi ; /usr/bin/python3
Oct 14 08:06:57 np0005486759.ooo.test sudo[28527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:57 np0005486759.ooo.test python3[28529]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:57 np0005486759.ooo.test sudo[28527]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:57 np0005486759.ooo.test sudo[28570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzgekbttgripusgymojlhtgkvjokbveb ; /usr/bin/python3
Oct 14 08:06:57 np0005486759.ooo.test sudo[28570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:58 np0005486759.ooo.test python3[28572]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429217.318918-97112-261280908306774/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=5133fd4d396b1f335eb51c3c707ca99cdd87c103 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:58 np0005486759.ooo.test sudo[28570]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:58 np0005486759.ooo.test sudo[28632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrapyicmjtbqcauboilnnlrzrafskqto ; /usr/bin/python3
Oct 14 08:06:58 np0005486759.ooo.test sudo[28632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:58 np0005486759.ooo.test python3[28634]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:58 np0005486759.ooo.test sudo[28632]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:58 np0005486759.ooo.test sudo[28675]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpnsdfcwxtzlirqpehwgdetudxjmlqoz ; /usr/bin/python3
Oct 14 08:06:58 np0005486759.ooo.test sudo[28675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:58 np0005486759.ooo.test python3[28677]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429218.1960204-97112-202322386958558/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=3cd9dbc8dc13e6f17ecd828ca2d26939279148b0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:58 np0005486759.ooo.test sudo[28675]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:59 np0005486759.ooo.test sudo[28737]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-visfoblbmnubntuqpcrzrekgdcvfrqwz ; /usr/bin/python3
Oct 14 08:06:59 np0005486759.ooo.test sudo[28737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:59 np0005486759.ooo.test python3[28739]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:06:59 np0005486759.ooo.test sudo[28737]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:59 np0005486759.ooo.test sudo[28780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adzhiymrltxrdemakhakfiolhdghweys ; /usr/bin/python3
Oct 14 08:06:59 np0005486759.ooo.test sudo[28780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:06:59 np0005486759.ooo.test python3[28782]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429218.946865-97112-157849762422697/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=59c3a8ac92da824e6385ffa13face2ba61dae8c1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:06:59 np0005486759.ooo.test sudo[28780]: pam_unix(sudo:session): session closed for user root
Oct 14 08:06:59 np0005486759.ooo.test sudo[28842]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnyqylregecaaohlzqsumudxgodulbjp ; /usr/bin/python3
Oct 14 08:06:59 np0005486759.ooo.test sudo[28842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:00 np0005486759.ooo.test python3[28844]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:00 np0005486759.ooo.test sudo[28842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:00 np0005486759.ooo.test sudo[28885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqhmwjddkbfbzgpnrwmfclostddduoql ; /usr/bin/python3
Oct 14 08:07:00 np0005486759.ooo.test sudo[28885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:00 np0005486759.ooo.test python3[28887]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429219.8151312-97112-6149623459318/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:00 np0005486759.ooo.test sudo[28885]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:00 np0005486759.ooo.test sudo[28947]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uktsyfkmtskgzksiuvfsypgblwsdfvip ; /usr/bin/python3
Oct 14 08:07:00 np0005486759.ooo.test sudo[28947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:00 np0005486759.ooo.test python3[28949]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:00 np0005486759.ooo.test sudo[28947]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:01 np0005486759.ooo.test sudo[28990]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xavifppwywzruqpqzfzucbvpftophntu ; /usr/bin/python3
Oct 14 08:07:01 np0005486759.ooo.test sudo[28990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:01 np0005486759.ooo.test python3[28992]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429220.6150315-97112-129851811979619/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=917f40b103323a95b937170e6d6c53e5ae5aafec backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:01 np0005486759.ooo.test sudo[28990]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:01 np0005486759.ooo.test sudo[29052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zybxkzwumxkhyryksdcfkhmdneemrsko ; /usr/bin/python3
Oct 14 08:07:01 np0005486759.ooo.test sudo[29052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:01 np0005486759.ooo.test python3[29054]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:01 np0005486759.ooo.test sudo[29052]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:02 np0005486759.ooo.test sudo[29095]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyoiukuinkfsmlpdgfvmsfmnauracadt ; /usr/bin/python3
Oct 14 08:07:02 np0005486759.ooo.test sudo[29095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:02 np0005486759.ooo.test python3[29097]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429221.4854028-97112-53032195028028/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=2bd6d289bd2227bf3e29c0d4c040e22768b16208 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:02 np0005486759.ooo.test sudo[29095]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:02 np0005486759.ooo.test sudo[29125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vaimromcusitrnmziikrmlnrilvrvekz ; /usr/bin/python3
Oct 14 08:07:02 np0005486759.ooo.test sudo[29125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:02 np0005486759.ooo.test python3[29127]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:07:02 np0005486759.ooo.test sudo[29125]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:03 np0005486759.ooo.test sudo[29173]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgvpobbhzdobcvqhqiwwqhurxbmchkqs ; /usr/bin/python3
Oct 14 08:07:03 np0005486759.ooo.test sudo[29173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:03 np0005486759.ooo.test python3[29175]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:03 np0005486759.ooo.test sudo[29173]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:03 np0005486759.ooo.test sudo[29216]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekcagslhuaqhylcggxcayqpsgvlclhue ; /usr/bin/python3
Oct 14 08:07:03 np0005486759.ooo.test sudo[29216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:03 np0005486759.ooo.test python3[29218]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429222.9300525-97558-89499765478483/source _original_basename=tmpo1nq021i follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:03 np0005486759.ooo.test sudo[29216]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:05 np0005486759.ooo.test sudo[29246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmnzltaknbwomokdagoilxavxjkarniu ; /usr/bin/python3
Oct 14 08:07:05 np0005486759.ooo.test sudo[29246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:05 np0005486759.ooo.test python3[29248]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 08:07:05 np0005486759.ooo.test sudo[29246]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:05 np0005486759.ooo.test sudo[29307]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhmwimvfrhuwmasxmruqnqgxralxyejo ; /usr/bin/python3
Oct 14 08:07:05 np0005486759.ooo.test sudo[29307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:05 np0005486759.ooo.test python3[29309]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:10 np0005486759.ooo.test sudo[29307]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:10 np0005486759.ooo.test sudo[29324]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdjebswebcfqovowrnwspgwodcvisukf ; /usr/bin/python3
Oct 14 08:07:10 np0005486759.ooo.test sudo[29324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:10 np0005486759.ooo.test python3[29326]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:14 np0005486759.ooo.test systemd[24104]: Starting Mark boot as successful...
Oct 14 08:07:14 np0005486759.ooo.test systemd[24104]: Finished Mark boot as successful.
Oct 14 08:07:14 np0005486759.ooo.test sudo[29324]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:14 np0005486759.ooo.test sudo[29342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asuygzogrsiwfuzfvhpqzboqeujqtyfe ; /usr/bin/python3
Oct 14 08:07:14 np0005486759.ooo.test sudo[29342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:15 np0005486759.ooo.test python3[29344]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.107 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                      MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                      echo "$INT $MTU"
                                                       _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:15 np0005486759.ooo.test sudo[29342]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:15 np0005486759.ooo.test sudo[29365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pscabqjkkgeguoryizernjlstpgohdzf ; /usr/bin/python3
Oct 14 08:07:15 np0005486759.ooo.test sudo[29365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:15 np0005486759.ooo.test python3[29367]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.107 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                      MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                      echo "$INT $MTU"
                                                       _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:15 np0005486759.ooo.test sudo[29365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:16 np0005486759.ooo.test sudo[29388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suafozjmrescnfoukmyfauqxhrskjabq ; /usr/bin/python3
Oct 14 08:07:16 np0005486759.ooo.test sudo[29388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:16 np0005486759.ooo.test python3[29390]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.107 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                      MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                      echo "$INT $MTU"
                                                       _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:16 np0005486759.ooo.test sudo[29388]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:16 np0005486759.ooo.test sudo[29411]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsqmuvjofafzibrhudnxoaryzgeeicxk ; /usr/bin/python3
Oct 14 08:07:16 np0005486759.ooo.test sudo[29411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:16 np0005486759.ooo.test python3[29413]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.107 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                      MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                      echo "$INT $MTU"
                                                       _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:16 np0005486759.ooo.test sudo[29411]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:18 np0005486759.ooo.test sudo[29434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueabogwjdubpqmdavtqohghsyhawsiry ; /usr/bin/python3
Oct 14 08:07:18 np0005486759.ooo.test sudo[29434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:18 np0005486759.ooo.test python3[29436]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:18 np0005486759.ooo.test sudo[29434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:18 np0005486759.ooo.test sudo[29482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-junkbmvemsntyjzthuzzndrobqcmvewk ; /usr/bin/python3
Oct 14 08:07:18 np0005486759.ooo.test sudo[29482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:18 np0005486759.ooo.test python3[29484]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:18 np0005486759.ooo.test sudo[29482]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:18 np0005486759.ooo.test sudo[29500]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eopxptlrprtdsmqcltivbvpvsqckwlor ; /usr/bin/python3
Oct 14 08:07:18 np0005486759.ooo.test sudo[29500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:18 np0005486759.ooo.test python3[29502]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpj1y5vw8e recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:18 np0005486759.ooo.test sudo[29500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:19 np0005486759.ooo.test sudo[29530]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxsvhdgridoctqywensbxfqxrhyomgzj ; /usr/bin/python3
Oct 14 08:07:19 np0005486759.ooo.test sudo[29530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:19 np0005486759.ooo.test python3[29532]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:19 np0005486759.ooo.test sudo[29530]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:19 np0005486759.ooo.test sudo[29578]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpkgfrdnqsuhkjujcsngouevtjevnhae ; /usr/bin/python3
Oct 14 08:07:19 np0005486759.ooo.test sudo[29578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:19 np0005486759.ooo.test python3[29580]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:19 np0005486759.ooo.test sudo[29578]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:19 np0005486759.ooo.test sudo[29596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uilgjjawumuekcrmjhnugjsusklwseph ; /usr/bin/python3
Oct 14 08:07:19 np0005486759.ooo.test sudo[29596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:20 np0005486759.ooo.test python3[29598]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:20 np0005486759.ooo.test sudo[29596]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:20 np0005486759.ooo.test sudo[29658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxzluvdvzvocpkpuajdjecwcbtaxiwmo ; /usr/bin/python3
Oct 14 08:07:20 np0005486759.ooo.test sudo[29658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:20 np0005486759.ooo.test python3[29660]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:20 np0005486759.ooo.test sudo[29658]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:20 np0005486759.ooo.test sudo[29676]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abaveyqpjncrvtfbqbhivuzejgmwxgfz ; /usr/bin/python3
Oct 14 08:07:20 np0005486759.ooo.test sudo[29676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:20 np0005486759.ooo.test python3[29678]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:20 np0005486759.ooo.test sudo[29676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:21 np0005486759.ooo.test sudo[29738]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejseitefcednhkabsjexqsitamwbqtuz ; /usr/bin/python3
Oct 14 08:07:21 np0005486759.ooo.test sudo[29738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:21 np0005486759.ooo.test python3[29740]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:21 np0005486759.ooo.test sudo[29738]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:21 np0005486759.ooo.test sudo[29756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zisrgvkcsrwigtkdmewyxncpfunswrss ; /usr/bin/python3
Oct 14 08:07:21 np0005486759.ooo.test sudo[29756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:21 np0005486759.ooo.test python3[29758]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:21 np0005486759.ooo.test sudo[29756]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:22 np0005486759.ooo.test sudo[29818]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdkpnekczphohhabunwysrbtztlhitmb ; /usr/bin/python3
Oct 14 08:07:22 np0005486759.ooo.test sudo[29818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:22 np0005486759.ooo.test python3[29820]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:22 np0005486759.ooo.test sudo[29818]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:22 np0005486759.ooo.test sudo[29836]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzonzhmtluzwnnchhxcslifzmexqminn ; /usr/bin/python3
Oct 14 08:07:22 np0005486759.ooo.test sudo[29836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:22 np0005486759.ooo.test python3[29838]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:22 np0005486759.ooo.test sudo[29836]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:22 np0005486759.ooo.test sudo[29898]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-madngvahjyoesazdomwpavxjwywmtcby ; /usr/bin/python3
Oct 14 08:07:22 np0005486759.ooo.test sudo[29898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:22 np0005486759.ooo.test python3[29900]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:22 np0005486759.ooo.test sudo[29898]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:22 np0005486759.ooo.test sudo[29916]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wglvcunclkjjzakfvgwcesqedvzkhrmm ; /usr/bin/python3
Oct 14 08:07:22 np0005486759.ooo.test sudo[29916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:23 np0005486759.ooo.test python3[29918]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:23 np0005486759.ooo.test sudo[29916]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:23 np0005486759.ooo.test sudo[29978]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toocgzwahjcmyrxeflqwfueshxglyaod ; /usr/bin/python3
Oct 14 08:07:23 np0005486759.ooo.test sudo[29978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:23 np0005486759.ooo.test python3[29980]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:23 np0005486759.ooo.test sudo[29978]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:23 np0005486759.ooo.test sudo[29996]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-madkaoztvtgkzzhmrolxofplhjcgwahz ; /usr/bin/python3
Oct 14 08:07:23 np0005486759.ooo.test sudo[29996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:23 np0005486759.ooo.test python3[29998]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:23 np0005486759.ooo.test sudo[29996]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:24 np0005486759.ooo.test sudo[30058]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpwsrpeptaekbojpkdplvsgcbwtcmvsy ; /usr/bin/python3
Oct 14 08:07:24 np0005486759.ooo.test sudo[30058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:24 np0005486759.ooo.test python3[30060]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:24 np0005486759.ooo.test sudo[30058]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:24 np0005486759.ooo.test sudo[30076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqmeuwpieizfnikubjuufcudtsfthxcf ; /usr/bin/python3
Oct 14 08:07:24 np0005486759.ooo.test sudo[30076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:24 np0005486759.ooo.test python3[30078]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:24 np0005486759.ooo.test sudo[30076]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:25 np0005486759.ooo.test sudo[30138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygkwwzdigrvlqoskpxandctibzrwlfwa ; /usr/bin/python3
Oct 14 08:07:25 np0005486759.ooo.test sudo[30138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:25 np0005486759.ooo.test python3[30140]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:25 np0005486759.ooo.test sudo[30138]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:25 np0005486759.ooo.test sudo[30156]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfjcialphbcbqljevedgyitymaysqemr ; /usr/bin/python3
Oct 14 08:07:25 np0005486759.ooo.test sudo[30156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:25 np0005486759.ooo.test python3[30158]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:25 np0005486759.ooo.test sudo[30156]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:25 np0005486759.ooo.test sudo[30218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dktlofjdsxuhhfarlcwaajsbaubihplu ; /usr/bin/python3
Oct 14 08:07:25 np0005486759.ooo.test sudo[30218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:26 np0005486759.ooo.test python3[30220]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:26 np0005486759.ooo.test sudo[30218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:26 np0005486759.ooo.test sudo[30236]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikgkfizbkzlblfccfjmtckuoaegecxon ; /usr/bin/python3
Oct 14 08:07:26 np0005486759.ooo.test sudo[30236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:26 np0005486759.ooo.test python3[30238]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:26 np0005486759.ooo.test sudo[30236]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:26 np0005486759.ooo.test sudo[30298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emprxpxrmgbloktkvyuavkejtyqegeae ; /usr/bin/python3
Oct 14 08:07:26 np0005486759.ooo.test sudo[30298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:26 np0005486759.ooo.test python3[30300]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:26 np0005486759.ooo.test sudo[30298]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:26 np0005486759.ooo.test sudo[30316]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrzrvjazpwkustnxayuljnmjwhopnppl ; /usr/bin/python3
Oct 14 08:07:26 np0005486759.ooo.test sudo[30316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:27 np0005486759.ooo.test python3[30318]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:27 np0005486759.ooo.test sudo[30316]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:27 np0005486759.ooo.test sudo[30378]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bogfrdnqxwiaeyogzlvxfubjgxnzpccp ; /usr/bin/python3
Oct 14 08:07:27 np0005486759.ooo.test sudo[30378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:27 np0005486759.ooo.test python3[30380]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:27 np0005486759.ooo.test sudo[30378]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:27 np0005486759.ooo.test sudo[30396]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hofkzkeavajdlgbkgbkgcuipqlwzceso ; /usr/bin/python3
Oct 14 08:07:27 np0005486759.ooo.test sudo[30396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:27 np0005486759.ooo.test python3[30398]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:27 np0005486759.ooo.test sudo[30396]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:28 np0005486759.ooo.test sudo[30426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ievhoowvjrljatuzrvgiyjlxjrwrhcjb ; /usr/bin/python3
Oct 14 08:07:28 np0005486759.ooo.test sudo[30426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:28 np0005486759.ooo.test python3[30428]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:07:28 np0005486759.ooo.test sudo[30426]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:28 np0005486759.ooo.test sudo[30474]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adiansntmtkxyduupzzrmxknilbnlrpg ; /usr/bin/python3
Oct 14 08:07:28 np0005486759.ooo.test sudo[30474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:28 np0005486759.ooo.test python3[30476]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:28 np0005486759.ooo.test sudo[30474]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:28 np0005486759.ooo.test sudo[30492]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apmdarprstatczostzlnhesbvbrfqrzj ; /usr/bin/python3
Oct 14 08:07:28 np0005486759.ooo.test sudo[30492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:29 np0005486759.ooo.test python3[30494]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpyt67bw81 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:29 np0005486759.ooo.test sudo[30492]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:30 np0005486759.ooo.test sudo[30522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbzysmfxlyvtqwulejrvxzchyawheyhd ; /usr/bin/python3
Oct 14 08:07:30 np0005486759.ooo.test sudo[30522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:30 np0005486759.ooo.test python3[30524]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:07:33 np0005486759.ooo.test sudo[30522]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:33 np0005486759.ooo.test sudo[30539]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plmtvgqtfnnficybjrwwbsxdugnewaer ; /usr/bin/python3
Oct 14 08:07:33 np0005486759.ooo.test sudo[30539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:34 np0005486759.ooo.test python3[30541]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:07:34 np0005486759.ooo.test sudo[30539]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:34 np0005486759.ooo.test sudo[30557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-saabuopylvieepsddasbfjlxbzmfgnwt ; /usr/bin/python3
Oct 14 08:07:34 np0005486759.ooo.test sudo[30557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:34 np0005486759.ooo.test python3[30559]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:07:34 np0005486759.ooo.test sudo[30557]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:34 np0005486759.ooo.test sudo[30575]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nomnjtueyytnqiydundtjrlvzlemezhy ; /usr/bin/python3
Oct 14 08:07:34 np0005486759.ooo.test sudo[30575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:34 np0005486759.ooo.test python3[30577]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:07:34 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:07:35 np0005486759.ooo.test systemd-sysv-generator[30608]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:07:35 np0005486759.ooo.test systemd-rc-local-generator[30603]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:07:35 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:07:35 np0005486759.ooo.test systemd[1]: Starting Netfilter Tables...
Oct 14 08:07:35 np0005486759.ooo.test systemd[1]: Finished Netfilter Tables.
Oct 14 08:07:35 np0005486759.ooo.test sudo[30575]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:35 np0005486759.ooo.test sudo[30665]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwejcbfezbqlbycgxthizgizxyrbusiw ; /usr/bin/python3
Oct 14 08:07:35 np0005486759.ooo.test sudo[30665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:35 np0005486759.ooo.test python3[30667]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:35 np0005486759.ooo.test sudo[30665]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:36 np0005486759.ooo.test sudo[30708]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjqrscbeftrvyfkkzydsugwfualcrmwi ; /usr/bin/python3
Oct 14 08:07:36 np0005486759.ooo.test sudo[30708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:36 np0005486759.ooo.test python3[30710]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429255.646395-98525-252495660721527/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:36 np0005486759.ooo.test sudo[30708]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:36 np0005486759.ooo.test sudo[30738]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owbsojbspnvloojoxrakwjnpolrtnyiq ; /usr/bin/python3
Oct 14 08:07:36 np0005486759.ooo.test sudo[30738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:36 np0005486759.ooo.test python3[30740]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:36 np0005486759.ooo.test sudo[30738]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:37 np0005486759.ooo.test sudo[30756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knuxytuvvbbqupucpvivlbvscdibgolo ; /usr/bin/python3
Oct 14 08:07:37 np0005486759.ooo.test sudo[30756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:37 np0005486759.ooo.test python3[30758]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:37 np0005486759.ooo.test sudo[30756]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:37 np0005486759.ooo.test sudo[30805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juaxlzmwrmxmdnqcnogzvuayqvactlyb ; /usr/bin/python3
Oct 14 08:07:37 np0005486759.ooo.test sudo[30805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:37 np0005486759.ooo.test python3[30807]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:37 np0005486759.ooo.test sudo[30805]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:38 np0005486759.ooo.test sudo[30848]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiyozovzoebglwrovaknorzfckrttjyh ; /usr/bin/python3
Oct 14 08:07:38 np0005486759.ooo.test sudo[30848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:38 np0005486759.ooo.test python3[30850]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429257.4643853-98564-94635527691053/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:38 np0005486759.ooo.test sudo[30848]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:38 np0005486759.ooo.test sudo[30910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svozxmyhcmvpjrqtgtqnbvamhmjklbat ; /usr/bin/python3
Oct 14 08:07:38 np0005486759.ooo.test sudo[30910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:38 np0005486759.ooo.test python3[30912]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:38 np0005486759.ooo.test sudo[30910]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:39 np0005486759.ooo.test sudo[30953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-savxzcyntnouyppksarabuqhmzlfyprw ; /usr/bin/python3
Oct 14 08:07:39 np0005486759.ooo.test sudo[30953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:39 np0005486759.ooo.test python3[30955]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429258.4406512-98586-110724550645083/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:39 np0005486759.ooo.test sudo[30953]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:39 np0005486759.ooo.test sudo[31015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmwiigejoccogwnqfegmzioistaxwsea ; /usr/bin/python3
Oct 14 08:07:39 np0005486759.ooo.test sudo[31015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:39 np0005486759.ooo.test python3[31017]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:39 np0005486759.ooo.test sudo[31015]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:39 np0005486759.ooo.test sudo[31058]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-terwluupsjjyvskpzysnqdokdxtudrnf ; /usr/bin/python3
Oct 14 08:07:39 np0005486759.ooo.test sudo[31058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:40 np0005486759.ooo.test python3[31060]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429259.440167-98606-249975928285788/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:40 np0005486759.ooo.test sudo[31058]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:40 np0005486759.ooo.test sudo[31120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgihnfsacwnqrewwaazvmqpipfuqrxko ; /usr/bin/python3
Oct 14 08:07:40 np0005486759.ooo.test sudo[31120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:40 np0005486759.ooo.test python3[31122]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:40 np0005486759.ooo.test sudo[31120]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:40 np0005486759.ooo.test sudo[31163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-powscvoeqjkemxwqsbaujlkvixicfgrl ; /usr/bin/python3
Oct 14 08:07:40 np0005486759.ooo.test sudo[31163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:41 np0005486759.ooo.test python3[31165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429260.287484-98661-228372299977921/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:41 np0005486759.ooo.test sudo[31163]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:41 np0005486759.ooo.test sudo[31225]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myfgdrozytxtcbkgynmklkilquorumaq ; /usr/bin/python3
Oct 14 08:07:41 np0005486759.ooo.test sudo[31225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:41 np0005486759.ooo.test python3[31227]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:41 np0005486759.ooo.test sudo[31225]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:42 np0005486759.ooo.test sudo[31268]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlxjplrdsyxwgxkmvntasupzwedvtkkl ; /usr/bin/python3
Oct 14 08:07:42 np0005486759.ooo.test sudo[31268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:42 np0005486759.ooo.test python3[31270]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429261.2198422-98730-167362423871501/source mode=None follow=False _original_basename=ruleset.j2 checksum=3777114572ae49edb19bdbce5ae072966e4d15fd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:42 np0005486759.ooo.test sudo[31268]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:42 np0005486759.ooo.test sudo[31298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbekonvydrhbkgjgbodoyjlfceanluww ; /usr/bin/python3
Oct 14 08:07:42 np0005486759.ooo.test sudo[31298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:42 np0005486759.ooo.test python3[31300]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:42 np0005486759.ooo.test sudo[31298]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:43 np0005486759.ooo.test sudo[31363]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsxjgvzqfxqofmxosobdywzapbjczyet ; /usr/bin/python3
Oct 14 08:07:43 np0005486759.ooo.test sudo[31363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:43 np0005486759.ooo.test python3[31365]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                      include "/etc/nftables/tripleo-chains.nft"
                                                      include "/etc/nftables/tripleo-rules.nft"
                                                      include "/etc/nftables/tripleo-jumps.nft"
                                                       state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:07:43 np0005486759.ooo.test sudo[31363]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:43 np0005486759.ooo.test sudo[31380]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbfxxmnuxfxuhrfcafxniqyloogaudoo ; /usr/bin/python3
Oct 14 08:07:43 np0005486759.ooo.test sudo[31380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:43 np0005486759.ooo.test python3[31382]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:43 np0005486759.ooo.test sudo[31380]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:43 np0005486759.ooo.test sudo[31397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqsrujknnbjvintexltvgattiymeyllp ; /usr/bin/python3
Oct 14 08:07:43 np0005486759.ooo.test sudo[31397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:44 np0005486759.ooo.test python3[31399]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:44 np0005486759.ooo.test sudo[31397]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:44 np0005486759.ooo.test sudo[31416]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpufgvjihjshmvgtirzudrsssmvjufja ; /usr/bin/python3
Oct 14 08:07:44 np0005486759.ooo.test sudo[31416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:44 np0005486759.ooo.test python3[31418]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:44 np0005486759.ooo.test sudo[31416]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:44 np0005486759.ooo.test sudo[31432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvuvfwmnapqfhntaunbnpbanquzmmowz ; /usr/bin/python3
Oct 14 08:07:44 np0005486759.ooo.test sudo[31432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:44 np0005486759.ooo.test python3[31434]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:44 np0005486759.ooo.test sudo[31432]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:44 np0005486759.ooo.test sudo[31448]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qepximibyqtlfnxmfggsrjwizareefok ; /usr/bin/python3
Oct 14 08:07:44 np0005486759.ooo.test sudo[31448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:45 np0005486759.ooo.test python3[31450]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:45 np0005486759.ooo.test sudo[31448]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:45 np0005486759.ooo.test sudo[31464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecuxzilprqavkdgwyqmkssgvrcqvbklc ; /usr/bin/python3
Oct 14 08:07:45 np0005486759.ooo.test sudo[31464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:45 np0005486759.ooo.test python3[31466]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 14 08:07:46 np0005486759.ooo.test sudo[31464]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:46 np0005486759.ooo.test sudo[31484]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-celsxqbacaovtccjpiwcnruhosmewtzx ; /usr/bin/python3
Oct 14 08:07:46 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Oct 14 08:07:46 np0005486759.ooo.test sudo[31484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:46 np0005486759.ooo.test python3[31486]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  Converting 2691 SID table entries...
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 08:07:47 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 08:07:47 np0005486759.ooo.test sudo[31484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:47 np0005486759.ooo.test sudo[31505]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulmtomsmfbmthxuwbcgplzfdtbhxfvlu ; /usr/bin/python3
Oct 14 08:07:47 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct 14 08:07:47 np0005486759.ooo.test sudo[31505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:48 np0005486759.ooo.test python3[31507]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  Converting 2691 SID table entries...
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 08:07:48 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 08:07:49 np0005486759.ooo.test sudo[31505]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:49 np0005486759.ooo.test sudo[31553]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjhhziezzqorxkbpyucupcpthqggmpet ; /usr/bin/python3
Oct 14 08:07:49 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct 14 08:07:49 np0005486759.ooo.test sudo[31553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:49 np0005486759.ooo.test python3[31555]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  Converting 2691 SID table entries...
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 08:07:50 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 08:07:50 np0005486759.ooo.test sudo[31553]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:50 np0005486759.ooo.test sudo[31574]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knjninmtanxuuvkhigctoxkatpuezwos ; /usr/bin/python3
Oct 14 08:07:50 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct 14 08:07:50 np0005486759.ooo.test sudo[31574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:50 np0005486759.ooo.test python3[31576]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:50 np0005486759.ooo.test sudo[31574]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:50 np0005486759.ooo.test sudo[31590]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfsmuavoipibqehwryhrtihhfjfkgqgn ; /usr/bin/python3
Oct 14 08:07:50 np0005486759.ooo.test sudo[31590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:50 np0005486759.ooo.test python3[31592]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:51 np0005486759.ooo.test sudo[31590]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:51 np0005486759.ooo.test sudo[31606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzwdzpmzxlfaewcyrilnysmxddupiafq ; /usr/bin/python3
Oct 14 08:07:51 np0005486759.ooo.test sudo[31606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:51 np0005486759.ooo.test python3[31608]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:51 np0005486759.ooo.test sudo[31606]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:51 np0005486759.ooo.test sudo[31622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwvllgcvjvdkocbagceqmasdmkfophna ; /usr/bin/python3
Oct 14 08:07:51 np0005486759.ooo.test sudo[31622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:51 np0005486759.ooo.test python3[31624]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:07:51 np0005486759.ooo.test sudo[31622]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:51 np0005486759.ooo.test sudo[31638]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cucgsawmjbcmtchzkomddcajvxewjlnx ; /usr/bin/python3
Oct 14 08:07:51 np0005486759.ooo.test sudo[31638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:51 np0005486759.ooo.test python3[31640]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:07:52 np0005486759.ooo.test sudo[31638]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:52 np0005486759.ooo.test sudo[31655]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waaxtpdlathfbzorctsckbazcqjfuuef ; /usr/bin/python3
Oct 14 08:07:52 np0005486759.ooo.test sudo[31655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:52 np0005486759.ooo.test python3[31657]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:07:55 np0005486759.ooo.test sudo[31655]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:56 np0005486759.ooo.test sudo[31672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zedrxzafibnthcsvcmkzwhsglrjfjfaz ; /usr/bin/python3
Oct 14 08:07:56 np0005486759.ooo.test sudo[31672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:56 np0005486759.ooo.test python3[31674]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:56 np0005486759.ooo.test sudo[31672]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:56 np0005486759.ooo.test sudo[31720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnphxffvjkawbooclkaivqkmbassqbha ; /usr/bin/python3
Oct 14 08:07:56 np0005486759.ooo.test sudo[31720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:56 np0005486759.ooo.test python3[31722]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:56 np0005486759.ooo.test sudo[31720]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:56 np0005486759.ooo.test sudo[31763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uetldppevzvawitzdqqgvyfiucxfguyp ; /usr/bin/python3
Oct 14 08:07:56 np0005486759.ooo.test sudo[31763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:57 np0005486759.ooo.test python3[31765]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429276.3956356-99003-62139490337762/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:57 np0005486759.ooo.test sudo[31763]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:57 np0005486759.ooo.test sudo[31793]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bczijeuhqbvfuzwqyeaycuouleozishy ; /usr/bin/python3
Oct 14 08:07:57 np0005486759.ooo.test sudo[31793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:57 np0005486759.ooo.test python3[31795]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 08:07:57 np0005486759.ooo.test systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 14 08:07:57 np0005486759.ooo.test systemd[1]: Stopped Load Kernel Modules.
Oct 14 08:07:57 np0005486759.ooo.test systemd[1]: Stopping Load Kernel Modules...
Oct 14 08:07:57 np0005486759.ooo.test systemd[1]: Starting Load Kernel Modules...
Oct 14 08:07:57 np0005486759.ooo.test kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 14 08:07:57 np0005486759.ooo.test systemd-modules-load[31798]: Inserted module 'br_netfilter'
Oct 14 08:07:57 np0005486759.ooo.test systemd-modules-load[31798]: Module 'msr' is built in
Oct 14 08:07:57 np0005486759.ooo.test kernel: Bridge firewalling registered
Oct 14 08:07:57 np0005486759.ooo.test systemd[1]: Finished Load Kernel Modules.
Oct 14 08:07:57 np0005486759.ooo.test sudo[31793]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:57 np0005486759.ooo.test sudo[31848]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phvbiyuinwntqgqlnqwvfqceoqrtqafc ; /usr/bin/python3
Oct 14 08:07:57 np0005486759.ooo.test sudo[31848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:58 np0005486759.ooo.test python3[31850]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:07:58 np0005486759.ooo.test sudo[31848]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:58 np0005486759.ooo.test sudo[31891]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxcvutdwvabskyemturuwvfaebgcdhhc ; /usr/bin/python3
Oct 14 08:07:58 np0005486759.ooo.test sudo[31891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:58 np0005486759.ooo.test python3[31893]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429277.7484565-99023-109805943261962/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:07:58 np0005486759.ooo.test sudo[31891]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:58 np0005486759.ooo.test sudo[31921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooiqthrgeaebsvqffrmmyluidxbjqman ; /usr/bin/python3
Oct 14 08:07:58 np0005486759.ooo.test sudo[31921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:59 np0005486759.ooo.test python3[31923]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:07:59 np0005486759.ooo.test sudo[31921]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:59 np0005486759.ooo.test sudo[31939]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnlpwecpkslusbchwulmhfmcparbzjwl ; /usr/bin/python3
Oct 14 08:07:59 np0005486759.ooo.test sudo[31939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:59 np0005486759.ooo.test python3[31941]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:07:59 np0005486759.ooo.test sudo[31939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:59 np0005486759.ooo.test sudo[31957]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqpldplcynpiinmprbxihxczcbnqfzyf ; /usr/bin/python3
Oct 14 08:07:59 np0005486759.ooo.test sudo[31957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:59 np0005486759.ooo.test python3[31959]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:07:59 np0005486759.ooo.test sudo[31957]: pam_unix(sudo:session): session closed for user root
Oct 14 08:07:59 np0005486759.ooo.test sudo[31975]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofzxpgmkpfmftlbkmxmsmatbrqosztad ; /usr/bin/python3
Oct 14 08:07:59 np0005486759.ooo.test sudo[31975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:07:59 np0005486759.ooo.test python3[31977]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:07:59 np0005486759.ooo.test sudo[31975]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:00 np0005486759.ooo.test sudo[31992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrvmhwdrtpnuhkudkjylfuzdsjvqoaxk ; /usr/bin/python3
Oct 14 08:08:00 np0005486759.ooo.test sudo[31992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:00 np0005486759.ooo.test python3[31994]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:00 np0005486759.ooo.test sudo[31992]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:00 np0005486759.ooo.test sudo[32009]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzkpfspqzmfasdjarsjaqthykmoyxxtf ; /usr/bin/python3
Oct 14 08:08:00 np0005486759.ooo.test sudo[32009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:00 np0005486759.ooo.test python3[32011]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:00 np0005486759.ooo.test sudo[32009]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:00 np0005486759.ooo.test sudo[32026]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jukejahgdxxpiovgjdykbhahpwztmuth ; /usr/bin/python3
Oct 14 08:08:00 np0005486759.ooo.test sudo[32026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:00 np0005486759.ooo.test python3[32028]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:00 np0005486759.ooo.test sudo[32026]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:00 np0005486759.ooo.test sudo[32044]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhbkexsabyxrrtmizmiudujnpvywphvr ; /usr/bin/python3
Oct 14 08:08:00 np0005486759.ooo.test sudo[32044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:01 np0005486759.ooo.test python3[32046]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:01 np0005486759.ooo.test sudo[32044]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:01 np0005486759.ooo.test sudo[32062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uelytmnzygepxckojwhcywhkjhmdqjes ; /usr/bin/python3
Oct 14 08:08:01 np0005486759.ooo.test sudo[32062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:01 np0005486759.ooo.test python3[32064]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:01 np0005486759.ooo.test sudo[32062]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:01 np0005486759.ooo.test sudo[32080]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekauupfpeyglgvdpvyrnnlgmbnvncbhy ; /usr/bin/python3
Oct 14 08:08:01 np0005486759.ooo.test sudo[32080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:01 np0005486759.ooo.test python3[32082]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:01 np0005486759.ooo.test sudo[32080]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:01 np0005486759.ooo.test sudo[32098]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oeubvjfgpnbnyvjztihtedxgonpkevpm ; /usr/bin/python3
Oct 14 08:08:01 np0005486759.ooo.test sudo[32098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:02 np0005486759.ooo.test python3[32100]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:02 np0005486759.ooo.test sudo[32098]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:02 np0005486759.ooo.test sudo[32116]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxtlqqscoxgnpqciijlcvwfsjbfctycb ; /usr/bin/python3
Oct 14 08:08:02 np0005486759.ooo.test sudo[32116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:02 np0005486759.ooo.test python3[32118]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:02 np0005486759.ooo.test sudo[32116]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:02 np0005486759.ooo.test sudo[32134]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euxnnuqwwdpqkyudoocgmwlmnbxwgjfh ; /usr/bin/python3
Oct 14 08:08:02 np0005486759.ooo.test sudo[32134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:02 np0005486759.ooo.test python3[32136]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:02 np0005486759.ooo.test sudo[32134]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:02 np0005486759.ooo.test sudo[32152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaxulpulyeqerfyvuzonuvlfvnvobapz ; /usr/bin/python3
Oct 14 08:08:02 np0005486759.ooo.test sudo[32152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:03 np0005486759.ooo.test python3[32154]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:03 np0005486759.ooo.test sudo[32152]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:03 np0005486759.ooo.test sudo[32169]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjylweipzxwbnscwlnslntnvkxqenkhw ; /usr/bin/python3
Oct 14 08:08:03 np0005486759.ooo.test sudo[32169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:03 np0005486759.ooo.test python3[32171]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:03 np0005486759.ooo.test sudo[32169]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:03 np0005486759.ooo.test sudo[32186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxvlemvuxfwclsndflggzktcfpjlxrvp ; /usr/bin/python3
Oct 14 08:08:03 np0005486759.ooo.test sudo[32186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:03 np0005486759.ooo.test python3[32188]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:03 np0005486759.ooo.test sudo[32186]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:03 np0005486759.ooo.test sudo[32203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vizlzmwimoowzpveygqvzoogicluijfk ; /usr/bin/python3
Oct 14 08:08:03 np0005486759.ooo.test sudo[32203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:04 np0005486759.ooo.test python3[32205]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:04 np0005486759.ooo.test sudo[32203]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:04 np0005486759.ooo.test sudo[32220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxcrjfwnhvfgkvgxdnnjcqwkubvidwkl ; /usr/bin/python3
Oct 14 08:08:04 np0005486759.ooo.test sudo[32220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:04 np0005486759.ooo.test python3[32222]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Oct 14 08:08:04 np0005486759.ooo.test sudo[32220]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:04 np0005486759.ooo.test sudo[32238]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pibgynopslhfxyiagvtjkcbwpkdflopz ; /usr/bin/python3
Oct 14 08:08:04 np0005486759.ooo.test sudo[32238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:04 np0005486759.ooo.test python3[32240]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 08:08:04 np0005486759.ooo.test systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 14 08:08:04 np0005486759.ooo.test systemd[1]: Stopped Apply Kernel Variables.
Oct 14 08:08:04 np0005486759.ooo.test systemd[1]: Stopping Apply Kernel Variables...
Oct 14 08:08:04 np0005486759.ooo.test systemd[1]: Starting Apply Kernel Variables...
Oct 14 08:08:04 np0005486759.ooo.test systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 14 08:08:04 np0005486759.ooo.test systemd[1]: Finished Apply Kernel Variables.
Oct 14 08:08:04 np0005486759.ooo.test sudo[32238]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:05 np0005486759.ooo.test sudo[32258]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-livpjikvgcmffnaefipjnpgsvxadekss ; /usr/bin/python3
Oct 14 08:08:05 np0005486759.ooo.test sudo[32258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:05 np0005486759.ooo.test python3[32260]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:05 np0005486759.ooo.test sudo[32258]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:05 np0005486759.ooo.test sudo[32274]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blqxbjvknikybcvxkrzrdaeclbdyhreb ; /usr/bin/python3
Oct 14 08:08:05 np0005486759.ooo.test sudo[32274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:05 np0005486759.ooo.test python3[32276]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:05 np0005486759.ooo.test sudo[32274]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:05 np0005486759.ooo.test sudo[32290]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtnhyqiqcqnqpmurhnowznnwaeodbofz ; /usr/bin/python3
Oct 14 08:08:05 np0005486759.ooo.test sudo[32290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:05 np0005486759.ooo.test python3[32292]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:05 np0005486759.ooo.test sudo[32290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:06 np0005486759.ooo.test sudo[32306]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itsjmolkacfiliorwqgclptbetxlulok ; /usr/bin/python3
Oct 14 08:08:06 np0005486759.ooo.test sudo[32306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:06 np0005486759.ooo.test python3[32308]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:08:06 np0005486759.ooo.test sudo[32306]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:06 np0005486759.ooo.test sudo[32322]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxlzeapaodjgcoidsasdkjwplvpcmrkb ; /usr/bin/python3
Oct 14 08:08:06 np0005486759.ooo.test sudo[32322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:06 np0005486759.ooo.test python3[32324]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:06 np0005486759.ooo.test sudo[32322]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:06 np0005486759.ooo.test sudo[32338]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puqohllqlaffeghontvgghnacyivxjvs ; /usr/bin/python3
Oct 14 08:08:06 np0005486759.ooo.test sudo[32338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:06 np0005486759.ooo.test python3[32340]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:06 np0005486759.ooo.test sudo[32338]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:06 np0005486759.ooo.test sudo[32354]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuuuwqjsyqiitrrzpvjnnivbsslrxjbi ; /usr/bin/python3
Oct 14 08:08:06 np0005486759.ooo.test sudo[32354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:07 np0005486759.ooo.test python3[32356]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:07 np0005486759.ooo.test sudo[32354]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:07 np0005486759.ooo.test sudo[32370]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvwrutjdhzjgvkoyzlybiidpvfwpypif ; /usr/bin/python3
Oct 14 08:08:07 np0005486759.ooo.test sudo[32370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:07 np0005486759.ooo.test python3[32372]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:07 np0005486759.ooo.test sudo[32370]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:07 np0005486759.ooo.test sudo[32386]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gigodjzedceecexkninhvcwyneoobsla ; /usr/bin/python3
Oct 14 08:08:07 np0005486759.ooo.test sudo[32386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:07 np0005486759.ooo.test python3[32388]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:07 np0005486759.ooo.test sudo[32386]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:07 np0005486759.ooo.test sudo[32434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-entpcfeftdutbkgcvhespcjfepackvat ; /usr/bin/python3
Oct 14 08:08:07 np0005486759.ooo.test sudo[32434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:07 np0005486759.ooo.test python3[32436]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:07 np0005486759.ooo.test sudo[32434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:08 np0005486759.ooo.test sudo[32477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skgfajojcgmkhkeutodbxjvqznvaoktj ; /usr/bin/python3
Oct 14 08:08:08 np0005486759.ooo.test sudo[32477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:08 np0005486759.ooo.test python3[32479]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429287.7234468-99388-93836121853663/source _original_basename=tmpude4te1o follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:08 np0005486759.ooo.test sudo[32477]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:08 np0005486759.ooo.test sudo[32507]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otyayzizlgtccdfpipfypcfeotcjfirm ; /usr/bin/python3
Oct 14 08:08:08 np0005486759.ooo.test sudo[32507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:08 np0005486759.ooo.test python3[32509]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:08 np0005486759.ooo.test sudo[32507]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:09 np0005486759.ooo.test sudo[32524]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvizddzaxjzlafqvoapoknkyivrhwhlm ; /usr/bin/python3
Oct 14 08:08:09 np0005486759.ooo.test sudo[32524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:09 np0005486759.ooo.test python3[32526]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:09 np0005486759.ooo.test sudo[32524]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:10 np0005486759.ooo.test sudo[32540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwmbpdigmrljygcghwctxzikkqxmfnoq ; /usr/bin/python3
Oct 14 08:08:10 np0005486759.ooo.test sudo[32540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:10 np0005486759.ooo.test python3[32542]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:10 np0005486759.ooo.test sudo[32540]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:10 np0005486759.ooo.test sudo[32556]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhinajdjlqcmuctzlqynsiebhqzjiuav ; /usr/bin/python3
Oct 14 08:08:10 np0005486759.ooo.test sudo[32556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:10 np0005486759.ooo.test python3[32558]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:10 np0005486759.ooo.test sudo[32556]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:10 np0005486759.ooo.test sudo[32572]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roygnyskwoofbkwxsdsvnoiverlepuse ; /usr/bin/python3
Oct 14 08:08:10 np0005486759.ooo.test sudo[32572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:11 np0005486759.ooo.test python3[32574]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:11 np0005486759.ooo.test sudo[32572]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:11 np0005486759.ooo.test sudo[32588]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khklitvmlfhwltgdvkqfiyrexbgtgpom ; /usr/bin/python3
Oct 14 08:08:11 np0005486759.ooo.test sudo[32588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:11 np0005486759.ooo.test python3[32590]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:11 np0005486759.ooo.test sudo[32588]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:11 np0005486759.ooo.test sudo[32604]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txqfmfyerfvzossrwkoxesnxffwkgovg ; /usr/bin/python3
Oct 14 08:08:11 np0005486759.ooo.test sudo[32604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:11 np0005486759.ooo.test python3[32606]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:11 np0005486759.ooo.test sudo[32604]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:11 np0005486759.ooo.test sudo[32620]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzryilpnwqsiiecnnbjlfsuwwyfpniml ; /usr/bin/python3
Oct 14 08:08:11 np0005486759.ooo.test sudo[32620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:12 np0005486759.ooo.test python3[32622]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:12 np0005486759.ooo.test sudo[32620]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:12 np0005486759.ooo.test sudo[32636]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbhsxqmmlloampeslmiqhqhkcmcwbyqf ; /usr/bin/python3
Oct 14 08:08:12 np0005486759.ooo.test sudo[32636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:12 np0005486759.ooo.test python3[32638]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:12 np0005486759.ooo.test sudo[32636]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:12 np0005486759.ooo.test sudo[32652]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uldmxpndpsghopwgtefnvaydkfnvgkni ; /usr/bin/python3
Oct 14 08:08:12 np0005486759.ooo.test sudo[32652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:12 np0005486759.ooo.test python3[32654]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:12 np0005486759.ooo.test sudo[32652]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:12 np0005486759.ooo.test sudo[32668]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klijhyrxxqafawurifpyvpfauokisrhl ; /usr/bin/python3
Oct 14 08:08:12 np0005486759.ooo.test sudo[32668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:12 np0005486759.ooo.test python3[32670]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:12 np0005486759.ooo.test sudo[32668]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:13 np0005486759.ooo.test sudo[32684]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arkmmhumfwdyopsnschzusdjngpvncmc ; /usr/bin/python3
Oct 14 08:08:13 np0005486759.ooo.test sudo[32684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:13 np0005486759.ooo.test python3[32686]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Oct 14 08:08:13 np0005486759.ooo.test groupadd[32687]: group added to /etc/group: name=qemu, GID=107
Oct 14 08:08:13 np0005486759.ooo.test groupadd[32687]: group added to /etc/gshadow: name=qemu
Oct 14 08:08:13 np0005486759.ooo.test groupadd[32687]: new group: name=qemu, GID=107
Oct 14 08:08:13 np0005486759.ooo.test sudo[32684]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:13 np0005486759.ooo.test sudo[32706]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpaiyyzaurcvmccobfdfwcxcspjrwmfv ; /usr/bin/python3
Oct 14 08:08:13 np0005486759.ooo.test sudo[32706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:13 np0005486759.ooo.test python3[32708]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486759.ooo.test update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 14 08:08:14 np0005486759.ooo.test useradd[32710]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Oct 14 08:08:14 np0005486759.ooo.test sudo[32706]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:14 np0005486759.ooo.test sudo[32730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuopanttqedjfzwpgtpyeugsmrfpmgpf ; /usr/bin/python3
Oct 14 08:08:14 np0005486759.ooo.test sudo[32730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:14 np0005486759.ooo.test python3[32732]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Oct 14 08:08:14 np0005486759.ooo.test sudo[32730]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:14 np0005486759.ooo.test sudo[32746]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkocckoblbcvqfwajscmtjlzbcckekqu ; /usr/bin/python3
Oct 14 08:08:14 np0005486759.ooo.test sudo[32746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:14 np0005486759.ooo.test python3[32748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:14 np0005486759.ooo.test sudo[32746]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:15 np0005486759.ooo.test sudo[32795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzpczbbbrmgtsblvncotdpizocsjvczp ; /usr/bin/python3
Oct 14 08:08:15 np0005486759.ooo.test sudo[32795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:15 np0005486759.ooo.test python3[32797]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:15 np0005486759.ooo.test sudo[32795]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:15 np0005486759.ooo.test sudo[32838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-expvjapxvtphoshnvztnkvyjxuadqfsv ; /usr/bin/python3
Oct 14 08:08:15 np0005486759.ooo.test sudo[32838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:15 np0005486759.ooo.test python3[32840]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429294.9397657-99580-105606211196299/source _original_basename=tmp6lvfj6q_ follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:15 np0005486759.ooo.test sudo[32838]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:15 np0005486759.ooo.test sudo[32868]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juxtwklstniguwerdwpfolqdzaoqqppj ; /usr/bin/python3
Oct 14 08:08:15 np0005486759.ooo.test sudo[32868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:16 np0005486759.ooo.test python3[32870]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 14 08:08:16 np0005486759.ooo.test sudo[32868]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:16 np0005486759.ooo.test sudo[32888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpwukzmmawymfudfsujcdnbednwqllyb ; /usr/bin/python3
Oct 14 08:08:16 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct 14 08:08:16 np0005486759.ooo.test sudo[32888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:17 np0005486759.ooo.test python3[32890]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:17 np0005486759.ooo.test sudo[32888]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:17 np0005486759.ooo.test sudo[32904]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhpkwchbhwcqquntvxcuxmclduzmskgq ; /usr/bin/python3
Oct 14 08:08:17 np0005486759.ooo.test sudo[32904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:17 np0005486759.ooo.test python3[32906]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Oct 14 08:08:18 np0005486759.ooo.test sudo[32904]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:18 np0005486759.ooo.test sudo[32924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwzjloxkiezbozqcznrmciwolxwrxsyy ; /usr/bin/python3
Oct 14 08:08:18 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct 14 08:08:18 np0005486759.ooo.test sudo[32924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:19 np0005486759.ooo.test python3[32926]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:08:21 np0005486759.ooo.test sudo[32924]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:21 np0005486759.ooo.test sudo[32941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqungoxsdpabohegczguplskqpxqxkpi ; /usr/bin/python3
Oct 14 08:08:21 np0005486759.ooo.test sudo[32941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:21 np0005486759.ooo.test python3[32943]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 08:08:22 np0005486759.ooo.test sudo[32941]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:22 np0005486759.ooo.test sudo[33002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbwsfigxpaamcvoknlxuvppjmsnxziwb ; /usr/bin/python3
Oct 14 08:08:22 np0005486759.ooo.test sudo[33002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:22 np0005486759.ooo.test python3[33004]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:22 np0005486759.ooo.test sudo[33002]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:22 np0005486759.ooo.test sudo[33018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpcdntvhkknadbdydjvyfbbbkxxtdnsv ; /usr/bin/python3
Oct 14 08:08:22 np0005486759.ooo.test sudo[33018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:23 np0005486759.ooo.test python3[33020]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:08:23 np0005486759.ooo.test sudo[33018]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:23 np0005486759.ooo.test sudo[33077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xosjzminafvszonkeuncbsonfodlqxwe ; /usr/bin/python3
Oct 14 08:08:23 np0005486759.ooo.test sudo[33077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:23 np0005486759.ooo.test python3[33079]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:23 np0005486759.ooo.test sudo[33077]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:23 np0005486759.ooo.test sudo[33120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iptphpdpdgddamovnubzkkadubkiujel ; /usr/bin/python3
Oct 14 08:08:23 np0005486759.ooo.test sudo[33120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:23 np0005486759.ooo.test python3[33122]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429303.1973298-99757-147814407659558/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=29a3095e37518bdebc310f108d22edb035867084 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:23 np0005486759.ooo.test sudo[33120]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:24 np0005486759.ooo.test sudo[33182]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwgjaqrjeqgqglcztojuhibeviqvylvu ; /usr/bin/python3
Oct 14 08:08:24 np0005486759.ooo.test sudo[33182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:24 np0005486759.ooo.test python3[33184]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:24 np0005486759.ooo.test sudo[33182]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:24 np0005486759.ooo.test sudo[33227]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtoeqrnulotlryosadytafmgpxumgcft ; /usr/bin/python3
Oct 14 08:08:24 np0005486759.ooo.test sudo[33227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:24 np0005486759.ooo.test python3[33229]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429304.1497443-99776-91990872467993/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:24 np0005486759.ooo.test sudo[33227]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:25 np0005486759.ooo.test sudo[33257]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hapvkjgeugwnbkfezvhekporhpwmbjve ; /usr/bin/python3
Oct 14 08:08:25 np0005486759.ooo.test sudo[33257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:25 np0005486759.ooo.test python3[33259]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:25 np0005486759.ooo.test sudo[33257]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:25 np0005486759.ooo.test sudo[33273]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xspsuudryguygobulizeepccnxrsmbol ; /usr/bin/python3
Oct 14 08:08:25 np0005486759.ooo.test sudo[33273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:25 np0005486759.ooo.test python3[33275]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:25 np0005486759.ooo.test sudo[33273]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:25 np0005486759.ooo.test sudo[33289]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdlxbuoukahozduinkrhcwhoqzebjimn ; /usr/bin/python3
Oct 14 08:08:25 np0005486759.ooo.test sudo[33289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:25 np0005486759.ooo.test python3[33291]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:25 np0005486759.ooo.test sudo[33289]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:26 np0005486759.ooo.test sudo[33305]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hecshvfsomoerewuekowgurzukkodllg ; /usr/bin/python3
Oct 14 08:08:26 np0005486759.ooo.test sudo[33305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:26 np0005486759.ooo.test python3[33307]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:26 np0005486759.ooo.test sudo[33305]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:26 np0005486759.ooo.test sudo[33353]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivofcmmwvxbbetfcllxxuobhjahrkhfx ; /usr/bin/python3
Oct 14 08:08:26 np0005486759.ooo.test sudo[33353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:26 np0005486759.ooo.test python3[33355]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:26 np0005486759.ooo.test sudo[33353]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:27 np0005486759.ooo.test sudo[33396]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntagquhassbznshcahivdiemivwsvxel ; /usr/bin/python3
Oct 14 08:08:27 np0005486759.ooo.test sudo[33396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:27 np0005486759.ooo.test python3[33398]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429306.6343799-99826-24542863553486/source _original_basename=tmpm4fcr4dg follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:27 np0005486759.ooo.test sudo[33396]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:27 np0005486759.ooo.test sudo[33426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anzucfomylkdcmgwfpouluioonitkkfr ; /usr/bin/python3
Oct 14 08:08:27 np0005486759.ooo.test sudo[33426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:27 np0005486759.ooo.test python3[33428]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:27 np0005486759.ooo.test sudo[33426]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:27 np0005486759.ooo.test sudo[33442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggkpdzcdlvtjvdrdvfdzhucsbjkxhgjo ; /usr/bin/python3
Oct 14 08:08:27 np0005486759.ooo.test sudo[33442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:28 np0005486759.ooo.test python3[33444]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:28 np0005486759.ooo.test sudo[33442]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:28 np0005486759.ooo.test sudo[33458]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgffgeafstwpdfzudhfanprpisxbiygp ; /usr/bin/python3
Oct 14 08:08:28 np0005486759.ooo.test sudo[33458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:28 np0005486759.ooo.test python3[33460]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:08:31 np0005486759.ooo.test sudo[33458]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:31 np0005486759.ooo.test sudo[33507]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxqkygoxmjbcaectaroazeehrbdakhxs ; /usr/bin/python3
Oct 14 08:08:31 np0005486759.ooo.test sudo[33507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:31 np0005486759.ooo.test python3[33509]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:31 np0005486759.ooo.test sudo[33507]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:32 np0005486759.ooo.test sudo[33552]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxsqwdmyopdptgakpdgfyjggseenpynw ; /usr/bin/python3
Oct 14 08:08:32 np0005486759.ooo.test sudo[33552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:32 np0005486759.ooo.test python3[33554]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429311.4670897-100174-197821370610605/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:32 np0005486759.ooo.test sudo[33552]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:32 np0005486759.ooo.test sudo[33583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdckvjwblnrcdmmzbnooxfrujcrkuzvs ; /usr/bin/python3
Oct 14 08:08:32 np0005486759.ooo.test sudo[33583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:32 np0005486759.ooo.test python3[33585]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: Stopping OpenSSH server daemon...
Oct 14 08:08:32 np0005486759.ooo.test sshd[1132]: Received signal 15; terminating.
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: sshd.service: Deactivated successfully.
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: Stopped OpenSSH server daemon.
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: sshd.service: Consumed 5.050s CPU time, read 1.9M from disk, written 236.0K to disk.
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: Stopped target sshd-keygen.target.
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: Stopping sshd-keygen.target...
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: Reached target sshd-keygen.target.
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: Starting OpenSSH server daemon...
Oct 14 08:08:32 np0005486759.ooo.test sshd[33589]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 08:08:32 np0005486759.ooo.test sshd[33589]: Server listening on 0.0.0.0 port 22.
Oct 14 08:08:32 np0005486759.ooo.test sshd[33589]: Server listening on :: port 22.
Oct 14 08:08:32 np0005486759.ooo.test systemd[1]: Started OpenSSH server daemon.
Oct 14 08:08:32 np0005486759.ooo.test sudo[33583]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:32 np0005486759.ooo.test sudo[33603]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwwxpcpyagkdbmwzgvgtkwoufyjbthad ; /usr/bin/python3
Oct 14 08:08:32 np0005486759.ooo.test sudo[33603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:33 np0005486759.ooo.test python3[33605]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:33 np0005486759.ooo.test sudo[33603]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:33 np0005486759.ooo.test sudo[33621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umhhlnisbnwrgtjnqmwzwokplxqeanxy ; /usr/bin/python3
Oct 14 08:08:33 np0005486759.ooo.test sudo[33621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:34 np0005486759.ooo.test python3[33623]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:34 np0005486759.ooo.test sudo[33621]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:34 np0005486759.ooo.test sudo[33639]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cscetvtjwandmvcwitlzqxvqcquflesg ; /usr/bin/python3
Oct 14 08:08:34 np0005486759.ooo.test sudo[33639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:34 np0005486759.ooo.test python3[33641]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:08:37 np0005486759.ooo.test sudo[33639]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:37 np0005486759.ooo.test sudo[33688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmanelgaahwhielycdcofinbtywmfqza ; /usr/bin/python3
Oct 14 08:08:37 np0005486759.ooo.test sudo[33688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:37 np0005486759.ooo.test python3[33690]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:37 np0005486759.ooo.test sudo[33688]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:38 np0005486759.ooo.test sudo[33733]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nocrwojslsupqsnvonxddklhlqlogzya ; /usr/bin/python3
Oct 14 08:08:38 np0005486759.ooo.test sudo[33733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:38 np0005486759.ooo.test python3[33735]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429317.512304-100270-26341896508680/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:38 np0005486759.ooo.test sudo[33733]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:38 np0005486759.ooo.test sudo[33763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thqopoljzagkjwhwvcukucrcvpiqecai ; /usr/bin/python3
Oct 14 08:08:38 np0005486759.ooo.test sudo[33763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:38 np0005486759.ooo.test python3[33765]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:08:40 np0005486759.ooo.test sudo[33763]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:40 np0005486759.ooo.test sudo[33781]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqqclviysimzjhjfuzijecxnnvlsleva ; /usr/bin/python3
Oct 14 08:08:40 np0005486759.ooo.test sudo[33781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:40 np0005486759.ooo.test python3[33783]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 08:08:40 np0005486759.ooo.test systemd[1]: Stopping NTP client/server...
Oct 14 08:08:40 np0005486759.ooo.test chronyd[765]: chronyd exiting
Oct 14 08:08:40 np0005486759.ooo.test systemd[1]: chronyd.service: Deactivated successfully.
Oct 14 08:08:40 np0005486759.ooo.test systemd[1]: Stopped NTP client/server.
Oct 14 08:08:40 np0005486759.ooo.test systemd[1]: chronyd.service: Consumed 120ms CPU time, read 1.9M from disk, written 4.0K to disk.
Oct 14 08:08:40 np0005486759.ooo.test systemd[1]: Starting NTP client/server...
Oct 14 08:08:40 np0005486759.ooo.test chronyd[33790]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 14 08:08:40 np0005486759.ooo.test chronyd[33790]: Frequency -30.457 +/- 0.024 ppm read from /var/lib/chrony/drift
Oct 14 08:08:40 np0005486759.ooo.test chronyd[33790]: Loaded seccomp filter (level 2)
Oct 14 08:08:40 np0005486759.ooo.test systemd[1]: Started NTP client/server.
Oct 14 08:08:40 np0005486759.ooo.test sudo[33781]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:41 np0005486759.ooo.test sudo[33837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snkpmdsydppkeywicipaapawaiwxjnkg ; /usr/bin/python3
Oct 14 08:08:41 np0005486759.ooo.test sudo[33837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:41 np0005486759.ooo.test python3[33839]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:41 np0005486759.ooo.test sudo[33837]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:41 np0005486759.ooo.test sudo[33880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csfzxcnvclmxkyopyikqdpmugkpalhos ; /usr/bin/python3
Oct 14 08:08:41 np0005486759.ooo.test sudo[33880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:41 np0005486759.ooo.test python3[33882]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429320.8699875-100340-104561756322034/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:41 np0005486759.ooo.test sudo[33880]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:41 np0005486759.ooo.test sudo[33910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peahjqcyxrcmbnepjmboqwqndpbvlpjr ; /usr/bin/python3
Oct 14 08:08:41 np0005486759.ooo.test sudo[33910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:42 np0005486759.ooo.test python3[33912]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:08:42 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:08:42 np0005486759.ooo.test systemd-rc-local-generator[33934]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:08:42 np0005486759.ooo.test systemd-sysv-generator[33938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:08:42 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:08:42 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:08:42 np0005486759.ooo.test systemd-rc-local-generator[33974]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:08:42 np0005486759.ooo.test systemd-sysv-generator[33978]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:08:42 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:08:42 np0005486759.ooo.test systemd[1]: Starting chronyd online sources service...
Oct 14 08:08:42 np0005486759.ooo.test chronyc[33987]: 200 OK
Oct 14 08:08:42 np0005486759.ooo.test systemd[1]: chrony-online.service: Deactivated successfully.
Oct 14 08:08:42 np0005486759.ooo.test systemd[1]: Finished chronyd online sources service.
Oct 14 08:08:42 np0005486759.ooo.test sudo[33910]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:42 np0005486759.ooo.test sudo[34001]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vekaejloqcpnpmljxfxfjrkkfqsdekuj ; /usr/bin/python3
Oct 14 08:08:42 np0005486759.ooo.test sudo[34001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:42 np0005486759.ooo.test python3[34003]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:42 np0005486759.ooo.test chronyd[33790]: System clock was stepped by -0.000000 seconds
Oct 14 08:08:42 np0005486759.ooo.test sudo[34001]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:43 np0005486759.ooo.test sudo[34018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmqnveatvdumjxmjarujjlhzpiuwozts ; /usr/bin/python3
Oct 14 08:08:43 np0005486759.ooo.test sudo[34018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:43 np0005486759.ooo.test python3[34020]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:44 np0005486759.ooo.test chronyd[33790]: Selected source 162.159.200.123 (pool.ntp.org)
Oct 14 08:08:53 np0005486759.ooo.test sudo[34018]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:53 np0005486759.ooo.test sudo[34035]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbbpfzycjrxpiynxlizlkyctrybjguwc ; /usr/bin/python3
Oct 14 08:08:53 np0005486759.ooo.test sudo[34035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:53 np0005486759.ooo.test python3[34037]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:53 np0005486759.ooo.test chronyd[33790]: System clock was stepped by -0.000001 seconds
Oct 14 08:08:53 np0005486759.ooo.test sudo[34035]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:53 np0005486759.ooo.test sudo[34052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npmeedtrwdzstcwuthvkkqbcdsbywzie ; /usr/bin/python3
Oct 14 08:08:53 np0005486759.ooo.test sudo[34052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:54 np0005486759.ooo.test python3[34054]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:54 np0005486759.ooo.test sudo[34052]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:54 np0005486759.ooo.test sudo[34069]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcrcvmxtspufkbslxbmouivxhpkdnqut ; /usr/bin/python3
Oct 14 08:08:54 np0005486759.ooo.test sudo[34069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:54 np0005486759.ooo.test python3[34071]: ansible-timezone Invoked with name=UTC hwclock=None
Oct 14 08:08:54 np0005486759.ooo.test systemd[1]: Starting Time & Date Service...
Oct 14 08:08:54 np0005486759.ooo.test systemd[1]: Started Time & Date Service.
Oct 14 08:08:54 np0005486759.ooo.test sudo[34069]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:55 np0005486759.ooo.test sudo[34089]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyumhaisyftfxidrokoqswemvvsfxecn ; /usr/bin/python3
Oct 14 08:08:55 np0005486759.ooo.test sudo[34089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:55 np0005486759.ooo.test python3[34091]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:55 np0005486759.ooo.test sudo[34089]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:55 np0005486759.ooo.test sudo[34106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awipqrtpufnokobuoaycznyngkmqrfqv ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Oct 14 08:08:55 np0005486759.ooo.test sudo[34106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:56 np0005486759.ooo.test python3[34108]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:08:56 np0005486759.ooo.test sudo[34106]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:56 np0005486759.ooo.test sudo[34123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjdwcqybnwuiygyciupvpcznepufwvge ; /usr/bin/python3
Oct 14 08:08:56 np0005486759.ooo.test sudo[34123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:56 np0005486759.ooo.test python3[34125]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Oct 14 08:08:56 np0005486759.ooo.test sudo[34123]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:56 np0005486759.ooo.test sudo[34139]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaiuwwfkvthnetarcrvimxpupunuzguy ; /usr/bin/python3
Oct 14 08:08:56 np0005486759.ooo.test sudo[34139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:56 np0005486759.ooo.test python3[34141]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:08:56 np0005486759.ooo.test sudo[34139]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:57 np0005486759.ooo.test sudo[34155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npnyqmvfzehrolmhlrbsphuhvigwslka ; /usr/bin/python3
Oct 14 08:08:57 np0005486759.ooo.test sudo[34155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:57 np0005486759.ooo.test python3[34157]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:57 np0005486759.ooo.test sudo[34155]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:57 np0005486759.ooo.test sudo[34171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcatrqckyuitybydgdmciekegjstiuwq ; /usr/bin/python3
Oct 14 08:08:57 np0005486759.ooo.test sudo[34171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:57 np0005486759.ooo.test python3[34173]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:08:57 np0005486759.ooo.test sudo[34171]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:58 np0005486759.ooo.test sudo[34219]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yygfyxlslpzymtstbiqimkelycncgbuq ; /usr/bin/python3
Oct 14 08:08:58 np0005486759.ooo.test sudo[34219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:58 np0005486759.ooo.test python3[34221]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:58 np0005486759.ooo.test sudo[34219]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:58 np0005486759.ooo.test sudo[34262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbbkapeqbndbdzupfvgxxijluvuxioqt ; /usr/bin/python3
Oct 14 08:08:58 np0005486759.ooo.test sudo[34262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:58 np0005486759.ooo.test python3[34264]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429338.0639985-100673-6717416222206/source _original_basename=tmpg4i5dxcr follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:58 np0005486759.ooo.test sudo[34262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:58 np0005486759.ooo.test sudo[34324]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-noinpniywmefxvondcegqabgstcdcvfm ; /usr/bin/python3
Oct 14 08:08:58 np0005486759.ooo.test sudo[34324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:59 np0005486759.ooo.test python3[34326]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:08:59 np0005486759.ooo.test sudo[34324]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:59 np0005486759.ooo.test sudo[34367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdkyyfxvwlgkknymkmkarjbjspqzhiam ; /usr/bin/python3
Oct 14 08:08:59 np0005486759.ooo.test sudo[34367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:59 np0005486759.ooo.test python3[34369]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429338.8443232-100695-134509746757553/source _original_basename=tmpt95aslde follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:08:59 np0005486759.ooo.test sudo[34367]: pam_unix(sudo:session): session closed for user root
Oct 14 08:08:59 np0005486759.ooo.test sudo[34397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stsaoeqijhzqwrsdpjztjlfbjgbsnadh ; /usr/bin/python3
Oct 14 08:08:59 np0005486759.ooo.test sudo[34397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:08:59 np0005486759.ooo.test python3[34399]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 14 08:08:59 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:09:00 np0005486759.ooo.test systemd-rc-local-generator[34428]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:09:00 np0005486759.ooo.test systemd-sysv-generator[34432]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:09:00 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:09:00 np0005486759.ooo.test sudo[34397]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:00 np0005486759.ooo.test sudo[34451]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdaundosfascyyrvtomnqfzjezjogxad ; /usr/bin/python3
Oct 14 08:09:00 np0005486759.ooo.test sudo[34451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:00 np0005486759.ooo.test python3[34453]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:09:00 np0005486759.ooo.test sudo[34451]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:00 np0005486759.ooo.test sudo[34467]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrfrvrwrgnrtmffxujshynegdbgrpeqa ; /usr/bin/python3
Oct 14 08:09:00 np0005486759.ooo.test sudo[34467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:00 np0005486759.ooo.test python3[34469]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:09:00 np0005486759.ooo.test sudo[34467]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:00 np0005486759.ooo.test sudo[34484]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgyjfbauzguoxgvozxafdjpkvvoknpll ; /usr/bin/python3
Oct 14 08:09:00 np0005486759.ooo.test sudo[34484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:01 np0005486759.ooo.test python3[34486]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:09:01 np0005486759.ooo.test systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Oct 14 08:09:01 np0005486759.ooo.test sudo[34484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:01 np0005486759.ooo.test sudo[34502]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmazjcegaednuwrvoqqhkqtioymjzmkz ; /usr/bin/python3
Oct 14 08:09:01 np0005486759.ooo.test sudo[34502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:01 np0005486759.ooo.test python3[34504]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:09:01 np0005486759.ooo.test sudo[34502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:01 np0005486759.ooo.test sudo[34518]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlxxkvshblhpaoeggogdcutckrvsuwew ; /usr/bin/python3
Oct 14 08:09:01 np0005486759.ooo.test sudo[34518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:01 np0005486759.ooo.test python3[34520]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:09:01 np0005486759.ooo.test sudo[34518]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:02 np0005486759.ooo.test sudo[34566]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfhucnklahgdhlvdqwrzscuqpuyngvhd ; /usr/bin/python3
Oct 14 08:09:02 np0005486759.ooo.test sudo[34566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:02 np0005486759.ooo.test python3[34568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:09:02 np0005486759.ooo.test sudo[34566]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:02 np0005486759.ooo.test sudo[34609]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uajzmiqijseavmwxpkcdpgivkvoduhjc ; /usr/bin/python3
Oct 14 08:09:02 np0005486759.ooo.test sudo[34609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:02 np0005486759.ooo.test python3[34611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429341.9180715-101005-40436887630589/source _original_basename=tmplb0z_2d9 follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:09:02 np0005486759.ooo.test sudo[34609]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:03 np0005486759.ooo.test sudo[34639]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yahikklcswebmfkrltjdxisrlnhzoucd ; /usr/bin/python3
Oct 14 08:09:03 np0005486759.ooo.test sudo[34639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:03 np0005486759.ooo.test python3[34641]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:09:03 np0005486759.ooo.test sudo[34639]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:03 np0005486759.ooo.test sudo[34655]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmafzkzjifkytosjdzqwqvwcxjgyrzfu ; /usr/bin/python3
Oct 14 08:09:03 np0005486759.ooo.test sudo[34655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:03 np0005486759.ooo.test python3[34657]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Oct 14 08:09:03 np0005486759.ooo.test sudo[34655]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:03 np0005486759.ooo.test sudo[34671]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdbkxjffxgaqlwcuqjeqlwlgiztopmnh ; /usr/bin/python3
Oct 14 08:09:03 np0005486759.ooo.test sudo[34671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:03 np0005486759.ooo.test python3[34673]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:09:03 np0005486759.ooo.test sudo[34671]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:04 np0005486759.ooo.test sudo[34687]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuqnrvekrwlfervtzalrufonpubaxgeh ; /usr/bin/python3
Oct 14 08:09:04 np0005486759.ooo.test sudo[34687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:04 np0005486759.ooo.test python3[34689]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:09:04 np0005486759.ooo.test sudo[34687]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:04 np0005486759.ooo.test sudo[34703]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhkekclkqnrnribkflqcuixkaixmoasg ; /usr/bin/python3
Oct 14 08:09:04 np0005486759.ooo.test sudo[34703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:04 np0005486759.ooo.test python3[34705]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:09:04 np0005486759.ooo.test sudo[34703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:04 np0005486759.ooo.test sudo[34719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trjjqawwypyrjiuqjpfzsavpkctmkcck ; /usr/bin/python3
Oct 14 08:09:04 np0005486759.ooo.test sudo[34719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:05 np0005486759.ooo.test python3[34721]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  Converting 2696 SID table entries...
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 08:09:05 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 08:09:05 np0005486759.ooo.test sudo[34719]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:06 np0005486759.ooo.test sudo[34742]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynbddgvkchcwvmormgeypmfaiyrqiiqp ; /usr/bin/python3
Oct 14 08:09:06 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct 14 08:09:06 np0005486759.ooo.test sudo[34742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:06 np0005486759.ooo.test python3[34744]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:09:06 np0005486759.ooo.test sudo[34742]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:06 np0005486759.ooo.test sudo[34758]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deymoeuaxyjtvqlsqtnvhmgcccfbhzfj ; /usr/bin/python3
Oct 14 08:09:06 np0005486759.ooo.test sudo[34758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:06 np0005486759.ooo.test sudo[34758]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:06 np0005486759.ooo.test sudo[34806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osnjbtmxlpdiasrgwrysvuboyubyxwdo ; /usr/bin/python3
Oct 14 08:09:06 np0005486759.ooo.test sudo[34806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:07 np0005486759.ooo.test sudo[34806]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:07 np0005486759.ooo.test sudo[34849]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-batbqhhrwminoyraqdpaocwsztdwlqqy ; /usr/bin/python3
Oct 14 08:09:07 np0005486759.ooo.test sudo[34849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:07 np0005486759.ooo.test sudo[34849]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:07 np0005486759.ooo.test sudo[34879]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfhxnvbdibyiqcbebtuuumeovgapwtfg ; /usr/bin/python3
Oct 14 08:09:07 np0005486759.ooo.test sudo[34879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:08 np0005486759.ooo.test python3[34881]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': {'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}}}
Oct 14 08:09:08 np0005486759.ooo.test sudo[34879]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:08 np0005486759.ooo.test sudo[34895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvdrwnldimxjesylogjszjfgiabdytlm ; /usr/bin/python3
Oct 14 08:09:08 np0005486759.ooo.test rsyslogd[758]: message too long (29078) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Oct 14 08:09:08 np0005486759.ooo.test sudo[34895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:08 np0005486759.ooo.test python3[34897]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:09:08 np0005486759.ooo.test sudo[34895]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:08 np0005486759.ooo.test sudo[34911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enhpyewxicavkcmyybmjjkxmlbcnqzyv ; /usr/bin/python3
Oct 14 08:09:08 np0005486759.ooo.test sudo[34911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:08 np0005486759.ooo.test python3[34913]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:09:08 np0005486759.ooo.test sudo[34911]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:09 np0005486759.ooo.test sudo[34927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmaceeimdisvauyvxouhcbhfmftfmdwn ; /usr/bin/python3
Oct 14 08:09:09 np0005486759.ooo.test sudo[34927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:09 np0005486759.ooo.test python3[34929]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': 
'/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Oct 14 08:09:09 np0005486759.ooo.test sudo[34927]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:11 np0005486759.ooo.test sudo[34975]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgylzcuehpxotbxmerzeoiusxureaxql ; /usr/bin/python3
Oct 14 08:09:11 np0005486759.ooo.test sudo[34975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:11 np0005486759.ooo.test python3[34977]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:09:11 np0005486759.ooo.test sudo[34975]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:11 np0005486759.ooo.test sudo[35018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhrphigfqxztxpvviegxtecvitdfhqtm ; /usr/bin/python3
Oct 14 08:09:11 np0005486759.ooo.test sudo[35018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:11 np0005486759.ooo.test python3[35020]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429351.0642521-101144-236027798266844/source _original_basename=tmplhootjzx follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:09:11 np0005486759.ooo.test sudo[35018]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:11 np0005486759.ooo.test sudo[35048]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drqmwwowbwaulzzgqobgbkzptxakpotd ; /usr/bin/python3
Oct 14 08:09:11 np0005486759.ooo.test sudo[35048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:12 np0005486759.ooo.test python3[35050]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:09:12 np0005486759.ooo.test sudo[35048]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:12 np0005486759.ooo.test sudo[35098]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stiekjdpxxvczbqoiamrlzitjwxiwwjw ; /usr/bin/python3
Oct 14 08:09:12 np0005486759.ooo.test sudo[35098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:12 np0005486759.ooo.test sudo[35098]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:13 np0005486759.ooo.test sudo[35141]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdqbajtvffrgzuvcxscbbzrnjbtarhlt ; /usr/bin/python3
Oct 14 08:09:13 np0005486759.ooo.test sudo[35141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:13 np0005486759.ooo.test sudo[35141]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:13 np0005486759.ooo.test sudo[35171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ettlahbmknmmbvksbhtfoxvjplubmkuc ; /usr/bin/python3
Oct 14 08:09:13 np0005486759.ooo.test sudo[35171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:13 np0005486759.ooo.test python3[35173]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:09:13 np0005486759.ooo.test sudo[35171]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:14 np0005486759.ooo.test sudo[35219]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mogtzkxbzuqwnbzpqhstuidbymoywcji ; /usr/bin/python3
Oct 14 08:09:14 np0005486759.ooo.test sudo[35219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:14 np0005486759.ooo.test sudo[35219]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:14 np0005486759.ooo.test sudo[35262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkpnisaaalhyhjatmwoatsbhwduwobyd ; /usr/bin/python3
Oct 14 08:09:14 np0005486759.ooo.test sudo[35262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:14 np0005486759.ooo.test sudo[35262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:14 np0005486759.ooo.test sudo[35292]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwnkykkdoyfpyeuzxghlwobleujgtjoe ; /usr/bin/python3
Oct 14 08:09:14 np0005486759.ooo.test sudo[35292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:15 np0005486759.ooo.test python3[35294]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 08:09:15 np0005486759.ooo.test sudo[35292]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:16 np0005486759.ooo.test sudo[35308]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhglurpyhaxmhkhtisazcxzacwixptbb ; /usr/bin/python3
Oct 14 08:09:16 np0005486759.ooo.test sudo[35308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:16 np0005486759.ooo.test python3[35310]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:09:16 np0005486759.ooo.test sudo[35308]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:17 np0005486759.ooo.test sudo[35325]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztztgkielfjdjrlaarsqpclzyxzwultz ; /usr/bin/python3
Oct 14 08:09:17 np0005486759.ooo.test sudo[35325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:17 np0005486759.ooo.test python3[35327]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:09:21 np0005486759.ooo.test dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Oct 14 08:09:21 np0005486759.ooo.test dbus-broker-launch[11988]: Noticed file-system modification, trigger reload.
Oct 14 08:09:21 np0005486759.ooo.test dbus-broker-launch[11988]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 14 08:09:21 np0005486759.ooo.test dbus-broker-launch[11988]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 14 08:09:21 np0005486759.ooo.test dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Oct 14 08:09:21 np0005486759.ooo.test systemd[1]: Reexecuting.
Oct 14 08:09:21 np0005486759.ooo.test systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 14 08:09:21 np0005486759.ooo.test systemd[1]: Detected virtualization kvm.
Oct 14 08:09:21 np0005486759.ooo.test systemd[1]: Detected architecture x86-64.
Oct 14 08:09:21 np0005486759.ooo.test systemd-sysv-generator[35384]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:09:21 np0005486759.ooo.test systemd-rc-local-generator[35379]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:09:21 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:09:24 np0005486759.ooo.test systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  Converting 2696 SID table entries...
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 08:09:29 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 08:09:29 np0005486759.ooo.test dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Oct 14 08:09:29 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct 14 08:09:29 np0005486759.ooo.test dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Oct 14 08:09:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:09:30 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 08:09:30 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:09:31 np0005486759.ooo.test systemd-rc-local-generator[35502]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:09:31 np0005486759.ooo.test systemd-sysv-generator[35507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 08:09:31 np0005486759.ooo.test systemd-journald[618]: Journal stopped
Oct 14 08:09:31 np0005486759.ooo.test systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Stopping Journal Service...
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Stopped Journal Service.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: systemd-journald.service: Consumed 1.405s CPU time.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Starting Journal Service...
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: systemd-udevd.service: Consumed 2.133s CPU time.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 14 08:09:31 np0005486759.ooo.test systemd-journald[35787]: Journal started
Oct 14 08:09:31 np0005486759.ooo.test systemd-journald[35787]: Runtime Journal (/run/log/journal/8e1d5208cffec42b50976967e1d1cfd0) is 11.8M, max 314.7M, 302.9M free.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Started Journal Service.
Oct 14 08:09:31 np0005486759.ooo.test systemd-udevd[35801]: Using default interface naming scheme 'rhel-9.0'.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:09:31 np0005486759.ooo.test systemd-rc-local-generator[36408]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:09:31 np0005486759.ooo.test systemd-sysv-generator[36413]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 08:09:31 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 08:09:32 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Consumed 1.141s CPU time.
Oct 14 08:09:32 np0005486759.ooo.test systemd[1]: run-r361db59e87f140de9053956b598ac835.service: Deactivated successfully.
Oct 14 08:09:32 np0005486759.ooo.test systemd[1]: run-r391731533b434d2183234ddc3d279ca6.service: Deactivated successfully.
Oct 14 08:09:33 np0005486759.ooo.test sudo[35325]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:33 np0005486759.ooo.test sudo[36820]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cimpxacxgcpgrcfhzvtmgoxupfmnqcra ; /usr/bin/python3
Oct 14 08:09:33 np0005486759.ooo.test sudo[36820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:33 np0005486759.ooo.test python3[36822]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Oct 14 08:09:33 np0005486759.ooo.test sudo[36820]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:33 np0005486759.ooo.test sudo[36839]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpgazrvsgqpoggqzhoaywrdqcsbxymsw ; /usr/bin/python3
Oct 14 08:09:33 np0005486759.ooo.test sudo[36839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:33 np0005486759.ooo.test python3[36841]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:09:34 np0005486759.ooo.test sudo[36839]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:34 np0005486759.ooo.test sudo[36857]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqtkvwfqlnsfhhuwklveczykogdldooo ; /usr/bin/python3
Oct 14 08:09:34 np0005486759.ooo.test sudo[36857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:34 np0005486759.ooo.test python3[36859]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:09:34 np0005486759.ooo.test python3[36859]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Oct 14 08:09:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:09:34 np0005486759.ooo.test python3[36859]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Oct 14 08:09:39 np0005486759.ooo.test kernel: VFS: idmapped mount is not enabled.
Oct 14 08:09:42 np0005486759.ooo.test podman[36871]: 2025-10-14 08:09:35.028266715 +0000 UTC m=+0.045595696 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Oct 14 08:09:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:09:42 np0005486759.ooo.test python3[36859]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 1571c200d626c35388c5864f613dd17fb1618f6192fe622da60a47fa61763c46 --format json
Oct 14 08:09:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:09:42 np0005486759.ooo.test sudo[36857]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:43 np0005486759.ooo.test sudo[37018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frgwjqcngfkvtkpghndfbqkuhsczzbtm ; /usr/bin/python3
Oct 14 08:09:43 np0005486759.ooo.test sudo[37018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:43 np0005486759.ooo.test python3[37020]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:09:43 np0005486759.ooo.test python3[37020]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Oct 14 08:09:43 np0005486759.ooo.test python3[37020]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Oct 14 08:09:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:09:52 np0005486759.ooo.test podman[37033]: 2025-10-14 08:09:43.341873886 +0000 UTC m=+0.031619956 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Oct 14 08:09:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:09:52 np0005486759.ooo.test python3[37020]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 1e3eee8f9b979ec527f69dda079bc969bf9ddbe65c90f0543f3891d72e56a75e --format json
Oct 14 08:09:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:09:52 np0005486759.ooo.test sudo[37018]: pam_unix(sudo:session): session closed for user root
Oct 14 08:09:53 np0005486759.ooo.test sudo[37191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wajvuysxlnmpbjuuqwmbxvjyglmyupai ; /usr/bin/python3
Oct 14 08:09:53 np0005486759.ooo.test sudo[37191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:09:53 np0005486759.ooo.test python3[37193]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:09:53 np0005486759.ooo.test python3[37193]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Oct 14 08:09:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:09:53 np0005486759.ooo.test python3[37193]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Oct 14 08:10:10 np0005486759.ooo.test podman[37206]: 2025-10-14 08:09:53.336744989 +0000 UTC m=+0.040975564 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:10:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:10 np0005486759.ooo.test systemd[24104]: Created slice User Background Tasks Slice.
Oct 14 08:10:10 np0005486759.ooo.test systemd[24104]: Starting Cleanup of User's Temporary Files and Directories...
Oct 14 08:10:10 np0005486759.ooo.test python3[37193]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a56a2196ea2290002b5e3e60b4c440f2326e4f1173ca4d9c0a320716a756e568 --format json
Oct 14 08:10:10 np0005486759.ooo.test systemd[24104]: Finished Cleanup of User's Temporary Files and Directories.
Oct 14 08:10:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:10 np0005486759.ooo.test sudo[37191]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:10 np0005486759.ooo.test sudo[37405]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krwfrwaudzvknmxdkgvdyolrbrnsfnez ; /usr/bin/python3
Oct 14 08:10:11 np0005486759.ooo.test sudo[37405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:11 np0005486759.ooo.test python3[37407]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:11 np0005486759.ooo.test python3[37407]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Oct 14 08:10:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:11 np0005486759.ooo.test python3[37407]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Oct 14 08:10:26 np0005486759.ooo.test podman[37421]: 2025-10-14 08:10:11.285518422 +0000 UTC m=+0.045515395 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 08:10:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:26 np0005486759.ooo.test python3[37407]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 89ed729ad5d881399a0bbd370b8f3c39b84e5a87c6e02b0d1f2c943d2d9cfb7a --format json
Oct 14 08:10:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:26 np0005486759.ooo.test sudo[37405]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:26 np0005486759.ooo.test sudo[37706]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xovoqjvtnhbvhtttnuveqvopqofygkzx ; /usr/bin/python3
Oct 14 08:10:26 np0005486759.ooo.test sudo[37706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:26 np0005486759.ooo.test python3[37708]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:26 np0005486759.ooo.test python3[37708]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Oct 14 08:10:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:26 np0005486759.ooo.test python3[37708]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Oct 14 08:10:37 np0005486759.ooo.test podman[37720]: 2025-10-14 08:10:26.610130169 +0000 UTC m=+0.043672217 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Oct 14 08:10:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:37 np0005486759.ooo.test python3[37708]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a5e44a6280ab7a1da1b469cc214b40ecdad1d13f0c37c24f32cb45b40cce41d6 --format json
Oct 14 08:10:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:37 np0005486759.ooo.test sudo[37706]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:37 np0005486759.ooo.test sudo[37919]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxflluzmvptqjfcpauuzevlknxzwzadg ; /usr/bin/python3
Oct 14 08:10:37 np0005486759.ooo.test sudo[37919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:37 np0005486759.ooo.test python3[37921]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:37 np0005486759.ooo.test python3[37921]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Oct 14 08:10:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:37 np0005486759.ooo.test python3[37921]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Oct 14 08:10:42 np0005486759.ooo.test podman[37934]: 2025-10-14 08:10:37.608275008 +0000 UTC m=+0.048260918 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Oct 14 08:10:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:42 np0005486759.ooo.test python3[37921]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ef4308e71ba3950618e5de99f6c775558514a06fb9f6d93ca5c54d685a1349a6 --format json
Oct 14 08:10:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:42 np0005486759.ooo.test sudo[37919]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:42 np0005486759.ooo.test sudo[38055]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxzpvusremzcvbrpfxmxvqnrsttnjjwq ; /usr/bin/python3
Oct 14 08:10:42 np0005486759.ooo.test sudo[38055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:42 np0005486759.ooo.test python3[38057]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:42 np0005486759.ooo.test python3[38057]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Oct 14 08:10:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:42 np0005486759.ooo.test python3[38057]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Oct 14 08:10:45 np0005486759.ooo.test podman[38069]: 2025-10-14 08:10:42.839168392 +0000 UTC m=+0.047771704 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Oct 14 08:10:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:45 np0005486759.ooo.test python3[38057]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 5b5e3dbf480a168d795a47e53d0695cd833f381ef10119a3de87e5946f6b53e5 --format json
Oct 14 08:10:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:45 np0005486759.ooo.test sudo[38055]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:45 np0005486759.ooo.test sudo[38191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgklgctulxciznewoaktfcgghlunjezo ; /usr/bin/python3
Oct 14 08:10:45 np0005486759.ooo.test sudo[38191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:46 np0005486759.ooo.test python3[38193]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:46 np0005486759.ooo.test python3[38193]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Oct 14 08:10:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:46 np0005486759.ooo.test python3[38193]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Oct 14 08:10:48 np0005486759.ooo.test podman[38207]: 2025-10-14 08:10:46.163725144 +0000 UTC m=+0.027124217 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Oct 14 08:10:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:48 np0005486759.ooo.test python3[38193]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 250768c493b95c1151e047902a648e6659ba35adb4c6e0af85c231937d0cc9b7 --format json
Oct 14 08:10:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:48 np0005486759.ooo.test sudo[38191]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:48 np0005486759.ooo.test sudo[38328]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzsrqavjyziyrjvyepyfozqxvjvswgbv ; /usr/bin/python3
Oct 14 08:10:48 np0005486759.ooo.test sudo[38328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:48 np0005486759.ooo.test python3[38330]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:48 np0005486759.ooo.test python3[38330]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Oct 14 08:10:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:48 np0005486759.ooo.test python3[38330]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Oct 14 08:10:51 np0005486759.ooo.test podman[38343]: 2025-10-14 08:10:48.971926491 +0000 UTC m=+0.033457522 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Oct 14 08:10:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:51 np0005486759.ooo.test python3[38330]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 68d3d3a77bfc9fce94ca9ce2b28076450b851f6f1e82e97fbe356ce4ab0f7849 --format json
Oct 14 08:10:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:51 np0005486759.ooo.test sudo[38328]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:52 np0005486759.ooo.test sudo[38464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejjuoirxlbjqemzrwkdkifalpgokjcib ; /usr/bin/python3
Oct 14 08:10:52 np0005486759.ooo.test sudo[38464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:52 np0005486759.ooo.test python3[38466]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:52 np0005486759.ooo.test python3[38466]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Oct 14 08:10:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:52 np0005486759.ooo.test python3[38466]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Oct 14 08:10:57 np0005486759.ooo.test podman[38478]: 2025-10-14 08:10:52.320104871 +0000 UTC m=+0.025701203 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Oct 14 08:10:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:57 np0005486759.ooo.test python3[38466]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 01fc8d861e2b923ef0bf1d5c40a269bd976b00e8a31e8c56d63f3504b82b1c76 --format json
Oct 14 08:10:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:57 np0005486759.ooo.test sudo[38464]: pam_unix(sudo:session): session closed for user root
Oct 14 08:10:57 np0005486759.ooo.test sudo[38610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuwbqxozsnehkoniqknlppffymszycyg ; /usr/bin/python3
Oct 14 08:10:57 np0005486759.ooo.test sudo[38610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:10:57 np0005486759.ooo.test python3[38612]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Oct 14 08:10:57 np0005486759.ooo.test python3[38612]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Oct 14 08:10:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:10:57 np0005486759.ooo.test python3[38612]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Oct 14 08:11:00 np0005486759.ooo.test podman[38624]: 2025-10-14 08:10:57.847627258 +0000 UTC m=+0.036690951 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Oct 14 08:11:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:00 np0005486759.ooo.test python3[38612]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7f7fcb1a516a6191c7a8cb132a460e04d50ca4381f114f08dcbfe84340e49ac0 --format json
Oct 14 08:11:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:00 np0005486759.ooo.test sudo[38610]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:01 np0005486759.ooo.test sudo[38742]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wudpdyczniychocuqiypsoxsmipbcgxj ; /usr/bin/python3
Oct 14 08:11:01 np0005486759.ooo.test sudo[38742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:01 np0005486759.ooo.test python3[38744]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:11:01 np0005486759.ooo.test sudo[38742]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:01 np0005486759.ooo.test sudo[38792]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqwwysybmkupdrpalrpqdmtcadwoiupu ; /usr/bin/python3
Oct 14 08:11:01 np0005486759.ooo.test sudo[38792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:01 np0005486759.ooo.test sudo[38792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:01 np0005486759.ooo.test sudo[38810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwgumzqvzzojhxyitntxdhxbqucgeisg ; /usr/bin/python3
Oct 14 08:11:01 np0005486759.ooo.test sudo[38810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:02 np0005486759.ooo.test sudo[38810]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:02 np0005486759.ooo.test sudo[38914]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddbpsidrkukyeknjiekqxikxcamyrcgr ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429462.272958-103086-34992227721674/async_wrapper.py 206933717617 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429462.272958-103086-34992227721674/AnsiballZ_command.py _
Oct 14 08:11:02 np0005486759.ooo.test sudo[38914]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:11:02 np0005486759.ooo.test ansible-async_wrapper.py[38916]: Invoked with 206933717617 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429462.272958-103086-34992227721674/AnsiballZ_command.py _
Oct 14 08:11:02 np0005486759.ooo.test ansible-async_wrapper.py[38919]: Starting module and watcher
Oct 14 08:11:02 np0005486759.ooo.test ansible-async_wrapper.py[38919]: Start watching 38920 (3600)
Oct 14 08:11:02 np0005486759.ooo.test ansible-async_wrapper.py[38920]: Start module (38920)
Oct 14 08:11:02 np0005486759.ooo.test ansible-async_wrapper.py[38916]: Return async_wrapper task started.
Oct 14 08:11:02 np0005486759.ooo.test sudo[38914]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:02 np0005486759.ooo.test sudo[38935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyglkajyzosifpkjsotlvuvdmmlcrckc ; /usr/bin/python3
Oct 14 08:11:02 np0005486759.ooo.test sudo[38935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:03 np0005486759.ooo.test python3[38940]: ansible-ansible.legacy.async_status Invoked with jid=206933717617.38916 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:11:03 np0005486759.ooo.test sudo[38935]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    (file & line not available)
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    (file & line not available)
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.12 seconds
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Notice: Applied catalog in 0.12 seconds
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Application:
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    Initial environment: production
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    Converged environment: production
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:          Run mode: user
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Changes:
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:             Total: 3
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Events:
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:           Success: 3
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:             Total: 3
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Resources:
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:           Changed: 3
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:       Out of sync: 3
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:             Total: 10
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Time:
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:          Schedule: 0.00
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:              File: 0.01
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:              Exec: 0.02
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:            Augeas: 0.06
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    Transaction evaluation: 0.10
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    Catalog application: 0.12
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:    Config retrieval: 0.15
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:          Last run: 1760429466
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:        Filebucket: 0.00
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:             Total: 0.12
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]: Version:
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:            Config: 1760429466
Oct 14 08:11:06 np0005486759.ooo.test puppet-user[38939]:            Puppet: 7.10.0
Oct 14 08:11:06 np0005486759.ooo.test ansible-async_wrapper.py[38920]: Module complete (38920)
Oct 14 08:11:07 np0005486759.ooo.test ansible-async_wrapper.py[38919]: Done in kid B.
Oct 14 08:11:13 np0005486759.ooo.test sudo[39065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xacfionaatcrupisegkuqdezjmmaieqt ; /usr/bin/python3
Oct 14 08:11:13 np0005486759.ooo.test sudo[39065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:13 np0005486759.ooo.test python3[39067]: ansible-ansible.legacy.async_status Invoked with jid=206933717617.38916 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:11:13 np0005486759.ooo.test sudo[39065]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:13 np0005486759.ooo.test sudo[39081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giltvbomjkernxccdtwudhwctlhjqqyw ; /usr/bin/python3
Oct 14 08:11:13 np0005486759.ooo.test sudo[39081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:13 np0005486759.ooo.test python3[39083]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:11:13 np0005486759.ooo.test sudo[39081]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:14 np0005486759.ooo.test sudo[39097]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocrqfamyawackfxknocnrxykxcdzkmyw ; /usr/bin/python3
Oct 14 08:11:14 np0005486759.ooo.test sudo[39097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:14 np0005486759.ooo.test python3[39099]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:11:14 np0005486759.ooo.test sudo[39097]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:14 np0005486759.ooo.test sudo[39145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idqbsphfrrxwtultbldfaebpzajshoql ; /usr/bin/python3
Oct 14 08:11:14 np0005486759.ooo.test sudo[39145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:14 np0005486759.ooo.test python3[39147]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:11:14 np0005486759.ooo.test sudo[39145]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:14 np0005486759.ooo.test sudo[39188]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnbaummggrxbchmnfqfxkzhjcjuqbqnh ; /usr/bin/python3
Oct 14 08:11:14 np0005486759.ooo.test sudo[39188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:14 np0005486759.ooo.test python3[39190]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429474.3629599-103393-66794590016021/source _original_basename=tmpjpx36yww follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:11:15 np0005486759.ooo.test sudo[39188]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:15 np0005486759.ooo.test sudo[39218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztphsywfummhvvfyuayvsikngwvuleve ; /usr/bin/python3
Oct 14 08:11:15 np0005486759.ooo.test sudo[39218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:15 np0005486759.ooo.test python3[39220]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:15 np0005486759.ooo.test sudo[39218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:15 np0005486759.ooo.test sudo[39234]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzwozfslggsdyyxznyaejhfakpnknlsk ; /usr/bin/python3
Oct 14 08:11:15 np0005486759.ooo.test sudo[39234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:16 np0005486759.ooo.test sudo[39234]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:16 np0005486759.ooo.test sudo[39321]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teqbessjiubcaozfyosrzalbvlfaeaut ; /usr/bin/python3
Oct 14 08:11:16 np0005486759.ooo.test sudo[39321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:16 np0005486759.ooo.test python3[39323]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 08:11:16 np0005486759.ooo.test sudo[39321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:16 np0005486759.ooo.test systemd-journald[35787]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 78.1 (260 of 333 items), suggesting rotation.
Oct 14 08:11:16 np0005486759.ooo.test systemd-journald[35787]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 08:11:16 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 08:11:16 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 08:11:17 np0005486759.ooo.test sudo[39341]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bovbgepljrmbhnyxldvgjjbsyvzxpwmt ; /usr/bin/python3
Oct 14 08:11:17 np0005486759.ooo.test sudo[39341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:17 np0005486759.ooo.test python3[39343]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 08:11:17 np0005486759.ooo.test sudo[39341]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:17 np0005486759.ooo.test sudo[39357]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laucwhvxdnbtautmakclqptwzsoicdkb ; /usr/bin/python3
Oct 14 08:11:17 np0005486759.ooo.test sudo[39357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:17 np0005486759.ooo.test python3[39359]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005486759 step=1 update_config_hash_only=False
Oct 14 08:11:17 np0005486759.ooo.test sudo[39357]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:17 np0005486759.ooo.test sudo[39373]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ostyxrnugfihxkmyrlpdozajawuykqpt ; /usr/bin/python3
Oct 14 08:11:17 np0005486759.ooo.test sudo[39373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:18 np0005486759.ooo.test python3[39375]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:18 np0005486759.ooo.test sudo[39373]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:18 np0005486759.ooo.test sudo[39389]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbspwypmhoreinyuoefsetijduauoumw ; /usr/bin/python3
Oct 14 08:11:18 np0005486759.ooo.test sudo[39389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:18 np0005486759.ooo.test python3[39391]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Oct 14 08:11:18 np0005486759.ooo.test sudo[39389]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:19 np0005486759.ooo.test sudo[39405]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gopgdnkdibroaxcvjvsrdbhdgdjvvxdb ; /usr/bin/python3
Oct 14 08:11:19 np0005486759.ooo.test sudo[39405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:19 np0005486759.ooo.test python3[39407]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 08:11:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:19 np0005486759.ooo.test sudo[39405]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:19 np0005486759.ooo.test sudo[39434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xghpodddrhyeghwmpvytkltmkarjxhaz ; /usr/bin/python3
Oct 14 08:11:19 np0005486759.ooo.test sudo[39434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:20 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:20 np0005486759.ooo.test podman[39625]: 2025-10-14 08:11:20.459391386 +0000 UTC m=+0.041934441 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Oct 14 08:11:20 np0005486759.ooo.test podman[39602]: 2025-10-14 08:11:20.472341024 +0000 UTC m=+0.076162724 container create 79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, container_name=container-puppet-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-07-21T13:27:15, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:11:20 np0005486759.ooo.test podman[39625]: 2025-10-14 08:11:20.500078383 +0000 UTC m=+0.082621418 container create 3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, container_name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, build-date=2025-07-21T13:07:59, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, 
distribution-scope=public, version=17.1.9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libpod-conmon-79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231.scope.
Oct 14 08:11:20 np0005486759.ooo.test podman[39647]: 2025-10-14 08:11:20.520138716 +0000 UTC m=+0.075349526 container create 474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, version=17.1.9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, distribution-scope=public, com.redhat.component=openstack-collectd-container, release=2, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.33.12, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team)
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libpod-conmon-3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4.scope.
Oct 14 08:11:20 np0005486759.ooo.test podman[39638]: 2025-10-14 08:11:20.533684715 +0000 UTC m=+0.090178549 container create a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, build-date=2025-07-21T14:56:59, io.openshift.expose-services=, version=17.1.9, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, container_name=container-puppet-nova_libvirt, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=2, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with 
Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git)
Oct 14 08:11:20 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6039781ecbdbb8b912b1fabb353ef7c9e921a35bfb1c76a3538df9595de5862/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:20 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6039781ecbdbb8b912b1fabb353ef7c9e921a35bfb1c76a3538df9595de5862/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:20 np0005486759.ooo.test podman[39602]: 2025-10-14 08:11:20.438248616 +0000 UTC m=+0.042070326 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:20 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40d714933e5b0d972646fbd568da7cc308b3f168911dd1b81f3db3e7d11dcd1a/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libpod-conmon-474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023.scope.
Oct 14 08:11:20 np0005486759.ooo.test podman[39650]: 2025-10-14 08:11:20.554256545 +0000 UTC m=+0.096800397 container create 0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, release=1, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, architecture=x86_64, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, container_name=container-puppet-crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libpod-conmon-a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c.scope.
Oct 14 08:11:20 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c760467e4bcded5cc756a9c9f562bd63684ba8491ef5a99de42ba004cfc34cbd/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:20 np0005486759.ooo.test podman[39647]: 2025-10-14 08:11:20.571352136 +0000 UTC m=+0.126562936 container init 474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=container-puppet-collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_puppet_step1, batch=17.1_20250721.1, version=17.1.9, release=2, com.redhat.component=openstack-collectd-container)
Oct 14 08:11:20 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cfd4f5857600cb26cb7d3ab11c417725f8adfe1c222d34600a4b303096399cf3/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libpod-conmon-0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1.scope.
Oct 14 08:11:20 np0005486759.ooo.test podman[39638]: 2025-10-14 08:11:20.57956596 +0000 UTC m=+0.136059784 container init a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, io.buildah.version=1.33.12, version=17.1.9, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, release=2, build-date=2025-07-21T14:56:59, config_id=tripleo_puppet_step1, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:11:20 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:20 np0005486759.ooo.test podman[39638]: 2025-10-14 08:11:20.585876198 +0000 UTC m=+0.142370032 container start a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, release=2, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, container_name=container-puppet-nova_libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T14:56:59, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1)
Oct 14 08:11:20 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b272b93e3e9b77af062c082df69449b0ad42f33081484c8b336852990e7bca40/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:20 np0005486759.ooo.test podman[39638]: 2025-10-14 08:11:20.58707859 +0000 UTC m=+0.143572444 container attach a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, batch=17.1_20250721.1, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=container-puppet-nova_libvirt, architecture=x86_64, release=2, io.openshift.expose-services=, build-date=2025-07-21T14:56:59)
Oct 14 08:11:20 np0005486759.ooo.test podman[39647]: 2025-10-14 08:11:20.489643392 +0000 UTC m=+0.044854212 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Oct 14 08:11:20 np0005486759.ooo.test podman[39650]: 2025-10-14 08:11:20.592026781 +0000 UTC m=+0.134570623 container init 0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, container_name=container-puppet-crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Oct 14 08:11:20 np0005486759.ooo.test podman[39638]: 2025-10-14 08:11:20.492601104 +0000 UTC m=+0.049094958 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:11:20 np0005486759.ooo.test podman[39650]: 2025-10-14 08:11:20.59692398 +0000 UTC m=+0.139467832 container start 0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, container_name=container-puppet-crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, tcib_managed=true, vendor=Red Hat, Inc., release=1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Oct 14 08:11:20 np0005486759.ooo.test podman[39650]: 2025-10-14 08:11:20.597073475 +0000 UTC m=+0.139617327 container attach 0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, distribution-scope=public, managed_by=tripleo_ansible, release=1, vcs-type=git, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_puppet_step1, container_name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9)
Oct 14 08:11:20 np0005486759.ooo.test podman[39650]: 2025-10-14 08:11:20.496711446 +0000 UTC m=+0.039255318 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Oct 14 08:11:21 np0005486759.ooo.test podman[39602]: 2025-10-14 08:11:21.579720274 +0000 UTC m=+1.183542014 container init 79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, release=1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid)
Oct 14 08:11:21 np0005486759.ooo.test podman[39647]: 2025-10-14 08:11:21.590808977 +0000 UTC m=+1.146019817 container start 474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=2, container_name=container-puppet-collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:11:21 np0005486759.ooo.test podman[39647]: 2025-10-14 08:11:21.591246372 +0000 UTC m=+1.146457212 container attach 474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, batch=17.1_20250721.1, version=17.1.9, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=2, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=container-puppet-collectd)
Oct 14 08:11:21 np0005486759.ooo.test podman[39625]: 2025-10-14 08:11:21.627153524 +0000 UTC m=+1.209696589 container init 3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=container-puppet-metrics_qdr, distribution-scope=public, version=17.1.9, release=1, build-date=2025-07-21T13:07:59, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:11:21 np0005486759.ooo.test systemd[1]: tmp-crun.Fcgt05.mount: Deactivated successfully.
Oct 14 08:11:21 np0005486759.ooo.test podman[39602]: 2025-10-14 08:11:21.655941798 +0000 UTC m=+1.259763528 container start 79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., container_name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Oct 14 08:11:21 np0005486759.ooo.test podman[39602]: 2025-10-14 08:11:21.656488497 +0000 UTC m=+1.260310287 container attach 79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, config_id=tripleo_puppet_step1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.component=openstack-iscsid-container)
Oct 14 08:11:21 np0005486759.ooo.test podman[39625]: 2025-10-14 08:11:21.703024466 +0000 UTC m=+1.285567501 container start 3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, container_name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.9, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:11:21 np0005486759.ooo.test podman[39625]: 2025-10-14 08:11:21.703367458 +0000 UTC m=+1.285910563 container attach 3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_puppet_step1, build-date=2025-07-21T13:07:59, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, container_name=container-puppet-metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct 14 08:11:23 np0005486759.ooo.test podman[39520]: 2025-10-14 08:11:20.361788572 +0000 UTC m=+0.045465323 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Oct 14 08:11:23 np0005486759.ooo.test ovs-vsctl[39947]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: Accepting previously invalid value for target type 'Integer'
Oct 14 08:11:23 np0005486759.ooo.test podman[40072]: 2025-10-14 08:11:23.521101294 +0000 UTC m=+0.067284417 container create bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:49:23, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, name=rhosp17/openstack-ceilometer-central, tcib_managed=true, vcs-type=git, release=1, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=container-puppet-ceilometer, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.08 seconds
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.12 seconds
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39762]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39762]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39762]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39762]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test systemd[1]: Started libpod-conmon-bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d.scope.
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.10 seconds
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Oct 14 08:11:23 np0005486759.ooo.test crontab[40211]: (root) LIST (root)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Oct 14 08:11:23 np0005486759.ooo.test podman[40072]: 2025-10-14 08:11:23.493323573 +0000 UTC m=+0.039506696 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39762]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39762]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}a345ecb570ef0a8d48a21f3538ffa5f69b66eb00c6b758a7dc84efe33c309022'
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Oct 14 08:11:23 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Notice: Applied catalog in 0.02 seconds
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Application:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    Initial environment: production
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    Converged environment: production
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:          Run mode: user
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Changes:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:             Total: 7
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Events:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:           Success: 7
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:             Total: 7
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Resources:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:           Skipped: 13
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:           Changed: 5
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:       Out of sync: 5
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:             Total: 20
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Time:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:              File: 0.01
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    Transaction evaluation: 0.02
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    Catalog application: 0.02
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:    Config retrieval: 0.15
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:          Last run: 1760429483
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:             Total: 0.02
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]: Version:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:            Config: 1760429483
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39776]:            Puppet: 7.10.0
Oct 14 08:11:23 np0005486759.ooo.test crontab[40212]: (root) REPLACE (root)
Oct 14 08:11:23 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd45300f46d8608b2dbe48ea53730ffee2e8be48f3fb0ebbc0a31a116495a92c/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Notice: Applied catalog in 0.04 seconds
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Application:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    Initial environment: production
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    Converged environment: production
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:          Run mode: user
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Changes:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:             Total: 2
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Events:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:           Success: 2
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:             Total: 2
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Resources:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:           Changed: 2
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:       Out of sync: 2
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:           Skipped: 7
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:             Total: 9
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Time:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:              File: 0.01
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:              Cron: 0.01
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    Transaction evaluation: 0.04
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    Catalog application: 0.04
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:    Config retrieval: 0.11
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:          Last run: 1760429483
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:             Total: 0.04
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]: Version:
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:            Config: 1760429483
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39760]:            Puppet: 7.10.0
Oct 14 08:11:23 np0005486759.ooo.test podman[40072]: 2025-10-14 08:11:23.617416363 +0000 UTC m=+0.163599486 container init bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, name=rhosp17/openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, build-date=2025-07-21T14:49:23, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-central-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test podman[40072]: 2025-10-14 08:11:23.630835907 +0000 UTC m=+0.177019030 container start bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, container_name=container-puppet-ceilometer, tcib_managed=true, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, build-date=2025-07-21T14:49:23, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-ceilometer-central-container, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:11:23 np0005486759.ooo.test podman[40072]: 2025-10-14 08:11:23.631076635 +0000 UTC m=+0.177259768 container attach bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, name=rhosp17/openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.9, config_id=tripleo_puppet_step1, container_name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-07-21T14:49:23)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]:    (file & line not available)
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39766]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39762]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.34 seconds
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]: in a future release. Use nova::cinder::os_region_name instead
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Oct 14 08:11:23 np0005486759.ooo.test puppet-user[39764]: in a future release. Use nova::cinder::catalog_info instead
Oct 14 08:11:23 np0005486759.ooo.test systemd[1]: libpod-3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4.scope: Deactivated successfully.
Oct 14 08:11:23 np0005486759.ooo.test systemd[1]: libpod-3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4.scope: Consumed 2.117s CPU time.
Oct 14 08:11:23 np0005486759.ooo.test systemd[1]: libpod-0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1.scope: Deactivated successfully.
Oct 14 08:11:23 np0005486759.ooo.test systemd[1]: libpod-0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1.scope: Consumed 2.129s CPU time.
Oct 14 08:11:23 np0005486759.ooo.test podman[39650]: 2025-10-14 08:11:23.945832496 +0000 UTC m=+3.488376378 container died 0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, batch=17.1_20250721.1, container_name=container-puppet-crond, name=rhosp17/openstack-cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, architecture=x86_64, config_id=tripleo_puppet_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., release=1)
Oct 14 08:11:23 np0005486759.ooo.test podman[40297]: 2025-10-14 08:11:23.981718056 +0000 UTC m=+0.048697024 container died 3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, release=1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:11:24 np0005486759.ooo.test podman[40311]: 2025-10-14 08:11:24.022982873 +0000 UTC m=+0.068933064 container cleanup 0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, container_name=container-puppet-crond, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Oct 14 08:11:24 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-conmon-0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1.scope: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Oct 14 08:11:24 np0005486759.ooo.test podman[40297]: 2025-10-14 08:11:24.057368161 +0000 UTC m=+0.124347119 container cleanup 3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, version=17.1.9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, container_name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible)
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-conmon-3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4.scope: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Notice: Applied catalog in 0.44 seconds
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Application:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:    Initial environment: production
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:    Converged environment: production
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:          Run mode: user
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Changes:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:             Total: 4
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Events:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:           Success: 4
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:             Total: 4
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Resources:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:           Changed: 4
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:       Out of sync: 4
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:           Skipped: 8
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:             Total: 13
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Time:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:              File: 0.00
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:              Exec: 0.04
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:    Config retrieval: 0.14
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:            Augeas: 0.39
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:    Transaction evaluation: 0.44
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:    Catalog application: 0.44
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:          Last run: 1760429484
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:             Total: 0.44
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]: Version:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:            Config: 1760429483
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39766]:            Puppet: 7.10.0
Oct 14 08:11:24 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::metrics::qdr
                                                       --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39764]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}0d4e701b7b2398bbf396579a0713d46d3c496c79edc52f2e260456f359c9a46c'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39764]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39764]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39764]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Notice: Applied catalog in 0.18 seconds
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Application:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:    Initial environment: production
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:    Converged environment: production
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:          Run mode: user
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Changes:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:             Total: 42
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Events:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:           Success: 42
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:             Total: 42
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Resources:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:           Skipped: 13
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:           Changed: 37
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:       Out of sync: 37
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:             Total: 78
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Time:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:              File: 0.08
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:    Transaction evaluation: 0.18
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:    Catalog application: 0.18
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:    Config retrieval: 0.46
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:          Last run: 1760429484
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:       Concat file: 0.00
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:    Concat fragment: 0.00
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:             Total: 0.18
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]: Version:
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:            Config: 1760429483
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39762]:            Puppet: 7.10.0
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39764]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39764]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Oct 14 08:11:24 np0005486759.ooo.test puppet-user[39764]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231.scope: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231.scope: Consumed 2.601s CPU time.
Oct 14 08:11:24 np0005486759.ooo.test podman[39602]: 2025-10-14 08:11:24.368346692 +0000 UTC m=+3.972168392 container died 79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=container-puppet-iscsid, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, config_id=tripleo_puppet_step1)
Oct 14 08:11:24 np0005486759.ooo.test podman[40477]: 2025-10-14 08:11:24.452425628 +0000 UTC m=+0.077958716 container cleanup 79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, release=1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_puppet_step1, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-conmon-79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231.scope: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::iscsid
                                                       --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Oct 14 08:11:24 np0005486759.ooo.test podman[40492]: 2025-10-14 08:11:24.478978856 +0000 UTC m=+0.081591882 container create 9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, distribution-scope=public, architecture=x86_64, container_name=container-puppet-ovn_controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.33.12, version=17.1.9)
Oct 14 08:11:24 np0005486759.ooo.test podman[40475]: 2025-10-14 08:11:24.49500745 +0000 UTC m=+0.118651483 container create c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-rsyslog, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, release=1, io.openshift.expose-services=, container_name=container-puppet-rsyslog, io.buildah.version=1.33.12, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, batch=17.1_20250721.1)
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: Started libpod-conmon-9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31.scope.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: Started libpod-conmon-c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63.scope.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:24 np0005486759.ooo.test podman[40475]: 2025-10-14 08:11:24.420795524 +0000 UTC m=+0.044439557 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Oct 14 08:11:24 np0005486759.ooo.test podman[40492]: 2025-10-14 08:11:24.42179625 +0000 UTC m=+0.024409286 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Oct 14 08:11:24 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7b5852a5167f2f2df0b0a26053100da875d5bbadcb30af9335e17d51a040dc4/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:24 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7b5852a5167f2f2df0b0a26053100da875d5bbadcb30af9335e17d51a040dc4/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: tmp-crun.1PPtti.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b272b93e3e9b77af062c082df69449b0ad42f33081484c8b336852990e7bca40-merged.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-40d714933e5b0d972646fbd568da7cc308b3f168911dd1b81f3db3e7d11dcd1a-merged.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a6039781ecbdbb8b912b1fabb353ef7c9e921a35bfb1c76a3538df9595de5862-merged.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79109e37acaa1522c2c11e9ffadd9ad25436a62ffddb83f1307946ec69595231-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:24 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46bcb444d4e48307c55b4ae7ac02d3cb4b8dbda16b75b41e26f9c580726eafbe/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023.scope: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023.scope: Consumed 2.682s CPU time.
Oct 14 08:11:24 np0005486759.ooo.test podman[40475]: 2025-10-14 08:11:24.563043452 +0000 UTC m=+0.186687475 container init c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vcs-type=git, build-date=2025-07-21T12:58:40, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack 
osp-17.1, release=1, batch=17.1_20250721.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, container_name=container-puppet-rsyslog)
Oct 14 08:11:24 np0005486759.ooo.test podman[39647]: 2025-10-14 08:11:24.563397624 +0000 UTC m=+4.118608444 container died 474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, release=2, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_puppet_step1, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1)
Oct 14 08:11:24 np0005486759.ooo.test podman[40492]: 2025-10-14 08:11:24.587119164 +0000 UTC m=+0.189732200 container init 9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, tcib_managed=true, release=1, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_puppet_step1, container_name=container-puppet-ovn_controller, batch=17.1_20250721.1, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64)
Oct 14 08:11:24 np0005486759.ooo.test podman[40492]: 2025-10-14 08:11:24.601300145 +0000 UTC m=+0.203913171 container start 9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, name=rhosp17/openstack-ovn-controller, container_name=container-puppet-ovn_controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:11:24 np0005486759.ooo.test podman[40492]: 2025-10-14 08:11:24.601495301 +0000 UTC m=+0.204108787 container attach 9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, container_name=container-puppet-ovn_controller, architecture=x86_64, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:11:24 np0005486759.ooo.test podman[40475]: 2025-10-14 08:11:24.620767218 +0000 UTC m=+0.244411241 container start c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, tcib_managed=true, name=rhosp17/openstack-rsyslog, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-rsyslog-container, release=1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.33.12, config_id=tripleo_puppet_step1, container_name=container-puppet-rsyslog, vcs-type=git, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T12:58:40)
Oct 14 08:11:24 np0005486759.ooo.test podman[40475]: 2025-10-14 08:11:24.621753081 +0000 UTC m=+0.245397104 container attach c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, config_id=tripleo_puppet_step1, container_name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, name=rhosp17/openstack-rsyslog, version=17.1.9, maintainer=OpenStack TripleO Team, release=1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T12:58:40, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test podman[40565]: 2025-10-14 08:11:24.683207146 +0000 UTC m=+0.107469236 container cleanup 474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, version=17.1.9, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc., container_name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_puppet_step1, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:11:24 np0005486759.ooo.test systemd[1]: libpod-conmon-474f816364f2b7f626e697d3f34dd2b27b2e6288228ef62209a9abca6b487023.scope: Deactivated successfully.
Oct 14 08:11:24 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 1.49 seconds
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}86cba0335b97cb5e97fca98355d4798949c761ec5315edbff5ed59326105591f'
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Warning: Empty environment setting 'TLS_PASSWORD'
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}d2b78c6be902b697c0e6eb42616690977757e5a5a2aa47db9d7d4341b8ae4d58'
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c760467e4bcded5cc756a9c9f562bd63684ba8491ef5a99de42ba004cfc34cbd-merged.mount: Deactivated successfully.
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]:    (file & line not available)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]:    (file & line not available)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[40270]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Oct 14 08:11:25 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.38 seconds
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    (file & line not available)
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    (file & line not available)
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]:    (file & line not available)
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]:    (file & line not available)
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Notice: Applied catalog in 0.40 seconds
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Application:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:    Initial environment: production
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:    Converged environment: production
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:          Run mode: user
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Changes:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:             Total: 31
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Events:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:           Success: 31
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:             Total: 31
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Resources:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:           Skipped: 22
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:           Changed: 31
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:       Out of sync: 31
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:             Total: 151
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Time:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:           Package: 0.02
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:    Ceilometer config: 0.31
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:    Transaction evaluation: 0.39
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:    Catalog application: 0.40
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:    Config retrieval: 0.44
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:          Last run: 1760429486
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:         Resources: 0.00
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:             Total: 0.40
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]: Version:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:            Config: 1760429485
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40270]:            Puppet: 7.10.0
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.22 seconds
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.22 seconds
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}a5baf5797d6d35f75ce7de76dca108f09c0307c213c861f0e0c4a7ac24d40c8d'
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40930]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Notice: Applied catalog in 0.11 seconds
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Application:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    Initial environment: production
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    Converged environment: production
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:          Run mode: user
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Changes:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:             Total: 3
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Events:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:           Success: 3
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:             Total: 3
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Resources:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:           Skipped: 11
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:           Changed: 3
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:       Out of sync: 3
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:             Total: 25
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Time:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:       Concat file: 0.00
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    Concat fragment: 0.00
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:              File: 0.01
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    Transaction evaluation: 0.11
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    Catalog application: 0.11
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:    Config retrieval: 0.27
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:          Last run: 1760429486
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:             Total: 0.11
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]: Version:
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:            Config: 1760429486
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40596]:            Puppet: 7.10.0
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40933]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40935]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.107
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40944]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005486759.ooo.test
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005486759.novalocal' to 'np0005486759.ooo.test'
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40954]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40965]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40972]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test ovs-vsctl[40986]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Oct 14 08:11:26 np0005486759.ooo.test systemd[1]: libpod-bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d.scope: Deactivated successfully.
Oct 14 08:11:26 np0005486759.ooo.test systemd[1]: libpod-bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d.scope: Consumed 3.028s CPU time.
Oct 14 08:11:26 np0005486759.ooo.test podman[40072]: 2025-10-14 08:11:26.981307896 +0000 UTC m=+3.527491079 container died bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1, build-date=2025-07-21T14:49:23, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, com.redhat.component=openstack-ceilometer-central-container, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, container_name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-central)
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Oct 14 08:11:26 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test ovs-vsctl[41006]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: tmp-crun.hrMGtw.mount: Deactivated successfully.
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fd45300f46d8608b2dbe48ea53730ffee2e8be48f3fb0ebbc0a31a116495a92c-merged.mount: Deactivated successfully.
Oct 14 08:11:27 np0005486759.ooo.test ovs-vsctl[41017]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: libpod-c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63.scope: Deactivated successfully.
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: libpod-c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63.scope: Consumed 2.366s CPU time.
Oct 14 08:11:27 np0005486759.ooo.test ovs-vsctl[41025]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test podman[40475]: 2025-10-14 08:11:27.103247992 +0000 UTC m=+2.726892015 container died c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, container_name=container-puppet-rsyslog, build-date=2025-07-21T12:58:40, distribution-scope=public, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-rsyslog-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Oct 14 08:11:27 np0005486759.ooo.test ovs-vsctl[41028]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test ovs-vsctl[41036]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test podman[40995]: 2025-10-14 08:11:27.15642366 +0000 UTC m=+0.166060992 container cleanup bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, container_name=container-puppet-ceilometer, description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-central-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:49:23, version=17.1.9, config_id=tripleo_puppet_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-central/images/17.1.9-1, batch=17.1_20250721.1, release=1, vcs-ref=1ce3db7211bdafb9cc5e59a103488bd6a8dc3f2f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, vcs-type=git, tcib_managed=true)
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: libpod-conmon-bc64c42223fb684d3526f7c6e1cbda0537a0281706edcf3c3683a85f6d69f06d.scope: Deactivated successfully.
Oct 14 08:11:27 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::ceilometer::agent::polling
                                                      include tripleo::profile::base::ceilometer::agent::polling
                                                       --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Notice: Applied catalog in 0.48 seconds
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Application:
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:    Initial environment: production
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:    Converged environment: production
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:          Run mode: user
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Changes:
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:             Total: 13
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Events:
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:           Success: 13
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:             Total: 13
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Resources:
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:           Skipped: 12
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:           Changed: 13
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:       Out of sync: 13
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:             Total: 28
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Time:
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:              Exec: 0.01
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:    Config retrieval: 0.26
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:         Vs config: 0.42
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:    Transaction evaluation: 0.47
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:    Catalog application: 0.48
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:          Last run: 1760429487
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:             Total: 0.48
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]: Version:
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:            Config: 1760429486
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[40606]:            Puppet: 7.10.0
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test podman[41029]: 2025-10-14 08:11:27.225121805 +0000 UTC m=+0.116325082 container cleanup c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.buildah.version=1.33.12, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_puppet_step1, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, container_name=container-puppet-rsyslog, managed_by=tripleo_ansible, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, vcs-type=git, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T12:58:40, vendor=Red Hat, Inc.)
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: libpod-conmon-c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63.scope: Deactivated successfully.
Oct 14 08:11:27 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: libpod-9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31.scope: Deactivated successfully.
Oct 14 08:11:27 np0005486759.ooo.test systemd[1]: libpod-9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31.scope: Consumed 2.865s CPU time.
Oct 14 08:11:27 np0005486759.ooo.test podman[40492]: 2025-10-14 08:11:27.599754055 +0000 UTC m=+3.202367091 container died 9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_puppet_step1, container_name=container-puppet-ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Oct 14 08:11:27 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Oct 14 08:11:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46bcb444d4e48307c55b4ae7ac02d3cb4b8dbda16b75b41e26f9c580726eafbe-merged.mount: Deactivated successfully.
Oct 14 08:11:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c73051cd2281c883ff8ed6aab5267d5de6e5c955d70d807a732cb6e956473c63-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f7b5852a5167f2f2df0b0a26053100da875d5bbadcb30af9335e17d51a040dc4-merged.mount: Deactivated successfully.
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Oct 14 08:11:28 np0005486759.ooo.test podman[41134]: 2025-10-14 08:11:28.529166694 +0000 UTC m=+0.918776532 container cleanup 9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, tcib_managed=true)
Oct 14 08:11:28 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::neutron::agents::ovn
                                                       --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Oct 14 08:11:28 np0005486759.ooo.test systemd[1]: libpod-conmon-9ece1baeb3f9c3c392fe9c63185db1ebb45064897f8018d17ba3cdf0d9464c31.scope: Deactivated successfully.
Oct 14 08:11:28 np0005486759.ooo.test podman[40643]: 2025-10-14 08:11:24.912288785 +0000 UTC m=+0.033614773 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Oct 14 08:11:28 np0005486759.ooo.test podman[41214]: 2025-10-14 08:11:28.787773693 +0000 UTC m=+0.084362107 container create 4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, version=17.1.9, architecture=x86_64, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, build-date=2025-07-21T15:44:03, config_id=tripleo_puppet_step1, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, io.buildah.version=1.33.12, 
com.redhat.component=openstack-neutron-server-container, container_name=container-puppet-neutron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, release=1)
Oct 14 08:11:28 np0005486759.ooo.test systemd[1]: Started libpod-conmon-4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec.scope.
Oct 14 08:11:28 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:28 np0005486759.ooo.test podman[41214]: 2025-10-14 08:11:28.744315241 +0000 UTC m=+0.040903675 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Oct 14 08:11:28 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b917dec3e2eef93d51865cb5ed3798b1af37c60222c9bce06119f7a2c331ff68/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:28 np0005486759.ooo.test podman[41214]: 2025-10-14 08:11:28.856072773 +0000 UTC m=+0.152661177 container init 4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, 
build-date=2025-07-21T15:44:03, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, release=1, com.redhat.component=openstack-neutron-server-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, container_name=container-puppet-neutron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20250721.1, vendor=Red Hat, Inc.)
Oct 14 08:11:28 np0005486759.ooo.test podman[41214]: 2025-10-14 08:11:28.867546271 +0000 UTC m=+0.164134675 container start 4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, name=rhosp17/openstack-neutron-server, vcs-type=git, container_name=container-puppet-neutron, config_id=tripleo_puppet_step1, build-date=2025-07-21T15:44:03, version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, release=1, com.redhat.component=openstack-neutron-server-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1)
Oct 14 08:11:28 np0005486759.ooo.test podman[41214]: 2025-10-14 08:11:28.867789849 +0000 UTC m=+0.164378263 container attach 4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, release=1, com.redhat.component=openstack-neutron-server-container, container_name=container-puppet-neutron, managed_by=tripleo_ansible, build-date=2025-07-21T15:44:03, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-server, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-server)
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Oct 14 08:11:28 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3a12438802493a75725c4f7704f2af6db1ef72af396369e5de28f6f4d6a7ed98'
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Notice: Applied catalog in 4.31 seconds
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Application:
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Initial environment: production
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Converged environment: production
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:          Run mode: user
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Changes:
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:             Total: 172
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Events:
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:           Success: 172
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:             Total: 172
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Resources:
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:           Changed: 172
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:       Out of sync: 172
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:           Skipped: 52
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:             Total: 473
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Time:
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:       Concat file: 0.00
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Concat fragment: 0.00
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:            Anchor: 0.00
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:         File line: 0.00
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Virtlogd config: 0.00
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Virtstoraged config: 0.01
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Virtqemud config: 0.01
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Virtsecretd config: 0.02
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:              Exec: 0.02
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Virtnodedevd config: 0.02
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:              File: 0.02
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Virtproxyd config: 0.03
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:           Package: 0.03
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:            Augeas: 1.01
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Config retrieval: 1.77
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:          Last run: 1760429489
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:       Nova config: 2.93
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Transaction evaluation: 4.30
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:    Catalog application: 4.31
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:         Resources: 0.00
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:             Total: 4.31
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]: Version:
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:            Config: 1760429483
Oct 14 08:11:29 np0005486759.ooo.test puppet-user[39764]:            Puppet: 7.10.0
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Oct 14 08:11:30 np0005486759.ooo.test systemd[1]: libpod-a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c.scope: Deactivated successfully.
Oct 14 08:11:30 np0005486759.ooo.test systemd[1]: libpod-a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c.scope: Consumed 8.867s CPU time.
Oct 14 08:11:30 np0005486759.ooo.test podman[39638]: 2025-10-14 08:11:30.716877589 +0000 UTC m=+10.273371453 container died a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, managed_by=tripleo_ansible, build-date=2025-07-21T14:56:59, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_puppet_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with 
Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=2)
Oct 14 08:11:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-cfd4f5857600cb26cb7d3ab11c417725f8adfe1c222d34600a4b303096399cf3-merged.mount: Deactivated successfully.
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]:    (file & line not available)
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]:    (file & line not available)
Oct 14 08:11:30 np0005486759.ooo.test puppet-user[41245]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Oct 14 08:11:30 np0005486759.ooo.test podman[41320]: 2025-10-14 08:11:30.88210256 +0000 UTC m=+0.151180757 container cleanup a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, release=2, config_id=tripleo_puppet_step1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.buildah.version=1.33.12, batch=17.1_20250721.1, container_name=container-puppet-nova_libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59)
Oct 14 08:11:30 np0005486759.ooo.test systemd[1]: libpod-conmon-a82f35875c54c150ab1a9c6b0d4bf6847d13578285d372765a666d73b59a190c.scope: Deactivated successfully.
Oct 14 08:11:30 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                      # TODO(emilien): figure how to deal with libvirt profile.
                                                      # We'll probably treat it like we do with Neutron plugins.
                                                      # Until then, just include it in the default nova-compute role.
                                                      include tripleo::profile::base::nova::compute::libvirt
                                                      
                                                      include tripleo::profile::base::nova::libvirt
                                                      
                                                      include tripleo::profile::base::nova::compute::libvirt_guests
                                                      
                                                      include tripleo::profile::base::sshd
                                                      include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.64 seconds
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Notice: Applied catalog in 0.45 seconds
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Application:
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:    Initial environment: production
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:    Converged environment: production
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:          Run mode: user
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Changes:
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:             Total: 33
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Events:
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:           Success: 33
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:             Total: 33
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Resources:
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:           Skipped: 21
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:           Changed: 33
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:       Out of sync: 33
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:             Total: 155
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Time:
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:         Resources: 0.00
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:    Ovn metadata agent config: 0.02
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:    Neutron config: 0.38
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:    Transaction evaluation: 0.44
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:    Catalog application: 0.45
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:    Config retrieval: 0.71
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:          Last run: 1760429491
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:             Total: 0.45
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]: Version:
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:            Config: 1760429490
Oct 14 08:11:31 np0005486759.ooo.test puppet-user[41245]:            Puppet: 7.10.0
Oct 14 08:11:32 np0005486759.ooo.test systemd[1]: libpod-4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec.scope: Deactivated successfully.
Oct 14 08:11:32 np0005486759.ooo.test systemd[1]: libpod-4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec.scope: Consumed 3.575s CPU time.
Oct 14 08:11:32 np0005486759.ooo.test podman[41214]: 2025-10-14 08:11:32.4688395 +0000 UTC m=+3.765427934 container died 4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, distribution-scope=public, container_name=container-puppet-neutron, vendor=Red Hat, Inc., io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:44:03, summary=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, release=1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, com.redhat.component=openstack-neutron-server-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server)
Oct 14 08:11:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b917dec3e2eef93d51865cb5ed3798b1af37c60222c9bce06119f7a2c331ff68-merged.mount: Deactivated successfully.
Oct 14 08:11:32 np0005486759.ooo.test podman[41430]: 2025-10-14 08:11:32.572341998 +0000 UTC m=+0.096657811 container cleanup 4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, container_name=container-puppet-neutron, release=1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-server, vcs-ref=a2a5d3babd6b02c0b20df6d01cd606fef9bdf69d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-server/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T15:44:03, config_id=tripleo_puppet_step1, name=rhosp17/openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server)
Oct 14 08:11:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:32 np0005486759.ooo.test systemd[1]: libpod-conmon-4ce034b164282fce0e3b2887f8021ef9521f5af8c7b961bb3e96216ffdffb1ec.scope: Deactivated successfully.
Oct 14 08:11:32 np0005486759.ooo.test python3[39436]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005486759 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                      include tripleo::profile::base::neutron::ovn_metadata
                                                       --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005486759', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Oct 14 08:11:32 np0005486759.ooo.test sudo[39434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:33 np0005486759.ooo.test sudo[41481]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elfgxhzbcnobeegrdvfcznnndplguyug ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:33 np0005486759.ooo.test sudo[41481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:33 np0005486759.ooo.test python3[41483]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:33 np0005486759.ooo.test sudo[41481]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:33 np0005486759.ooo.test sudo[41497]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwzylbmwpyfyrdhbzigwmybspthexlvb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:33 np0005486759.ooo.test sudo[41497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:33 np0005486759.ooo.test sudo[41497]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:34 np0005486759.ooo.test sudo[41513]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ledrnhapiqaykwhbsccjvjcozlzwwjnr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:34 np0005486759.ooo.test sudo[41513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:34 np0005486759.ooo.test python3[41515]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:11:34 np0005486759.ooo.test sudo[41513]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:34 np0005486759.ooo.test sudo[41563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqxfysgnkhlkjrtharspwxyeawuwcbwv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:34 np0005486759.ooo.test sudo[41563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:34 np0005486759.ooo.test python3[41565]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:11:34 np0005486759.ooo.test sudo[41563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:34 np0005486759.ooo.test sudo[41606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxtdvanolyjxxtdiwmqsvsbaqnzmxqjt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:35 np0005486759.ooo.test sudo[41606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:35 np0005486759.ooo.test python3[41608]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429494.5700023-103671-68445144865233/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:35 np0005486759.ooo.test sudo[41606]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:35 np0005486759.ooo.test sudo[41668]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rokeajzifagbptcgkyknrgyvxhktxppm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:35 np0005486759.ooo.test sudo[41668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:35 np0005486759.ooo.test python3[41670]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:11:35 np0005486759.ooo.test sudo[41668]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:35 np0005486759.ooo.test sudo[41711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvzsnhypxuxzrhfkfctpofxndvrgxcpf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:35 np0005486759.ooo.test sudo[41711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:35 np0005486759.ooo.test python3[41713]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429495.299676-103671-126730431629161/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:35 np0005486759.ooo.test sudo[41711]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:36 np0005486759.ooo.test sudo[41773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdzzksostkhxvojuywlzdovpzrcvztig ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:36 np0005486759.ooo.test sudo[41773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:36 np0005486759.ooo.test python3[41775]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:11:36 np0005486759.ooo.test sudo[41773]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:36 np0005486759.ooo.test sudo[41816]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxeetoavkmtznpauaybnajukzbxwsqrx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:36 np0005486759.ooo.test sudo[41816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:36 np0005486759.ooo.test python3[41818]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429496.1465943-103930-176471444370414/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:36 np0005486759.ooo.test sudo[41816]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:37 np0005486759.ooo.test sudo[41878]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klksfqsaqpabfesurvvkaecqqqooxjbl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:37 np0005486759.ooo.test sudo[41878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:37 np0005486759.ooo.test sshd[41881]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 08:11:37 np0005486759.ooo.test python3[41880]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:11:37 np0005486759.ooo.test sudo[41878]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:37 np0005486759.ooo.test sudo[41923]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypgopauxizjkdhevkxmcwoyvlglhyouz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:37 np0005486759.ooo.test sudo[41923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:37 np0005486759.ooo.test python3[41925]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429496.9507468-103970-119904978468201/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:37 np0005486759.ooo.test sudo[41923]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:37 np0005486759.ooo.test sudo[41953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okbezioplhzlesuvfruobiutyhmcmefy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:37 np0005486759.ooo.test sudo[41953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:37 np0005486759.ooo.test python3[41955]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:11:37 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:11:38 np0005486759.ooo.test systemd-rc-local-generator[41982]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:11:38 np0005486759.ooo.test systemd-sysv-generator[41986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:11:38 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:11:38 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:11:38 np0005486759.ooo.test systemd-sysv-generator[42020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:11:38 np0005486759.ooo.test systemd-rc-local-generator[42016]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:11:38 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:11:38 np0005486759.ooo.test systemd[1]: Starting TripleO Container Shutdown...
Oct 14 08:11:38 np0005486759.ooo.test systemd[1]: Finished TripleO Container Shutdown.
Oct 14 08:11:38 np0005486759.ooo.test sudo[41953]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:38 np0005486759.ooo.test sudo[42077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzalcuwradfftbjacsttqpaooeuqhhex ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:38 np0005486759.ooo.test sudo[42077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:38 np0005486759.ooo.test python3[42079]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:11:38 np0005486759.ooo.test sudo[42077]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:39 np0005486759.ooo.test sudo[42120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvhbifnoiaowhsykmdxrqgahvnkhkika ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:39 np0005486759.ooo.test sudo[42120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:39 np0005486759.ooo.test python3[42122]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429498.5922282-103986-75456428586010/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:39 np0005486759.ooo.test sudo[42120]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:39 np0005486759.ooo.test sshd[41881]: Invalid user support from 78.128.112.74 port 54592
Oct 14 08:11:39 np0005486759.ooo.test sudo[42182]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsjdstwonnzvcgpshhvieboppkcjnrtk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:39 np0005486759.ooo.test sudo[42182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:39 np0005486759.ooo.test sshd[41881]: Connection closed by invalid user support 78.128.112.74 port 54592 [preauth]
Oct 14 08:11:39 np0005486759.ooo.test python3[42184]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:11:39 np0005486759.ooo.test sudo[42182]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:39 np0005486759.ooo.test sudo[42225]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcvdcypvkabtktritakwqdaazwffvkvi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:39 np0005486759.ooo.test sudo[42225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:39 np0005486759.ooo.test python3[42227]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429499.3796115-103996-170305345773859/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:39 np0005486759.ooo.test sudo[42225]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:40 np0005486759.ooo.test sudo[42255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjllwomyzzckixpkohewmqmmhakpjgiz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:40 np0005486759.ooo.test sudo[42255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:40 np0005486759.ooo.test python3[42257]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:11:40 np0005486759.ooo.test systemd-rc-local-generator[42280]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:11:40 np0005486759.ooo.test systemd-sysv-generator[42287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:11:40 np0005486759.ooo.test systemd-rc-local-generator[42319]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:11:40 np0005486759.ooo.test systemd-sysv-generator[42323]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 08:11:40 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 08:11:40 np0005486759.ooo.test sudo[42255]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:41 np0005486759.ooo.test sudo[42347]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yymwggodvblfcudrbgqldgbxqutxijed ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:41 np0005486759.ooo.test sudo[42347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 2887d8c13a95df5ab0d7c0a262884982
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: 324199e84b6ced954fd0cecf75a965ca
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 61017e001fc358991ba0100081a72ad5
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 09cd203329a42b234cd1e76ba6006819
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 9509e102f1abab83a0acc6d291975c60
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 9509e102f1abab83a0acc6d291975c60
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 3fc36489e0095da197228558d2f007a2
Oct 14 08:11:41 np0005486759.ooo.test python3[42349]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc
Oct 14 08:11:41 np0005486759.ooo.test sudo[42347]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:41 np0005486759.ooo.test sudo[42363]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbhwfvlzhtvxxajgxrkjtgacfflvyjbx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:41 np0005486759.ooo.test sudo[42363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:41 np0005486759.ooo.test sudo[42363]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:42 np0005486759.ooo.test sudo[42404]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duaguhvlcjyxcoshuchakicsoetmofmo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:42 np0005486759.ooo.test sudo[42404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:42 np0005486759.ooo.test python3[42406]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 08:11:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 14 08:11:42 np0005486759.ooo.test podman[42445]: 2025-10-14 08:11:42.98604282 +0000 UTC m=+0.069752893 container create 7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, container_name=metrics_qdr_init_logs, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true)
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: Started libpod-conmon-7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2.scope.
Oct 14 08:11:43 np0005486759.ooo.test podman[42445]: 2025-10-14 08:11:42.950869134 +0000 UTC m=+0.034579217 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39a1665d335f40f9d6ad1b2f0d2d8e770e46fe1b712ba5efdccec54f84e9ea3f/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:43 np0005486759.ooo.test podman[42445]: 2025-10-14 08:11:43.072626993 +0000 UTC m=+0.156336986 container init 7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, batch=17.1_20250721.1, container_name=metrics_qdr_init_logs, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc.)
Oct 14 08:11:43 np0005486759.ooo.test podman[42445]: 2025-10-14 08:11:43.085134606 +0000 UTC m=+0.168844639 container start 7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.buildah.version=1.33.12, container_name=metrics_qdr_init_logs, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1)
Oct 14 08:11:43 np0005486759.ooo.test podman[42445]: 2025-10-14 08:11:43.085929793 +0000 UTC m=+0.169639806 container attach 7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr_init_logs, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible)
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: libpod-7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2.scope: Deactivated successfully.
Oct 14 08:11:43 np0005486759.ooo.test podman[42445]: 2025-10-14 08:11:43.095497563 +0000 UTC m=+0.179207636 container died 7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, container_name=metrics_qdr_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., release=1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, managed_by=tripleo_ansible)
Oct 14 08:11:43 np0005486759.ooo.test podman[42464]: 2025-10-14 08:11:43.189363879 +0000 UTC m=+0.083057753 container cleanup 7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, build-date=2025-07-21T13:07:59, release=1, distribution-scope=public, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, io.openshift.expose-services=, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: libpod-conmon-7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2.scope: Deactivated successfully.
Oct 14 08:11:43 np0005486759.ooo.test python3[42406]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Oct 14 08:11:43 np0005486759.ooo.test podman[42542]: 2025-10-14 08:11:43.622660628 +0000 UTC m=+0.075037485 container create fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, vendor=Red Hat, Inc., tcib_managed=true)
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: Started libpod-conmon-fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.scope.
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:11:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19103ee73e807f98da7458e81c363a7b60ad569c451bd8f50c0daf39d2101319/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19103ee73e807f98da7458e81c363a7b60ad569c451bd8f50c0daf39d2101319/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Oct 14 08:11:43 np0005486759.ooo.test podman[42542]: 2025-10-14 08:11:43.591046864 +0000 UTC m=+0.043423811 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:11:43 np0005486759.ooo.test podman[42542]: 2025-10-14 08:11:43.70835544 +0000 UTC m=+0.160732317 container init fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:11:43 np0005486759.ooo.test sudo[42563]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:11:43 np0005486759.ooo.test sudo[42563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:11:43 np0005486759.ooo.test podman[42542]: 2025-10-14 08:11:43.745008987 +0000 UTC m=+0.197385854 container start fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=metrics_qdr, version=17.1.9, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, distribution-scope=public)
Oct 14 08:11:43 np0005486759.ooo.test python3[42406]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=2887d8c13a95df5ab0d7c0a262884982 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Oct 14 08:11:43 np0005486759.ooo.test sudo[42563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:43 np0005486759.ooo.test podman[42564]: 2025-10-14 08:11:43.841202222 +0000 UTC m=+0.086116318 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-39a1665d335f40f9d6ad1b2f0d2d8e770e46fe1b712ba5efdccec54f84e9ea3f-merged.mount: Deactivated successfully.
Oct 14 08:11:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d3afdcab54a40c8402ee9e3976f91f73bed3e4bda6c10f1e73f05902a89f8d2-userdata-shm.mount: Deactivated successfully.
Oct 14 08:11:43 np0005486759.ooo.test sudo[42404]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:44 np0005486759.ooo.test podman[42564]: 2025-10-14 08:11:44.08039226 +0000 UTC m=+0.325306316 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, version=17.1.9)
Oct 14 08:11:44 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:11:44 np0005486759.ooo.test sudo[42637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwwkbnvcznxudeoqglwowkahfnrhpnws ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:44 np0005486759.ooo.test sudo[42637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:44 np0005486759.ooo.test python3[42639]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:44 np0005486759.ooo.test sudo[42637]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:44 np0005486759.ooo.test sudo[42653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xucbjkrwjhzsdsckljkgomsawanqlzms ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:44 np0005486759.ooo.test sudo[42653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:44 np0005486759.ooo.test python3[42655]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:11:44 np0005486759.ooo.test sudo[42653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:44 np0005486759.ooo.test sudo[42714]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtjyapfkpknfpozgwmdcgkpgphiwzoof ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:44 np0005486759.ooo.test sudo[42714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:45 np0005486759.ooo.test python3[42716]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429504.6098862-104090-110226456041486/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:45 np0005486759.ooo.test sudo[42714]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:45 np0005486759.ooo.test sudo[42730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfjvhdaewfjfpqxpxcftluefgnlpjizp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:45 np0005486759.ooo.test sudo[42730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:45 np0005486759.ooo.test python3[42732]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 08:11:45 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:11:45 np0005486759.ooo.test systemd-sysv-generator[42760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:11:45 np0005486759.ooo.test systemd-rc-local-generator[42757]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:11:45 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:11:45 np0005486759.ooo.test sudo[42730]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:45 np0005486759.ooo.test sudo[42782]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruljlgvrsbkhiqiqqkdtzukwyvatrnfu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:11:45 np0005486759.ooo.test sudo[42782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:46 np0005486759.ooo.test python3[42784]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:11:46 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:11:46 np0005486759.ooo.test systemd-rc-local-generator[42813]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:11:46 np0005486759.ooo.test systemd-sysv-generator[42816]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:11:46 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:11:46 np0005486759.ooo.test systemd[1]: Starting metrics_qdr container...
Oct 14 08:11:46 np0005486759.ooo.test systemd[1]: Started metrics_qdr container.
Oct 14 08:11:46 np0005486759.ooo.test sudo[42782]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:46 np0005486759.ooo.test sudo[42862]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atflnjsrgkeuimtekurtfbztfkrztkoj ; /usr/bin/python3
Oct 14 08:11:46 np0005486759.ooo.test sudo[42862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:46 np0005486759.ooo.test python3[42864]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:46 np0005486759.ooo.test sudo[42862]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:47 np0005486759.ooo.test sudo[42910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjjeqkciribppoukhjptchbvgslswnpx ; /usr/bin/python3
Oct 14 08:11:47 np0005486759.ooo.test sudo[42910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:47 np0005486759.ooo.test sudo[42910]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:47 np0005486759.ooo.test sudo[42953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urjtatbvttmgegtnxukwfibyrmuanflc ; /usr/bin/python3
Oct 14 08:11:47 np0005486759.ooo.test sudo[42953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:47 np0005486759.ooo.test sudo[42953]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:48 np0005486759.ooo.test sudo[42983]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuxfvsckvshueykdysphelsypviuszdu ; /usr/bin/python3
Oct 14 08:11:48 np0005486759.ooo.test sudo[42983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:48 np0005486759.ooo.test python3[42985]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005486759 step=1 update_config_hash_only=False
Oct 14 08:11:48 np0005486759.ooo.test sudo[42983]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:48 np0005486759.ooo.test sudo[42999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqanjtvjsxhzreycuwfixglphucbkekn ; /usr/bin/python3
Oct 14 08:11:48 np0005486759.ooo.test sudo[42999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:48 np0005486759.ooo.test python3[43001]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:11:48 np0005486759.ooo.test sudo[42999]: pam_unix(sudo:session): session closed for user root
Oct 14 08:11:48 np0005486759.ooo.test sudo[43015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrzvfaotwdxdqspzbrgneryeizzifzxy ; /usr/bin/python3
Oct 14 08:11:48 np0005486759.ooo.test sudo[43015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:11:49 np0005486759.ooo.test python3[43017]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Oct 14 08:11:49 np0005486759.ooo.test sudo[43015]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:12:14 np0005486759.ooo.test systemd[1]: tmp-crun.SltFGu.mount: Deactivated successfully.
Oct 14 08:12:14 np0005486759.ooo.test podman[43018]: 2025-10-14 08:12:14.468629299 +0000 UTC m=+0.095144275 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, 
io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=metrics_qdr, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:12:14 np0005486759.ooo.test podman[43018]: 2025-10-14 08:12:14.64172171 +0000 UTC m=+0.268236696 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, build-date=2025-07-21T13:07:59, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, 
architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-qdrouterd, tcib_managed=true)
Oct 14 08:12:14 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:12:24 np0005486759.ooo.test sudo[43092]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czqizrnxlqgbmhzuenkwkhfllhymaxxv ; /usr/bin/python3
Oct 14 08:12:24 np0005486759.ooo.test sudo[43092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:24 np0005486759.ooo.test python3[43094]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:24 np0005486759.ooo.test sudo[43092]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:24 np0005486759.ooo.test sudo[43137]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqubwkyqmmldqeohvzzkrmbltiwqvyag ; /usr/bin/python3
Oct 14 08:12:24 np0005486759.ooo.test sudo[43137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:24 np0005486759.ooo.test python3[43139]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429544.2919698-104931-74351784766125/source _original_basename=tmp06gk6oma follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:24 np0005486759.ooo.test sudo[43137]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:25 np0005486759.ooo.test sudo[43199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rohdnyqbkawwkxawzuxyyeblpyzwcfly ; /usr/bin/python3
Oct 14 08:12:25 np0005486759.ooo.test sudo[43199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:25 np0005486759.ooo.test python3[43201]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:25 np0005486759.ooo.test sudo[43199]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:25 np0005486759.ooo.test sudo[43242]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caenkpbprrvqoqbhdqasqmspkpuphuna ; /usr/bin/python3
Oct 14 08:12:25 np0005486759.ooo.test sudo[43242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:26 np0005486759.ooo.test python3[43244]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429545.3664258-104961-209260325345314/source _original_basename=tmpeirhmasf follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:26 np0005486759.ooo.test sudo[43242]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:26 np0005486759.ooo.test sudo[43272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyxgswyfhesayfzrhsyqodqcbmudbwhz ; /usr/bin/python3
Oct 14 08:12:26 np0005486759.ooo.test sudo[43272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:26 np0005486759.ooo.test python3[43274]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Oct 14 08:12:26 np0005486759.ooo.test crontab[43275]: (root) LIST (root)
Oct 14 08:12:26 np0005486759.ooo.test crontab[43276]: (root) REPLACE (root)
Oct 14 08:12:26 np0005486759.ooo.test sudo[43272]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:26 np0005486759.ooo.test sudo[43290]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxwznxdsszvxmgkcbrgoicvicslrftlg ; /usr/bin/python3
Oct 14 08:12:26 np0005486759.ooo.test sudo[43290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:26 np0005486759.ooo.test python3[43292]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:12:26 np0005486759.ooo.test sudo[43290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:27 np0005486759.ooo.test sudo[43340]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lldmsulypiohoewozrhvdchcuzeoinmw ; /usr/bin/python3
Oct 14 08:12:27 np0005486759.ooo.test sudo[43340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:27 np0005486759.ooo.test sudo[43340]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:27 np0005486759.ooo.test sudo[43358]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvrkmrxzgdgqqdbordlhawuafjjyddjw ; /usr/bin/python3
Oct 14 08:12:27 np0005486759.ooo.test sudo[43358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:27 np0005486759.ooo.test sudo[43358]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:28 np0005486759.ooo.test sudo[43462]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mixxpqvrhmduysmwfxmrtuoqthupmaok ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429547.7759333-105006-121997148314488/async_wrapper.py 831283418660 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429547.7759333-105006-121997148314488/AnsiballZ_command.py _
Oct 14 08:12:28 np0005486759.ooo.test sudo[43462]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:12:28 np0005486759.ooo.test ansible-async_wrapper.py[43464]: Invoked with 831283418660 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429547.7759333-105006-121997148314488/AnsiballZ_command.py _
Oct 14 08:12:28 np0005486759.ooo.test ansible-async_wrapper.py[43467]: Starting module and watcher
Oct 14 08:12:28 np0005486759.ooo.test ansible-async_wrapper.py[43467]: Start watching 43468 (3600)
Oct 14 08:12:28 np0005486759.ooo.test ansible-async_wrapper.py[43468]: Start module (43468)
Oct 14 08:12:28 np0005486759.ooo.test ansible-async_wrapper.py[43464]: Return async_wrapper task started.
Oct 14 08:12:28 np0005486759.ooo.test sudo[43462]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:28 np0005486759.ooo.test sudo[43483]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzfcqemvcvljtbbcebpptlpfrlizukbf ; /usr/bin/python3
Oct 14 08:12:28 np0005486759.ooo.test sudo[43483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:28 np0005486759.ooo.test python3[43488]: ansible-ansible.legacy.async_status Invoked with jid=831283418660.43464 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:12:28 np0005486759.ooo.test sudo[43483]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    (file & line not available)
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    (file & line not available)
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.11 seconds
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Notice: Applied catalog in 0.04 seconds
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Application:
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    Initial environment: production
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    Converged environment: production
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:          Run mode: user
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Changes:
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Events:
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Resources:
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:             Total: 10
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Time:
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:          Schedule: 0.00
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:              File: 0.00
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:              Exec: 0.01
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:            Augeas: 0.01
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    Transaction evaluation: 0.03
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    Catalog application: 0.04
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:    Config retrieval: 0.15
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:          Last run: 1760429552
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:        Filebucket: 0.00
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:             Total: 0.04
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]: Version:
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:            Config: 1760429552
Oct 14 08:12:32 np0005486759.ooo.test puppet-user[43487]:            Puppet: 7.10.0
Oct 14 08:12:32 np0005486759.ooo.test ansible-async_wrapper.py[43468]: Module complete (43468)
Oct 14 08:12:33 np0005486759.ooo.test ansible-async_wrapper.py[43467]: Done in kid B.
Oct 14 08:12:38 np0005486759.ooo.test sudo[43613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zemdqtvjjythpdyewpexvyinfomwnwed ; /usr/bin/python3
Oct 14 08:12:38 np0005486759.ooo.test sudo[43613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:38 np0005486759.ooo.test python3[43615]: ansible-ansible.legacy.async_status Invoked with jid=831283418660.43464 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:12:38 np0005486759.ooo.test sudo[43613]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:39 np0005486759.ooo.test sudo[43629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctxwypdlnmnwbyngzbaomdyghaifrayo ; /usr/bin/python3
Oct 14 08:12:39 np0005486759.ooo.test sudo[43629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:39 np0005486759.ooo.test python3[43631]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:12:39 np0005486759.ooo.test sudo[43629]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:39 np0005486759.ooo.test sudo[43645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuwascaqymlrqbnttsuvzdsbxnueylfa ; /usr/bin/python3
Oct 14 08:12:39 np0005486759.ooo.test sudo[43645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:39 np0005486759.ooo.test python3[43647]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:12:39 np0005486759.ooo.test sudo[43645]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:40 np0005486759.ooo.test sudo[43695]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynmiosbuuonwvjiiksgelevnbswtyids ; /usr/bin/python3
Oct 14 08:12:40 np0005486759.ooo.test sudo[43695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:40 np0005486759.ooo.test python3[43697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:40 np0005486759.ooo.test sudo[43695]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:40 np0005486759.ooo.test sudo[43713]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjywhbqkrijhpjxxmktfkmwblmwxmhez ; /usr/bin/python3
Oct 14 08:12:40 np0005486759.ooo.test sudo[43713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:40 np0005486759.ooo.test python3[43715]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmptfe4kagm recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:12:40 np0005486759.ooo.test sudo[43713]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:40 np0005486759.ooo.test sudo[43743]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaiuyazgfkxeossswrfvxmhbdjqyapcx ; /usr/bin/python3
Oct 14 08:12:40 np0005486759.ooo.test sudo[43743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:40 np0005486759.ooo.test python3[43745]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:40 np0005486759.ooo.test sudo[43743]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:41 np0005486759.ooo.test sudo[43759]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxcqvtgmagjawlwgfwuuxfktrftodmwo ; /usr/bin/python3
Oct 14 08:12:41 np0005486759.ooo.test sudo[43759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:41 np0005486759.ooo.test sudo[43759]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:41 np0005486759.ooo.test sudo[43846]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vywgtradcorrphpygmzapbcmwzwuvoek ; /usr/bin/python3
Oct 14 08:12:41 np0005486759.ooo.test sudo[43846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:42 np0005486759.ooo.test python3[43848]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 08:12:42 np0005486759.ooo.test sudo[43846]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:42 np0005486759.ooo.test sudo[43865]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzgyeoiqxzzjusyqkuywviznldjnbzqi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:42 np0005486759.ooo.test sudo[43865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:42 np0005486759.ooo.test python3[43867]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:42 np0005486759.ooo.test sudo[43865]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:42 np0005486759.ooo.test sudo[43881]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwxotyipptgbyskrqbvwktepopfhrvcd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:42 np0005486759.ooo.test sudo[43881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:42 np0005486759.ooo.test sudo[43881]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:43 np0005486759.ooo.test sudo[43897]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdgrspjhrylkqvhnvvlcfikfgnhalubs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:43 np0005486759.ooo.test sudo[43897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:43 np0005486759.ooo.test python3[43899]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:12:43 np0005486759.ooo.test sudo[43897]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:43 np0005486759.ooo.test sudo[43947]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxpigwhmzbkwvmbwipvbmldggppnoomw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:43 np0005486759.ooo.test sudo[43947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:43 np0005486759.ooo.test python3[43949]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:43 np0005486759.ooo.test sudo[43947]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:44 np0005486759.ooo.test sudo[43965]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssxcmuckzeckboworfzlidagqefjuqwp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:44 np0005486759.ooo.test sudo[43965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:44 np0005486759.ooo.test python3[43967]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:44 np0005486759.ooo.test sudo[43965]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:44 np0005486759.ooo.test sudo[44027]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqrcfkaxdnlktxnmivfzrjcwsadopzaq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:44 np0005486759.ooo.test sudo[44027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:44 np0005486759.ooo.test python3[44029]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:44 np0005486759.ooo.test sudo[44027]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:44 np0005486759.ooo.test sudo[44045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhurwldypmagsfmegrmnwqphofoiuata ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:44 np0005486759.ooo.test sudo[44045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:12:44 np0005486759.ooo.test podman[44048]: 2025-10-14 08:12:44.907228135 +0000 UTC m=+0.079399973 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:12:44 np0005486759.ooo.test python3[44047]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:44 np0005486759.ooo.test sudo[44045]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:45 np0005486759.ooo.test podman[44048]: 2025-10-14 08:12:45.137139895 +0000 UTC m=+0.309311763 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, release=1, 
io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:12:45 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:12:45 np0005486759.ooo.test sudo[44135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjwhwdyqbhansjjznrzxloatfhrticpu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:45 np0005486759.ooo.test sudo[44135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:45 np0005486759.ooo.test python3[44137]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:45 np0005486759.ooo.test sudo[44135]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:45 np0005486759.ooo.test sudo[44153]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsharnxfisxikgtjakmgzflscruyevcl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:45 np0005486759.ooo.test sudo[44153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:45 np0005486759.ooo.test python3[44155]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:45 np0005486759.ooo.test sudo[44153]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:45 np0005486759.ooo.test sudo[44215]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypldjeuavxoelfjyoqekrzxwkccjulfm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:45 np0005486759.ooo.test sudo[44215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:46 np0005486759.ooo.test python3[44217]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:46 np0005486759.ooo.test sudo[44215]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:46 np0005486759.ooo.test sudo[44233]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvfisdnsrpnlthuzuewwncluxozbzgov ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:46 np0005486759.ooo.test sudo[44233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:46 np0005486759.ooo.test python3[44235]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:46 np0005486759.ooo.test sudo[44233]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:46 np0005486759.ooo.test sudo[44263]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpdsakfdqkeasbdpytdjlzupmmwjjefy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:46 np0005486759.ooo.test sudo[44263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:46 np0005486759.ooo.test python3[44265]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:12:46 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:12:46 np0005486759.ooo.test systemd-rc-local-generator[44286]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:12:46 np0005486759.ooo.test systemd-sysv-generator[44290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:12:47 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:12:47 np0005486759.ooo.test sudo[44263]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:47 np0005486759.ooo.test sudo[44349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylyvevongkqsqjyosniipjbvdfwnwuvz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:47 np0005486759.ooo.test sudo[44349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:47 np0005486759.ooo.test python3[44351]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:47 np0005486759.ooo.test sudo[44349]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:47 np0005486759.ooo.test sudo[44367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcqllqdztvfkbcxvxjgjwtpzwvhrjzya ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:47 np0005486759.ooo.test sudo[44367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:47 np0005486759.ooo.test python3[44369]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:47 np0005486759.ooo.test sudo[44367]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:48 np0005486759.ooo.test sudo[44429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzhxtcxaegjslorqieifidrapwnkhpkk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:48 np0005486759.ooo.test sudo[44429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:48 np0005486759.ooo.test python3[44431]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:12:48 np0005486759.ooo.test sudo[44429]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:48 np0005486759.ooo.test sudo[44447]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqzaakdtovctmqneyzpykbkfllkdjlot ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:48 np0005486759.ooo.test sudo[44447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:48 np0005486759.ooo.test python3[44449]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:48 np0005486759.ooo.test sudo[44447]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:48 np0005486759.ooo.test sudo[44477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiyrntwnkcrhafceklgpiurbswomhnjw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:48 np0005486759.ooo.test sudo[44477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:49 np0005486759.ooo.test python3[44479]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:12:49 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:12:49 np0005486759.ooo.test systemd-sysv-generator[44506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:12:49 np0005486759.ooo.test systemd-rc-local-generator[44500]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:12:49 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:12:49 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 08:12:49 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 08:12:49 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 08:12:49 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 08:12:49 np0005486759.ooo.test sudo[44477]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:49 np0005486759.ooo.test sudo[44533]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtercspgctdxlyvknbdrdashjchqfwyu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:49 np0005486759.ooo.test sudo[44533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:49 np0005486759.ooo.test python3[44535]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 08:12:49 np0005486759.ooo.test sudo[44533]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:50 np0005486759.ooo.test sudo[44549]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxddroycbxbbwndgunexoujxbhrvrbty ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:50 np0005486759.ooo.test sudo[44549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:50 np0005486759.ooo.test sudo[44549]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:50 np0005486759.ooo.test sudo[44591]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqkxwxbqjerrtjtabgjkvhkkiptydvsc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:12:50 np0005486759.ooo.test sudo[44591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:51 np0005486759.ooo.test python3[44593]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 08:12:51 np0005486759.ooo.test podman[44666]: 2025-10-14 08:12:51.409886029 +0000 UTC m=+0.094682778 container create a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.33.12, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20250721.1, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_virtqemud_init_logs, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., version=17.1.9)
Oct 14 08:12:51 np0005486759.ooo.test podman[44678]: 2025-10-14 08:12:51.440778428 +0000 UTC m=+0.102739075 container create 65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, tcib_managed=true, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute_init_log, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step2, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:12:51 np0005486759.ooo.test podman[44666]: 2025-10-14 08:12:51.364139812 +0000 UTC m=+0.048936571 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:12:51 np0005486759.ooo.test podman[44678]: 2025-10-14 08:12:51.376736782 +0000 UTC m=+0.038697479 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: Started libpod-conmon-a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2.scope.
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: Started libpod-conmon-65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4.scope.
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:12:51 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a1f8014015fac1f54a9bc082942615ea4555a297c6729790a93ba597cfe8e5d/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:12:51 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28829b7b07eca0acf5593dbdc913962aa4bb6cc59399d9fd0278f6a01f103c03/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Oct 14 08:12:51 np0005486759.ooo.test podman[44678]: 2025-10-14 08:12:51.535382162 +0000 UTC m=+0.197342799 container init 65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=1, version=17.1.9, tcib_managed=true, build-date=2025-07-21T14:48:37, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step2, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute_init_log, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:12:51 np0005486759.ooo.test podman[44666]: 2025-10-14 08:12:51.537008093 +0000 UTC m=+0.221804812 container init a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, release=2, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, container_name=nova_virtqemud_init_logs)
Oct 14 08:12:51 np0005486759.ooo.test podman[44678]: 2025-10-14 08:12:51.546457805 +0000 UTC m=+0.208418452 container start 65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=nova_compute_init_log, config_id=tripleo_step2, release=1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.9, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:12:51 np0005486759.ooo.test python3[44593]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1760428936 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: libpod-a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2.scope: Deactivated successfully.
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: libpod-65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4.scope: Deactivated successfully.
Oct 14 08:12:51 np0005486759.ooo.test podman[44666]: 2025-10-14 08:12:51.59714094 +0000 UTC m=+0.281937689 container start a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, release=2, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, architecture=x86_64, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-07-21T14:56:59, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_virtqemud_init_logs, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:12:51 np0005486759.ooo.test python3[44593]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1760428936 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Oct 14 08:12:51 np0005486759.ooo.test podman[44709]: 2025-10-14 08:12:51.625034206 +0000 UTC m=+0.057289400 container died 65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, version=17.1.9, managed_by=tripleo_ansible, config_id=tripleo_step2, distribution-scope=public, container_name=nova_compute_init_log, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, release=1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, architecture=x86_64)
Oct 14 08:12:51 np0005486759.ooo.test podman[44741]: 2025-10-14 08:12:51.673010168 +0000 UTC m=+0.056115224 container died a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, build-date=2025-07-21T14:56:59, vendor=Red Hat, Inc., container_name=nova_virtqemud_init_logs, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, batch=17.1_20250721.1, tcib_managed=true, distribution-scope=public)
Oct 14 08:12:51 np0005486759.ooo.test podman[44710]: 2025-10-14 08:12:51.747282365 +0000 UTC m=+0.179932424 container cleanup a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.openshift.expose-services=, vcs-type=git, version=17.1.9, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud_init_logs, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, config_id=tripleo_step2, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: libpod-conmon-a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2.scope: Deactivated successfully.
Oct 14 08:12:51 np0005486759.ooo.test podman[44711]: 2025-10-14 08:12:51.804862343 +0000 UTC m=+0.236295655 container cleanup 65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, tcib_managed=true, config_id=tripleo_step2, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, container_name=nova_compute_init_log, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:12:51 np0005486759.ooo.test systemd[1]: libpod-conmon-65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4.scope: Deactivated successfully.
Oct 14 08:12:52 np0005486759.ooo.test podman[44868]: 2025-10-14 08:12:52.017094029 +0000 UTC m=+0.053646180 container create c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, release=2, name=rhosp17/openstack-nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, container_name=create_virtlogd_wrapper, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:12:52 np0005486759.ooo.test podman[44857]: 2025-10-14 08:12:52.041826823 +0000 UTC m=+0.092920979 container create f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step2, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=create_haproxy_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, release=1)
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: Started libpod-conmon-c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4.scope.
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: Started libpod-conmon-f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf.scope.
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:12:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/340db1f32c5b8494030b19fa09e01376495b270982b3deb650613f6581d7039b/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Oct 14 08:12:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c30bb5e31956af76c03d50d112f6bb34230c8fc11c69fcfa029a4de38d1c951d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 08:12:52 np0005486759.ooo.test podman[44868]: 2025-10-14 08:12:52.084881087 +0000 UTC m=+0.121433238 container init c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step2, distribution-scope=public, vendor=Red Hat, Inc., container_name=create_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2025-07-21T14:56:59, release=2, version=17.1.9, name=rhosp17/openstack-nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:12:52 np0005486759.ooo.test podman[44868]: 2025-10-14 08:12:51.987585331 +0000 UTC m=+0.024137462 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:12:52 np0005486759.ooo.test podman[44857]: 2025-10-14 08:12:52.087869341 +0000 UTC m=+0.138963447 container init f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step2, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, release=1, container_name=create_haproxy_wrapper, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Oct 14 08:12:52 np0005486759.ooo.test podman[44868]: 2025-10-14 08:12:52.09070456 +0000 UTC m=+0.127256681 container start c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, architecture=x86_64, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=2, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-07-21T14:56:59, container_name=create_virtlogd_wrapper, 
distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2)
Oct 14 08:12:52 np0005486759.ooo.test podman[44868]: 2025-10-14 08:12:52.090929108 +0000 UTC m=+0.127481299 container attach c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2025-07-21T14:56:59, config_id=tripleo_step2, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, maintainer=OpenStack TripleO Team, release=2, vendor=Red Hat, Inc., container_name=create_virtlogd_wrapper, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-libvirt)
Oct 14 08:12:52 np0005486759.ooo.test podman[44857]: 2025-10-14 08:12:51.992856843 +0000 UTC m=+0.043950929 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Oct 14 08:12:52 np0005486759.ooo.test podman[44857]: 2025-10-14 08:12:52.096788932 +0000 UTC m=+0.147883028 container start f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, config_id=tripleo_step2, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, container_name=create_haproxy_wrapper)
Oct 14 08:12:52 np0005486759.ooo.test podman[44857]: 2025-10-14 08:12:52.097054542 +0000 UTC m=+0.148148698 container attach f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step2, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, release=1)
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: tmp-crun.SKz9f3.mount: Deactivated successfully.
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-2a1f8014015fac1f54a9bc082942615ea4555a297c6729790a93ba597cfe8e5d-merged.mount: Deactivated successfully.
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65d95edc3e371946c294e627d60679aec976a63090344a84fc8b3275922ad8d4-userdata-shm.mount: Deactivated successfully.
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-28829b7b07eca0acf5593dbdc913962aa4bb6cc59399d9fd0278f6a01f103c03-merged.mount: Deactivated successfully.
Oct 14 08:12:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2-userdata-shm.mount: Deactivated successfully.
Oct 14 08:12:53 np0005486759.ooo.test ovs-vsctl[44968]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Oct 14 08:12:54 np0005486759.ooo.test systemd[1]: libpod-c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4.scope: Deactivated successfully.
Oct 14 08:12:54 np0005486759.ooo.test systemd[1]: libpod-c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4.scope: Consumed 2.140s CPU time.
Oct 14 08:12:54 np0005486759.ooo.test podman[44868]: 2025-10-14 08:12:54.241528692 +0000 UTC m=+2.278080873 container died c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, version=17.1.9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, release=2, build-date=2025-07-21T14:56:59, container_name=create_virtlogd_wrapper, distribution-scope=public, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp 
osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:12:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4-userdata-shm.mount: Deactivated successfully.
Oct 14 08:12:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-340db1f32c5b8494030b19fa09e01376495b270982b3deb650613f6581d7039b-merged.mount: Deactivated successfully.
Oct 14 08:12:54 np0005486759.ooo.test podman[45113]: 2025-10-14 08:12:54.338418172 +0000 UTC m=+0.084700225 container cleanup c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, release=2, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, build-date=2025-07-21T14:56:59, 
config_id=tripleo_step2, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.openshift.expose-services=, batch=17.1_20250721.1)
Oct 14 08:12:54 np0005486759.ooo.test systemd[1]: libpod-conmon-c4914709ecca8f7151975b96db134a4638e60cbe7d59428435937a140a6a76a4.scope: Deactivated successfully.
Oct 14 08:12:54 np0005486759.ooo.test python3[44593]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1760428936 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Oct 14 08:12:55 np0005486759.ooo.test systemd[1]: libpod-f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf.scope: Deactivated successfully.
Oct 14 08:12:55 np0005486759.ooo.test systemd[1]: libpod-f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf.scope: Consumed 2.095s CPU time.
Oct 14 08:12:55 np0005486759.ooo.test podman[44857]: 2025-10-14 08:12:55.211829859 +0000 UTC m=+3.262923935 container died f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.9, config_id=tripleo_step2, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=create_haproxy_wrapper, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:12:55 np0005486759.ooo.test podman[45151]: 2025-10-14 08:12:55.271012519 +0000 UTC m=+0.053788285 container cleanup f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, container_name=create_haproxy_wrapper, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, release=1, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=)
Oct 14 08:12:55 np0005486759.ooo.test systemd[1]: libpod-conmon-f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf.scope: Deactivated successfully.
Oct 14 08:12:55 np0005486759.ooo.test python3[44593]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Oct 14 08:12:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c30bb5e31956af76c03d50d112f6bb34230c8fc11c69fcfa029a4de38d1c951d-merged.mount: Deactivated successfully.
Oct 14 08:12:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6ca7d52fb5776f1a0feb87ba30a16a8d7f681782cffd3f372fe67e4dffdf1bf-userdata-shm.mount: Deactivated successfully.
Oct 14 08:12:55 np0005486759.ooo.test sudo[44591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:55 np0005486759.ooo.test sudo[45202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqlcsyzgfpllsmvhbkizrepadvfvxiiv ; /usr/bin/python3
Oct 14 08:12:55 np0005486759.ooo.test sudo[45202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:55 np0005486759.ooo.test python3[45204]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:55 np0005486759.ooo.test sudo[45202]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:56 np0005486759.ooo.test sudo[45250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzcavachmrlpyfjebylhoetvnffaoovt ; /usr/bin/python3
Oct 14 08:12:56 np0005486759.ooo.test sudo[45250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:56 np0005486759.ooo.test sudo[45250]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:56 np0005486759.ooo.test sudo[45293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whsgxxtpbxvouztklpcajsgphjdgkylh ; /usr/bin/python3
Oct 14 08:12:56 np0005486759.ooo.test sudo[45293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:56 np0005486759.ooo.test sudo[45293]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:56 np0005486759.ooo.test sudo[45323]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkmcdeygviquttcguscuunifkhiadqzr ; /usr/bin/python3
Oct 14 08:12:56 np0005486759.ooo.test sudo[45323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:57 np0005486759.ooo.test python3[45325]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005486759 step=2 update_config_hash_only=False
Oct 14 08:12:57 np0005486759.ooo.test sudo[45323]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:57 np0005486759.ooo.test sudo[45339]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fykqcjzpiyijdisyqsphouauucmjjgbt ; /usr/bin/python3
Oct 14 08:12:57 np0005486759.ooo.test sudo[45339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:57 np0005486759.ooo.test python3[45341]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:12:57 np0005486759.ooo.test sudo[45339]: pam_unix(sudo:session): session closed for user root
Oct 14 08:12:57 np0005486759.ooo.test sudo[45355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maaxparvhpxmljrjxvniliuoamsswldn ; /usr/bin/python3
Oct 14 08:12:57 np0005486759.ooo.test sudo[45355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:12:57 np0005486759.ooo.test python3[45357]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Oct 14 08:12:57 np0005486759.ooo.test sudo[45355]: pam_unix(sudo:session): session closed for user root
Oct 14 08:13:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:13:15 np0005486759.ooo.test podman[45358]: 2025-10-14 08:13:15.468995548 +0000 UTC m=+0.092527818 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.12, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:13:15 np0005486759.ooo.test podman[45358]: 2025-10-14 08:13:15.680387928 +0000 UTC m=+0.303920208 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:13:15 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:13:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:13:46 np0005486759.ooo.test podman[45386]: 2025-10-14 08:13:46.461665059 +0000 UTC m=+0.084361760 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Oct 14 08:13:46 np0005486759.ooo.test podman[45386]: 2025-10-14 08:13:46.658580911 +0000 UTC m=+0.281277562 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:13:46 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:14:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:14:17 np0005486759.ooo.test podman[45413]: 2025-10-14 08:14:17.453554408 +0000 UTC m=+0.082501400 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9)
Oct 14 08:14:17 np0005486759.ooo.test podman[45413]: 2025-10-14 08:14:17.650614165 +0000 UTC m=+0.279561157 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, version=17.1.9, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1)
Oct 14 08:14:17 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:14:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:14:48 np0005486759.ooo.test systemd[1]: tmp-crun.RaQHmM.mount: Deactivated successfully.
Oct 14 08:14:48 np0005486759.ooo.test podman[45441]: 2025-10-14 08:14:48.463925046 +0000 UTC m=+0.089760918 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, config_id=tripleo_step1, distribution-scope=public)
Oct 14 08:14:48 np0005486759.ooo.test podman[45441]: 2025-10-14 08:14:48.686685843 +0000 UTC m=+0.312521795 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr)
Oct 14 08:14:48 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:15:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:15:19 np0005486759.ooo.test systemd[1]: tmp-crun.1ORb7a.mount: Deactivated successfully.
Oct 14 08:15:19 np0005486759.ooo.test podman[45471]: 2025-10-14 08:15:19.442242707 +0000 UTC m=+0.072917536 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vcs-type=git, distribution-scope=public, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:15:19 np0005486759.ooo.test podman[45471]: 2025-10-14 08:15:19.623768927 +0000 UTC m=+0.254443826 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-type=git)
Oct 14 08:15:19 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:15:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:15:50 np0005486759.ooo.test podman[45500]: 2025-10-14 08:15:50.458408381 +0000 UTC m=+0.083518220 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T13:07:59)
Oct 14 08:15:50 np0005486759.ooo.test podman[45500]: 2025-10-14 08:15:50.691639202 +0000 UTC m=+0.316749051 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:15:50 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:15:58 np0005486759.ooo.test sudo[45574]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gibghllxunfpcyamqmetfftsmfgrsjbq ; /usr/bin/python3
Oct 14 08:15:58 np0005486759.ooo.test sudo[45574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:15:58 np0005486759.ooo.test python3[45576]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:15:58 np0005486759.ooo.test sudo[45574]: pam_unix(sudo:session): session closed for user root
Oct 14 08:15:58 np0005486759.ooo.test sudo[45619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiaewcbxaistjvpwomjxogsijxsfblsh ; /usr/bin/python3
Oct 14 08:15:58 np0005486759.ooo.test sudo[45619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:15:59 np0005486759.ooo.test python3[45621]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429758.328054-108898-70050417173764/source _original_basename=tmp_2tgldqc follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:15:59 np0005486759.ooo.test sudo[45619]: pam_unix(sudo:session): session closed for user root
Oct 14 08:15:59 np0005486759.ooo.test sudo[45649]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpzunfmrftwigzqtpqeudafvkfmkcjdd ; /usr/bin/python3
Oct 14 08:15:59 np0005486759.ooo.test sudo[45649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:15:59 np0005486759.ooo.test python3[45651]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:15:59 np0005486759.ooo.test sudo[45649]: pam_unix(sudo:session): session closed for user root
Oct 14 08:15:59 np0005486759.ooo.test sudo[45699]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjzdkyoftgxagcyrydaetjtuqagswhhd ; /usr/bin/python3
Oct 14 08:16:00 np0005486759.ooo.test sudo[45699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:00 np0005486759.ooo.test sudo[45699]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:00 np0005486759.ooo.test sudo[45717]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vepggnkcgyjtaqawcvnnfjockcyibles ; /usr/bin/python3
Oct 14 08:16:00 np0005486759.ooo.test sudo[45717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:00 np0005486759.ooo.test sudo[45717]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:01 np0005486759.ooo.test sudo[45821]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljaxhpnfpfcmazlaekqzilcbgjdddnsp ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429760.6664138-108954-209334902144416/async_wrapper.py 851441010322 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429760.6664138-108954-209334902144416/AnsiballZ_command.py _
Oct 14 08:16:01 np0005486759.ooo.test sudo[45821]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:16:01 np0005486759.ooo.test ansible-async_wrapper.py[45823]: Invoked with 851441010322 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429760.6664138-108954-209334902144416/AnsiballZ_command.py _
Oct 14 08:16:01 np0005486759.ooo.test ansible-async_wrapper.py[45826]: Starting module and watcher
Oct 14 08:16:01 np0005486759.ooo.test ansible-async_wrapper.py[45826]: Start watching 45827 (3600)
Oct 14 08:16:01 np0005486759.ooo.test ansible-async_wrapper.py[45827]: Start module (45827)
Oct 14 08:16:01 np0005486759.ooo.test ansible-async_wrapper.py[45823]: Return async_wrapper task started.
Oct 14 08:16:01 np0005486759.ooo.test sudo[45821]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:01 np0005486759.ooo.test sudo[45842]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bejtwcgkhdixxkftbozryxlbihmfvish ; /usr/bin/python3
Oct 14 08:16:01 np0005486759.ooo.test sudo[45842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:01 np0005486759.ooo.test python3[45845]: ansible-ansible.legacy.async_status Invoked with jid=851441010322.45823 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:16:01 np0005486759.ooo.test sudo[45842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    (file & line not available)
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    (file & line not available)
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.11 seconds
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Notice: Applied catalog in 0.03 seconds
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Application:
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    Initial environment: production
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    Converged environment: production
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:          Run mode: user
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Changes:
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Events:
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Resources:
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:             Total: 10
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Time:
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:          Schedule: 0.00
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:              File: 0.00
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:              Exec: 0.00
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:            Augeas: 0.01
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    Transaction evaluation: 0.02
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    Catalog application: 0.03
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:    Config retrieval: 0.14
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:          Last run: 1760429765
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:        Filebucket: 0.00
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:             Total: 0.04
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]: Version:
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:            Config: 1760429765
Oct 14 08:16:05 np0005486759.ooo.test puppet-user[45847]:            Puppet: 7.10.0
Oct 14 08:16:05 np0005486759.ooo.test ansible-async_wrapper.py[45827]: Module complete (45827)
Oct 14 08:16:06 np0005486759.ooo.test ansible-async_wrapper.py[45826]: Done in kid B.
Oct 14 08:16:11 np0005486759.ooo.test sudo[45971]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unypzgnfxpokpvanjouzbnlfmdaxamom ; /usr/bin/python3
Oct 14 08:16:11 np0005486759.ooo.test sudo[45971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:11 np0005486759.ooo.test python3[45973]: ansible-ansible.legacy.async_status Invoked with jid=851441010322.45823 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:16:11 np0005486759.ooo.test sudo[45971]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:12 np0005486759.ooo.test sudo[45987]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgzcystymxfaplvdrfusguehjkjvwmbo ; /usr/bin/python3
Oct 14 08:16:12 np0005486759.ooo.test sudo[45987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:12 np0005486759.ooo.test python3[45989]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:16:12 np0005486759.ooo.test sudo[45987]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:12 np0005486759.ooo.test sudo[46003]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rixyqwkgnmgpjoeyscdthogoubmfgbka ; /usr/bin/python3
Oct 14 08:16:12 np0005486759.ooo.test sudo[46003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:12 np0005486759.ooo.test python3[46005]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:12 np0005486759.ooo.test sudo[46003]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:13 np0005486759.ooo.test sudo[46053]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvomocfazedoxmuokracfmpeicjvnnxb ; /usr/bin/python3
Oct 14 08:16:13 np0005486759.ooo.test sudo[46053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:13 np0005486759.ooo.test python3[46055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:13 np0005486759.ooo.test sudo[46053]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:13 np0005486759.ooo.test sudo[46071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdcjiikrvjmputvpwwhnbcyakwotulcu ; /usr/bin/python3
Oct 14 08:16:13 np0005486759.ooo.test sudo[46071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:13 np0005486759.ooo.test python3[46073]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpwei2nby3 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:16:13 np0005486759.ooo.test sudo[46071]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:13 np0005486759.ooo.test sudo[46101]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyrwqijhaqqvysbynrebvpeuzwpqxbsg ; /usr/bin/python3
Oct 14 08:16:13 np0005486759.ooo.test sudo[46101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:13 np0005486759.ooo.test python3[46103]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:13 np0005486759.ooo.test sudo[46101]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:14 np0005486759.ooo.test sudo[46117]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dorpwshqassjygdepeaoylkkprlrycif ; /usr/bin/python3
Oct 14 08:16:14 np0005486759.ooo.test sudo[46117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:14 np0005486759.ooo.test sudo[46117]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:14 np0005486759.ooo.test sudo[46204]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqdsjskrjasbjxplxwqhzhacwmgarwct ; /usr/bin/python3
Oct 14 08:16:14 np0005486759.ooo.test sudo[46204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:14 np0005486759.ooo.test python3[46206]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 08:16:14 np0005486759.ooo.test sudo[46204]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:15 np0005486759.ooo.test sudo[46223]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egrbneoftvlaiitvlnvmuzufxhqoqopx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:15 np0005486759.ooo.test sudo[46223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:15 np0005486759.ooo.test python3[46225]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:15 np0005486759.ooo.test sudo[46223]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:15 np0005486759.ooo.test sudo[46239]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsfgrrdecumsdaedyhsaxmqfuujbzxfg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:15 np0005486759.ooo.test sudo[46239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:16 np0005486759.ooo.test sudo[46239]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:16 np0005486759.ooo.test sudo[46255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzgruhyxgxgmpdtuktnyjlxgscdzgheq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:16 np0005486759.ooo.test sudo[46255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:16 np0005486759.ooo.test python3[46257]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:16 np0005486759.ooo.test sudo[46255]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:16 np0005486759.ooo.test sudo[46305]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtjotlnafnpekgkafooewloujkaakbmm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:16 np0005486759.ooo.test sudo[46305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:17 np0005486759.ooo.test python3[46307]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:17 np0005486759.ooo.test sudo[46305]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:17 np0005486759.ooo.test sudo[46323]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moiidklkbntldzhqjqhybgztxkgbclxo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:17 np0005486759.ooo.test sudo[46323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:17 np0005486759.ooo.test python3[46325]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:17 np0005486759.ooo.test sudo[46323]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:17 np0005486759.ooo.test sudo[46385]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsbfepgolhbkwxrlvpiabeourzfqmghd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:17 np0005486759.ooo.test sudo[46385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:17 np0005486759.ooo.test python3[46387]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:17 np0005486759.ooo.test sudo[46385]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:17 np0005486759.ooo.test sudo[46403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nomdagurqevftusgubjiqaeyohvjfzbw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:17 np0005486759.ooo.test sudo[46403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:18 np0005486759.ooo.test python3[46405]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:18 np0005486759.ooo.test sudo[46403]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:18 np0005486759.ooo.test sudo[46465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgaqwdwtjmdiiiutehbonpoduonchqcs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:18 np0005486759.ooo.test sudo[46465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:18 np0005486759.ooo.test python3[46467]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:18 np0005486759.ooo.test sudo[46465]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:18 np0005486759.ooo.test sudo[46483]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqcinslhgzyzqupawdydpjrypmfpdsuk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:18 np0005486759.ooo.test sudo[46483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:18 np0005486759.ooo.test python3[46485]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:18 np0005486759.ooo.test sudo[46483]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:19 np0005486759.ooo.test sudo[46545]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edcajgkhezljkcqdyaqzbrvxhjnosbgb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:19 np0005486759.ooo.test sudo[46545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:19 np0005486759.ooo.test python3[46547]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:19 np0005486759.ooo.test sudo[46545]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:19 np0005486759.ooo.test sudo[46563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irzvsryumehxjjsofvheseqtukbdjpfd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:19 np0005486759.ooo.test sudo[46563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:19 np0005486759.ooo.test python3[46565]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:19 np0005486759.ooo.test sudo[46563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:19 np0005486759.ooo.test sudo[46593]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nshymvzgmptsqgkswuyceduhfxfrsghh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:19 np0005486759.ooo.test sudo[46593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:20 np0005486759.ooo.test python3[46595]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:20 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:20 np0005486759.ooo.test systemd-rc-local-generator[46623]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:20 np0005486759.ooo.test systemd-sysv-generator[46626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:20 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:20 np0005486759.ooo.test systemd[1]: Starting dnf makecache...
Oct 14 08:16:20 np0005486759.ooo.test sudo[46593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:20 np0005486759.ooo.test dnf[46633]: Updating Subscription Management repositories.
Oct 14 08:16:20 np0005486759.ooo.test sudo[46680]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kskpsufawheyklceiqjefnkhnewmmbcb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:16:20 np0005486759.ooo.test sudo[46680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:20 np0005486759.ooo.test podman[46682]: 2025-10-14 08:16:20.833286073 +0000 UTC m=+0.083878052 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:16:20 np0005486759.ooo.test python3[46683]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:20 np0005486759.ooo.test sudo[46680]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:21 np0005486759.ooo.test sudo[46725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atmxylxsbfwpoawxvyrktekneqvcelqe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:21 np0005486759.ooo.test sudo[46725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:21 np0005486759.ooo.test podman[46682]: 2025-10-14 08:16:21.019227359 +0000 UTC m=+0.269819278 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 08:16:21 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:16:21 np0005486759.ooo.test python3[46727]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:21 np0005486759.ooo.test sudo[46725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:21 np0005486759.ooo.test sudo[46788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yufiturufjvbfhrsylfwuovurimsoioo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:21 np0005486759.ooo.test sudo[46788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:21 np0005486759.ooo.test python3[46790]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:21 np0005486759.ooo.test sudo[46788]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:21 np0005486759.ooo.test sudo[46806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocsedwgaehujbyuifwexbinnbwtcelbs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:21 np0005486759.ooo.test sudo[46806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:21 np0005486759.ooo.test python3[46808]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:21 np0005486759.ooo.test sudo[46806]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:22 np0005486759.ooo.test sudo[46836]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhvmzrixavqhjaczirhldvossgtvozei ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:22 np0005486759.ooo.test sudo[46836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:22 np0005486759.ooo.test python3[46838]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:22 np0005486759.ooo.test dnf[46633]: Metadata cache refreshed recently.
Oct 14 08:16:22 np0005486759.ooo.test systemd-rc-local-generator[46862]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:22 np0005486759.ooo.test systemd-sysv-generator[46865]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: Finished dnf makecache.
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: dnf-makecache.service: Consumed 2.050s CPU time.
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 08:16:22 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 08:16:22 np0005486759.ooo.test sudo[46836]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:23 np0005486759.ooo.test sudo[46893]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxlqqfrihnxnczfofhkikbaqsggrxhre ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:23 np0005486759.ooo.test sudo[46893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:23 np0005486759.ooo.test python3[46895]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 08:16:23 np0005486759.ooo.test sudo[46893]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:23 np0005486759.ooo.test sudo[46909]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxwoavuanvwbspdnqmcgfwujlyuudfpn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:23 np0005486759.ooo.test sudo[46909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:23 np0005486759.ooo.test sudo[46909]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:24 np0005486759.ooo.test sudo[46949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dopxpazvxyqthdawqupouzyibilumiub ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:24 np0005486759.ooo.test sudo[46949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:24 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 08:16:24 np0005486759.ooo.test podman[47100]: 2025-10-14 08:16:24.950251246 +0000 UTC m=+0.063570688 container create b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, release=2, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container)
Oct 14 08:16:24 np0005486759.ooo.test podman[47101]: 2025-10-14 08:16:24.96556759 +0000 UTC m=+0.073744149 container create 4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_statedir_owner, managed_by=tripleo_ansible, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Oct 14 08:16:24 np0005486759.ooo.test podman[47138]: 2025-10-14 08:16:24.994112805 +0000 UTC m=+0.069455419 container create 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vcs-type=git, release=1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T12:58:40, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.scope.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a.scope.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:25 np0005486759.ooo.test podman[47100]: 2025-10-14 08:16:24.921333601 +0000 UTC m=+0.034653093 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6160dcf8522bce4a4dbb2d001d64131000991ae258c303cf32d4eb798c35afe5/merged/scripts supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6160dcf8522bce4a4dbb2d001d64131000991ae258c303cf32d4eb798c35afe5/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/699db7de424016e8e9e98e1f1065f647b839bf0571fc867d8e43c26f1a3cbb5b/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/699db7de424016e8e9e98e1f1065f647b839bf0571fc867d8e43c26f1a3cbb5b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/699db7de424016e8e9e98e1f1065f647b839bf0571fc867d8e43c26f1a3cbb5b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test podman[47147]: 2025-10-14 08:16:25.027471904 +0000 UTC m=+0.086419406 container create 87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtlogd_wrapper, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=2, build-date=2025-07-21T14:56:59, tcib_managed=true, distribution-scope=public, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.buildah.version=1.33.12, vendor=Red Hat, Inc.)
Oct 14 08:16:25 np0005486759.ooo.test podman[47101]: 2025-10-14 08:16:24.928520746 +0000 UTC m=+0.036697305 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef.scope.
Oct 14 08:16:25 np0005486759.ooo.test podman[47109]: 2025-10-14 08:16:25.054638884 +0000 UTC m=+0.142630764 container create f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=ceilometer_init_log, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, vcs-type=git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47)
Oct 14 08:16:25 np0005486759.ooo.test podman[47109]: 2025-10-14 08:16:24.957622035 +0000 UTC m=+0.045613965 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Oct 14 08:16:25 np0005486759.ooo.test podman[47138]: 2025-10-14 08:16:24.958040416 +0000 UTC m=+0.033383080 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d.scope.
Oct 14 08:16:25 np0005486759.ooo.test podman[47147]: 2025-10-14 08:16:25.070695037 +0000 UTC m=+0.129642539 container init 87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:56:59, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e661e53b0e669d3de31c0b2a91724e8c1c9edc8f2ed2c51ea6a3d77739f7d57/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test podman[47101]: 2025-10-14 08:16:25.084589565 +0000 UTC m=+0.192766094 container init 4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, config_id=tripleo_step3, container_name=nova_statedir_owner, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1)
Oct 14 08:16:25 np0005486759.ooo.test podman[47147]: 2025-10-14 08:16:24.988046179 +0000 UTC m=+0.046993691 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:25 np0005486759.ooo.test podman[47109]: 2025-10-14 08:16:25.087644484 +0000 UTC m=+0.175636384 container init f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, distribution-scope=public, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:16:25 np0005486759.ooo.test podman[47101]: 2025-10-14 08:16:25.094402837 +0000 UTC m=+0.202579366 container start 4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, build-date=2025-07-21T14:48:37, config_id=tripleo_step3, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_statedir_owner, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=)
Oct 14 08:16:25 np0005486759.ooo.test podman[47100]: 2025-10-14 08:16:25.094972002 +0000 UTC m=+0.208291454 container init b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, architecture=x86_64, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:16:25 np0005486759.ooo.test podman[47101]: 2025-10-14 08:16:25.095134426 +0000 UTC m=+0.203310985 container attach 4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., container_name=nova_statedir_owner, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.openshift.expose-services=, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, release=1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: libpod-f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test sudo[47206]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:25 np0005486759.ooo.test sudo[47195]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:16:25 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:25 np0005486759.ooo.test podman[47100]: 2025-10-14 08:16:25.12286755 +0000 UTC m=+0.236186992 container start b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, tcib_managed=true, build-date=2025-07-21T13:04:03, version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2)
Oct 14 08:16:25 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=324199e84b6ced954fd0cecf75a965ca --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: libpod-4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test podman[47147]: 2025-10-14 08:16:25.133770981 +0000 UTC m=+0.192718483 container start 87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vcs-type=git, build-date=2025-07-21T14:56:59, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, version=17.1.9, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container)
Oct 14 08:16:25 np0005486759.ooo.test podman[47101]: 2025-10-14 08:16:25.13566245 +0000 UTC m=+0.243838999 container died 4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, distribution-scope=public, release=1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:16:25 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b29b30662a12a8864f5ea0f40846b2cc --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 0.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 14 08:16:25 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:25 np0005486759.ooo.test podman[47109]: 2025-10-14 08:16:25.145276427 +0000 UTC m=+0.233268317 container start f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1, container_name=ceilometer_init_log, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:16:25 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 14 08:16:25 np0005486759.ooo.test podman[47207]: 2025-10-14 08:16:25.163321512 +0000 UTC m=+0.045662437 container died f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step3, release=1, container_name=ceilometer_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 0...
Oct 14 08:16:25 np0005486759.ooo.test podman[47207]: 2025-10-14 08:16:25.194149366 +0000 UTC m=+0.076490271 container cleanup f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, io.openshift.expose-services=, config_id=tripleo_step3, name=rhosp17/openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, build-date=2025-07-21T15:29:47, distribution-scope=public, managed_by=tripleo_ansible, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: libpod-conmon-f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:25 np0005486759.ooo.test podman[47227]: 2025-10-14 08:16:25.216409289 +0000 UTC m=+0.069269385 container cleanup 4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step3, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_statedir_owner, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: libpod-conmon-4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test podman[47138]: 2025-10-14 08:16:25.233638123 +0000 UTC m=+0.308980747 container init 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, batch=17.1_20250721.1, container_name=rsyslog, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, build-date=2025-07-21T12:58:40, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1)
Oct 14 08:16:25 np0005486759.ooo.test podman[47138]: 2025-10-14 08:16:25.2424737 +0000 UTC m=+0.317816314 container start 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, com.redhat.component=openstack-rsyslog-container, build-date=2025-07-21T12:58:40, release=1, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog)
Oct 14 08:16:25 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=09cd203329a42b234cd1e76ba6006819 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Oct 14 08:16:25 np0005486759.ooo.test podman[47208]: 2025-10-14 08:16:25.247821588 +0000 UTC m=+0.120818842 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, release=2, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:16:25 np0005486759.ooo.test sudo[47303]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:25 np0005486759.ooo.test sudo[47303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:25 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1760428936 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Oct 14 08:16:25 np0005486759.ooo.test podman[47208]: 2025-10-14 08:16:25.281181887 +0000 UTC m=+0.154179141 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:16:25 np0005486759.ooo.test podman[47208]: unhealthy
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Queued start job for default target Main User Target.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Created slice User Application Slice.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Reached target Paths.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Reached target Timers.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Starting D-Bus User Message Bus Socket...
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Starting Create User's Volatile Files and Directories...
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Failed with result 'exit-code'.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Finished Create User's Volatile Files and Directories.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Listening on D-Bus User Message Bus Socket.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Reached target Sockets.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Reached target Basic System.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Reached target Main User Target.
Oct 14 08:16:25 np0005486759.ooo.test systemd[47252]: Startup finished in 97ms.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started User Manager for UID 0.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started Session c1 of User root.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started Session c2 of User root.
Oct 14 08:16:25 np0005486759.ooo.test sudo[47195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:25 np0005486759.ooo.test sudo[47303]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:25 np0005486759.ooo.test sudo[47206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: libpod-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test sudo[47195]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: session-c1.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test sudo[47206]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: session-c2.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test podman[47305]: 2025-10-14 08:16:25.390919972 +0000 UTC m=+0.136433994 container died 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-07-21T12:58:40, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.33.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, name=rhosp17/openstack-rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container)
Oct 14 08:16:25 np0005486759.ooo.test podman[47358]: 2025-10-14 08:16:25.472586945 +0000 UTC m=+0.117545058 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-07-21T12:58:40, container_name=rsyslog, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, tcib_managed=true, version=17.1.9, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, vendor=Red Hat, Inc., release=1)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: libpod-conmon-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test podman[47481]: 2025-10-14 08:16:25.653209816 +0000 UTC m=+0.043734957 container create 405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2)
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03.scope.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca4da532e9e4321ff6fa78b70e2d8e6834968b1021a5a882c80b2624d34637c6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca4da532e9e4321ff6fa78b70e2d8e6834968b1021a5a882c80b2624d34637c6/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca4da532e9e4321ff6fa78b70e2d8e6834968b1021a5a882c80b2624d34637c6/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca4da532e9e4321ff6fa78b70e2d8e6834968b1021a5a882c80b2624d34637c6/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test podman[47481]: 2025-10-14 08:16:25.71398324 +0000 UTC m=+0.104508361 container init 405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:56:59, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, version=17.1.9, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:16:25 np0005486759.ooo.test podman[47495]: 2025-10-14 08:16:25.724756328 +0000 UTC m=+0.090455970 container create fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, version=17.1.9, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.33.12, vcs-type=git, container_name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team)
Oct 14 08:16:25 np0005486759.ooo.test podman[47481]: 2025-10-14 08:16:25.632629005 +0000 UTC m=+0.023154156 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461.scope.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:25 np0005486759.ooo.test podman[47495]: 2025-10-14 08:16:25.772008545 +0000 UTC m=+0.137708197 container init fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, vcs-type=git, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.33.12, container_name=nova_virtsecretd, batch=17.1_20250721.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public)
Oct 14 08:16:25 np0005486759.ooo.test podman[47481]: 2025-10-14 08:16:25.775782221 +0000 UTC m=+0.166307332 container start 405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, version=17.1.9, architecture=x86_64)
Oct 14 08:16:25 np0005486759.ooo.test podman[47495]: 2025-10-14 08:16:25.779535579 +0000 UTC m=+0.145235231 container start fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, build-date=2025-07-21T14:56:59, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, vendor=Red Hat, Inc., release=2, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtsecretd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64)
Oct 14 08:16:25 np0005486759.ooo.test podman[47495]: 2025-10-14 08:16:25.679812201 +0000 UTC m=+0.045511863 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:25 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b29b30662a12a8864f5ea0f40846b2cc --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:25 np0005486759.ooo.test sudo[47525]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:25 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: Started Session c3 of User root.
Oct 14 08:16:25 np0005486759.ooo.test sudo[47525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:25 np0005486759.ooo.test sudo[47525]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: session-c3.scope: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99-merged.mount: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e-userdata-shm.mount: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-699db7de424016e8e9e98e1f1065f647b839bf0571fc867d8e43c26f1a3cbb5b-merged.mount: Deactivated successfully.
Oct 14 08:16:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b03b3886143e0b8e7429525a9ec7232c64ca2d7bfda53aa9122ea53785c073a-userdata-shm.mount: Deactivated successfully.
Oct 14 08:16:26 np0005486759.ooo.test podman[47646]: 2025-10-14 08:16:26.116854674 +0000 UTC m=+0.056850785 container create 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, release=1, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3)
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started libpod-conmon-6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.scope.
Oct 14 08:16:26 np0005486759.ooo.test podman[47653]: 2025-10-14 08:16:26.159659106 +0000 UTC m=+0.083846600 container create 609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-type=git, name=rhosp17/openstack-nova-libvirt, version=17.1.9, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_virtnodedevd, release=2, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, architecture=x86_64)
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46260fec5bdd83621572ffa7df8d8d0e559aaff55a250b87810024aca47c9813/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46260fec5bdd83621572ffa7df8d8d0e559aaff55a250b87810024aca47c9813/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test podman[47646]: 2025-10-14 08:16:26.090155297 +0000 UTC m=+0.030151418 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Oct 14 08:16:26 np0005486759.ooo.test podman[47653]: 2025-10-14 08:16:26.112195053 +0000 UTC m=+0.036382567 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started libpod-conmon-609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42.scope.
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:16:26 np0005486759.ooo.test podman[47646]: 2025-10-14 08:16:26.217833924 +0000 UTC m=+0.157830045 container init 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, container_name=iscsid, version=17.1.9, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2025-07-21T13:27:15, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc.)
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:16:26 np0005486759.ooo.test podman[47653]: 2025-10-14 08:16:26.236741101 +0000 UTC m=+0.160928595 container init 609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, config_id=tripleo_step3, batch=17.1_20250721.1, container_name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, release=2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:56:59, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, version=17.1.9)
Oct 14 08:16:26 np0005486759.ooo.test sudo[47687]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:26 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:26 np0005486759.ooo.test podman[47653]: 2025-10-14 08:16:26.24642935 +0000 UTC m=+0.170616854 container start 609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, vcs-type=git, container_name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:56:59, vendor=Red Hat, Inc., release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt, version=17.1.9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container)
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started Session c4 of User root.
Oct 14 08:16:26 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b29b30662a12a8864f5ea0f40846b2cc --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:26 np0005486759.ooo.test sudo[47687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:26 np0005486759.ooo.test podman[47646]: 2025-10-14 08:16:26.288207176 +0000 UTC m=+0.228203327 container start 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1, io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:16:26 np0005486759.ooo.test sudo[47697]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:26 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=61017e001fc358991ba0100081a72ad5 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Oct 14 08:16:26 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started Session c5 of User root.
Oct 14 08:16:26 np0005486759.ooo.test sudo[47697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:26 np0005486759.ooo.test podman[47688]: 2025-10-14 08:16:26.322999402 +0000 UTC m=+0.081246103 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, container_name=iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, vcs-type=git, com.redhat.component=openstack-iscsid-container)
Oct 14 08:16:26 np0005486759.ooo.test podman[47688]: 2025-10-14 08:16:26.33342749 +0000 UTC m=+0.091674201 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, release=1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, 
com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:16:26 np0005486759.ooo.test sudo[47687]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: session-c4.scope: Deactivated successfully.
Oct 14 08:16:26 np0005486759.ooo.test podman[47688]: unhealthy
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Failed with result 'exit-code'.
Oct 14 08:16:26 np0005486759.ooo.test kernel: Loading iSCSI transport class v2.0-870.
Oct 14 08:16:26 np0005486759.ooo.test sudo[47697]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: session-c5.scope: Deactivated successfully.
Oct 14 08:16:26 np0005486759.ooo.test podman[47829]: 2025-10-14 08:16:26.676143285 +0000 UTC m=+0.064721677 container create 17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, release=2, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, container_name=nova_virtstoraged, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vendor=Red Hat, Inc.)
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started libpod-conmon-17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92.scope.
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:26 np0005486759.ooo.test podman[47829]: 2025-10-14 08:16:26.641287857 +0000 UTC m=+0.029866309 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:26 np0005486759.ooo.test podman[47829]: 2025-10-14 08:16:26.741402495 +0000 UTC m=+0.129980917 container init 17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, container_name=nova_virtstoraged, vcs-type=git, release=2, distribution-scope=public, io.buildah.version=1.33.12, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:16:26 np0005486759.ooo.test podman[47829]: 2025-10-14 08:16:26.751613138 +0000 UTC m=+0.140191560 container start 17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20250721.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., container_name=nova_virtstoraged, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, managed_by=tripleo_ansible)
Oct 14 08:16:26 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b29b30662a12a8864f5ea0f40846b2cc --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:26 np0005486759.ooo.test sudo[47848]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:26 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: Started Session c6 of User root.
Oct 14 08:16:26 np0005486759.ooo.test sudo[47848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:26 np0005486759.ooo.test sudo[47848]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:26 np0005486759.ooo.test systemd[1]: session-c6.scope: Deactivated successfully.
Oct 14 08:16:27 np0005486759.ooo.test podman[47933]: 2025-10-14 08:16:27.193868136 +0000 UTC m=+0.084620261 container create 2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, release=2, name=rhosp17/openstack-nova-libvirt, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, container_name=nova_virtqemud, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: Started libpod-conmon-2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49.scope.
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:27 np0005486759.ooo.test podman[47933]: 2025-10-14 08:16:27.146997989 +0000 UTC m=+0.037750164 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test podman[47933]: 2025-10-14 08:16:27.257068513 +0000 UTC m=+0.147820598 container init 2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, release=2, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, batch=17.1_20250721.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-07-21T14:56:59, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, distribution-scope=public)
Oct 14 08:16:27 np0005486759.ooo.test podman[47933]: 2025-10-14 08:16:27.266243889 +0000 UTC m=+0.156995974 container start 2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=nova_virtqemud, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=2, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T14:56:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container)
Oct 14 08:16:27 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b29b30662a12a8864f5ea0f40846b2cc --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:27 np0005486759.ooo.test sudo[47952]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:27 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: Started Session c7 of User root.
Oct 14 08:16:27 np0005486759.ooo.test sudo[47952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:27 np0005486759.ooo.test sudo[47952]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: session-c7.scope: Deactivated successfully.
Oct 14 08:16:27 np0005486759.ooo.test podman[48035]: 2025-10-14 08:16:27.715770034 +0000 UTC m=+0.089433314 container create 3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, release=2, version=17.1.9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:56:59, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, distribution-scope=public)
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: Started libpod-conmon-3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3.scope.
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:27 np0005486759.ooo.test podman[48035]: 2025-10-14 08:16:27.674144792 +0000 UTC m=+0.047808092 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:27 np0005486759.ooo.test podman[48035]: 2025-10-14 08:16:27.778392476 +0000 UTC m=+0.152055756 container init 3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.buildah.version=1.33.12, io.openshift.expose-services=, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, version=17.1.9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59)
Oct 14 08:16:27 np0005486759.ooo.test podman[48035]: 2025-10-14 08:16:27.787601554 +0000 UTC m=+0.161264834 container start 3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.33.12, release=2, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, container_name=nova_virtproxyd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0)
Oct 14 08:16:27 np0005486759.ooo.test python3[46951]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b29b30662a12a8864f5ea0f40846b2cc --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Oct 14 08:16:27 np0005486759.ooo.test sudo[48054]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:27 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: Started Session c8 of User root.
Oct 14 08:16:27 np0005486759.ooo.test sudo[48054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:27 np0005486759.ooo.test sudo[48054]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:27 np0005486759.ooo.test systemd[1]: session-c8.scope: Deactivated successfully.
Oct 14 08:16:27 np0005486759.ooo.test sudo[46949]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:28 np0005486759.ooo.test sudo[48115]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewtrirdisjebpgrubppujkbiwrwgpujl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:28 np0005486759.ooo.test sudo[48115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:28 np0005486759.ooo.test python3[48117]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:28 np0005486759.ooo.test sudo[48115]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:28 np0005486759.ooo.test sudo[48131]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqcznltbdduzedmfvtzfdycfwkfvjbml ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:28 np0005486759.ooo.test sudo[48131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:28 np0005486759.ooo.test python3[48133]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:28 np0005486759.ooo.test sudo[48131]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:28 np0005486759.ooo.test sudo[48147]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayshpntoxwukjnmuauftnbkmjdaulxbl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:28 np0005486759.ooo.test sudo[48147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:28 np0005486759.ooo.test python3[48149]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:28 np0005486759.ooo.test sudo[48147]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:28 np0005486759.ooo.test sudo[48163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fveioognqlgpvstmnucodzwasqrnussy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:28 np0005486759.ooo.test sudo[48163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:28 np0005486759.ooo.test python3[48165]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:28 np0005486759.ooo.test sudo[48163]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:29 np0005486759.ooo.test sudo[48179]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpkiewqjpmxxprdgjizjflvllmocbkex ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:29 np0005486759.ooo.test sudo[48179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:29 np0005486759.ooo.test python3[48181]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:29 np0005486759.ooo.test sudo[48179]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:29 np0005486759.ooo.test sudo[48195]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-feoonslyrsxcbmipetmxpbnhxqmipuhg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:29 np0005486759.ooo.test sudo[48195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:29 np0005486759.ooo.test python3[48197]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:29 np0005486759.ooo.test sudo[48195]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:29 np0005486759.ooo.test sudo[48211]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sztxjsmmsbxobxrpjbrxwwricjvkjhtr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:29 np0005486759.ooo.test sudo[48211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:29 np0005486759.ooo.test python3[48213]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:29 np0005486759.ooo.test sudo[48211]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:29 np0005486759.ooo.test sudo[48227]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziwnhecalvilrcpyuwrbhjxdrucoykpt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:29 np0005486759.ooo.test sudo[48227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:29 np0005486759.ooo.test python3[48229]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:30 np0005486759.ooo.test sudo[48227]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:30 np0005486759.ooo.test sudo[48243]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otizcayrdmbfzrysvsetwuolzjfkanea ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:30 np0005486759.ooo.test sudo[48243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:30 np0005486759.ooo.test python3[48245]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:30 np0005486759.ooo.test sudo[48243]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:30 np0005486759.ooo.test sudo[48259]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pojngsztuuzgtszscsqmpjckqdsbzuvs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:30 np0005486759.ooo.test sudo[48259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:30 np0005486759.ooo.test python3[48261]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:30 np0005486759.ooo.test sudo[48259]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:30 np0005486759.ooo.test sudo[48275]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skfnonqumfmnpcslhjqivuplzrosutjp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:30 np0005486759.ooo.test sudo[48275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:30 np0005486759.ooo.test python3[48277]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:30 np0005486759.ooo.test sudo[48275]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:30 np0005486759.ooo.test sudo[48291]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zagovubpcvsjxeorypjjkteogojdlqvz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:30 np0005486759.ooo.test sudo[48291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:30 np0005486759.ooo.test python3[48293]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:30 np0005486759.ooo.test sudo[48291]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:31 np0005486759.ooo.test sudo[48307]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esecoaujijgwrnfnwdjzqchakiqbxgqx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:31 np0005486759.ooo.test sudo[48307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:31 np0005486759.ooo.test python3[48309]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:31 np0005486759.ooo.test sudo[48307]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:31 np0005486759.ooo.test sudo[48323]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxnpkfbxnqngeptzfivcxftbcbhdycsp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:31 np0005486759.ooo.test sudo[48323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:31 np0005486759.ooo.test python3[48325]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:31 np0005486759.ooo.test sudo[48323]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:31 np0005486759.ooo.test sudo[48339]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-faspnozsozdbdfvcfgmcftnyodiuuknb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:31 np0005486759.ooo.test sudo[48339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:31 np0005486759.ooo.test python3[48341]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:31 np0005486759.ooo.test sudo[48339]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:31 np0005486759.ooo.test sudo[48355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwlaajvmmvhcdlpvpmqavblacfkrvuhj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:31 np0005486759.ooo.test sudo[48355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:31 np0005486759.ooo.test python3[48357]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:31 np0005486759.ooo.test sudo[48355]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:32 np0005486759.ooo.test sudo[48371]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggtppdxwutufszrwfymwbntdphnyvjdf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:32 np0005486759.ooo.test sudo[48371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:32 np0005486759.ooo.test python3[48373]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:32 np0005486759.ooo.test sudo[48371]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:32 np0005486759.ooo.test sudo[48387]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtfsytkugvwhulfnacozwnwltmdbxrst ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:32 np0005486759.ooo.test sudo[48387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:32 np0005486759.ooo.test python3[48389]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:16:32 np0005486759.ooo.test sudo[48387]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:32 np0005486759.ooo.test sudo[48448]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyzlahvljxlgcskiblodjbzojdadhgsg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:32 np0005486759.ooo.test sudo[48448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:33 np0005486759.ooo.test python3[48450]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:33 np0005486759.ooo.test sudo[48448]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:33 np0005486759.ooo.test sudo[48477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izmjhowezkfaxllzteralgrtembxxvpx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:33 np0005486759.ooo.test sudo[48477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:33 np0005486759.ooo.test python3[48479]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:33 np0005486759.ooo.test sudo[48477]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:33 np0005486759.ooo.test sudo[48506]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdyjxqosoebwldidqqtmnwpccilfgvpz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:33 np0005486759.ooo.test sudo[48506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:34 np0005486759.ooo.test python3[48508]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:34 np0005486759.ooo.test sudo[48506]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:34 np0005486759.ooo.test sudo[48535]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cshtqlvsnqkcigweveupjwqspfphqkig ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:34 np0005486759.ooo.test sudo[48535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:34 np0005486759.ooo.test python3[48537]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:34 np0005486759.ooo.test sudo[48535]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:34 np0005486759.ooo.test sudo[48564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbnagfaznptrarblzwyvajfcrtrxoytj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:34 np0005486759.ooo.test sudo[48564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:35 np0005486759.ooo.test python3[48566]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:35 np0005486759.ooo.test sudo[48564]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:35 np0005486759.ooo.test sudo[48593]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fseknqarjztykwbsdmzofjggynyascpw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:35 np0005486759.ooo.test sudo[48593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:35 np0005486759.ooo.test python3[48595]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:35 np0005486759.ooo.test sudo[48593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:35 np0005486759.ooo.test sudo[48622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lipsdvqdzczsmtgfljdukckxkkeundbz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:35 np0005486759.ooo.test sudo[48622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:36 np0005486759.ooo.test python3[48624]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:36 np0005486759.ooo.test sudo[48622]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:36 np0005486759.ooo.test sudo[48651]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aonbmuqjinyyrfxecyoudqgeqqikpylb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:36 np0005486759.ooo.test sudo[48651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:36 np0005486759.ooo.test python3[48653]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:36 np0005486759.ooo.test sudo[48651]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:36 np0005486759.ooo.test sudo[48680]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srqvbshtpexzmtfiemtpmjaomyxmqhou ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:36 np0005486759.ooo.test sudo[48680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:37 np0005486759.ooo.test python3[48682]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429792.4743822-109634-100249847082147/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:37 np0005486759.ooo.test sudo[48680]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:37 np0005486759.ooo.test sudo[48696]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkowckiqtjnlwddvayyahzstdqweacwm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:37 np0005486759.ooo.test sudo[48696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:37 np0005486759.ooo.test python3[48698]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 08:16:37 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:37 np0005486759.ooo.test systemd-rc-local-generator[48721]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:37 np0005486759.ooo.test systemd-sysv-generator[48724]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:37 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:37 np0005486759.ooo.test sudo[48696]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:37 np0005486759.ooo.test sudo[48748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uftenzirwcmkieezxcmbgpheiioamvyn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:37 np0005486759.ooo.test sudo[48748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 0...
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Activating special unit Exit the Session...
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Stopped target Main User Target.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Stopped target Basic System.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Stopped target Paths.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Stopped target Sockets.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Stopped target Timers.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Closed D-Bus User Message Bus Socket.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Stopped Create User's Volatile Files and Directories.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Removed slice User Application Slice.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Reached target Shutdown.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Finished Exit the Session.
Oct 14 08:16:38 np0005486759.ooo.test systemd[47252]: Reached target Exit the Session.
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: user@0.service: Deactivated successfully.
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 0.
Oct 14 08:16:38 np0005486759.ooo.test python3[48750]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 08:16:38 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 0.
Oct 14 08:16:39 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:39 np0005486759.ooo.test systemd-sysv-generator[48784]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:39 np0005486759.ooo.test systemd-rc-local-generator[48778]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:39 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:39 np0005486759.ooo.test systemd[1]: Starting collectd container...
Oct 14 08:16:39 np0005486759.ooo.test systemd[1]: Started collectd container.
Oct 14 08:16:39 np0005486759.ooo.test sudo[48748]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:39 np0005486759.ooo.test sudo[48817]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oaiutiryyljgubouzojiphamrufredzv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:39 np0005486759.ooo.test sudo[48817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:40 np0005486759.ooo.test python3[48819]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:40 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:40 np0005486759.ooo.test systemd-sysv-generator[48847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:40 np0005486759.ooo.test systemd-rc-local-generator[48844]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:40 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:40 np0005486759.ooo.test systemd[1]: Starting iscsid container...
Oct 14 08:16:40 np0005486759.ooo.test systemd[1]: Started iscsid container.
Oct 14 08:16:40 np0005486759.ooo.test sudo[48817]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:40 np0005486759.ooo.test sudo[48883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmeublbrvdhjtcnaeyfdopecbdgodrqe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:40 np0005486759.ooo.test sudo[48883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:40 np0005486759.ooo.test python3[48885]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:41 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:41 np0005486759.ooo.test systemd-rc-local-generator[48911]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:41 np0005486759.ooo.test systemd-sysv-generator[48915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:41 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:41 np0005486759.ooo.test systemd[1]: Starting nova_virtlogd_wrapper container...
Oct 14 08:16:41 np0005486759.ooo.test systemd[1]: Started nova_virtlogd_wrapper container.
Oct 14 08:16:41 np0005486759.ooo.test sudo[48883]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:41 np0005486759.ooo.test sudo[48948]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpzsqnfaqpjkethmmmndvcfqzhgqlvkt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:41 np0005486759.ooo.test sudo[48948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:41 np0005486759.ooo.test python3[48950]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:43 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:43 np0005486759.ooo.test systemd-sysv-generator[48982]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:43 np0005486759.ooo.test systemd-rc-local-generator[48979]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:43 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:43 np0005486759.ooo.test systemd[1]: Starting nova_virtnodedevd container...
Oct 14 08:16:43 np0005486759.ooo.test tripleo-start-podman-container[48990]: Creating additional drop-in dependency for "nova_virtnodedevd" (609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42)
Oct 14 08:16:43 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:43 np0005486759.ooo.test systemd-sysv-generator[49050]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:43 np0005486759.ooo.test systemd-rc-local-generator[49047]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:43 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:43 np0005486759.ooo.test systemd[1]: Started nova_virtnodedevd container.
Oct 14 08:16:43 np0005486759.ooo.test sudo[48948]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:44 np0005486759.ooo.test sudo[49072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfgdiewqzgopnkembynhlonmbpsemntl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:44 np0005486759.ooo.test sudo[49072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:44 np0005486759.ooo.test python3[49074]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:44 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:44 np0005486759.ooo.test systemd-rc-local-generator[49098]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:44 np0005486759.ooo.test systemd-sysv-generator[49105]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:44 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:44 np0005486759.ooo.test systemd[1]: Starting nova_virtproxyd container...
Oct 14 08:16:44 np0005486759.ooo.test tripleo-start-podman-container[49113]: Creating additional drop-in dependency for "nova_virtproxyd" (3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3)
Oct 14 08:16:44 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:44 np0005486759.ooo.test systemd-sysv-generator[49177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:44 np0005486759.ooo.test systemd-rc-local-generator[49173]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:44 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:45 np0005486759.ooo.test systemd[1]: Started nova_virtproxyd container.
Oct 14 08:16:45 np0005486759.ooo.test sudo[49072]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:45 np0005486759.ooo.test sudo[49196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdipnieecgqyjhiahvcvanjgyhpfvtfc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:45 np0005486759.ooo.test sudo[49196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:45 np0005486759.ooo.test python3[49198]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:45 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:45 np0005486759.ooo.test systemd-rc-local-generator[49224]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:45 np0005486759.ooo.test systemd-sysv-generator[49227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:45 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:45 np0005486759.ooo.test systemd[1]: Starting nova_virtqemud container...
Oct 14 08:16:45 np0005486759.ooo.test tripleo-start-podman-container[49238]: Creating additional drop-in dependency for "nova_virtqemud" (2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49)
Oct 14 08:16:45 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:46 np0005486759.ooo.test systemd-sysv-generator[49295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:46 np0005486759.ooo.test systemd-rc-local-generator[49292]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:46 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:46 np0005486759.ooo.test systemd[1]: Started nova_virtqemud container.
Oct 14 08:16:46 np0005486759.ooo.test sudo[49196]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:46 np0005486759.ooo.test sudo[49320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yorpqxjravwijfrdpvcuflolcmvvrbvt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:46 np0005486759.ooo.test sudo[49320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:46 np0005486759.ooo.test python3[49322]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:46 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:47 np0005486759.ooo.test systemd-rc-local-generator[49347]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:47 np0005486759.ooo.test systemd-sysv-generator[49350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:47 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:47 np0005486759.ooo.test systemd[1]: Starting nova_virtsecretd container...
Oct 14 08:16:47 np0005486759.ooo.test tripleo-start-podman-container[49362]: Creating additional drop-in dependency for "nova_virtsecretd" (fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461)
Oct 14 08:16:47 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:47 np0005486759.ooo.test systemd-rc-local-generator[49418]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:47 np0005486759.ooo.test systemd-sysv-generator[49421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:47 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:47 np0005486759.ooo.test systemd[1]: Started nova_virtsecretd container.
Oct 14 08:16:47 np0005486759.ooo.test sudo[49320]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:47 np0005486759.ooo.test sudo[49444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbiqogdnxrlwyklpbypfqjhrnlltjnyz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:47 np0005486759.ooo.test sudo[49444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:48 np0005486759.ooo.test python3[49446]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:48 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:48 np0005486759.ooo.test systemd-sysv-generator[49478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:48 np0005486759.ooo.test systemd-rc-local-generator[49472]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:48 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:48 np0005486759.ooo.test systemd[1]: Starting nova_virtstoraged container...
Oct 14 08:16:48 np0005486759.ooo.test tripleo-start-podman-container[49485]: Creating additional drop-in dependency for "nova_virtstoraged" (17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92)
Oct 14 08:16:48 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:48 np0005486759.ooo.test systemd-rc-local-generator[49542]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:48 np0005486759.ooo.test systemd-sysv-generator[49546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:48 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:48 np0005486759.ooo.test systemd[1]: Started nova_virtstoraged container.
Oct 14 08:16:48 np0005486759.ooo.test sudo[49444]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:49 np0005486759.ooo.test sudo[49568]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siivswvsbkcfhinydatezcmsetijpqkl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:16:49 np0005486759.ooo.test sudo[49568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:49 np0005486759.ooo.test python3[49570]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:49 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:49 np0005486759.ooo.test systemd-rc-local-generator[49597]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:49 np0005486759.ooo.test systemd-sysv-generator[49601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:49 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:49 np0005486759.ooo.test systemd[1]: Starting rsyslog container...
Oct 14 08:16:49 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:49 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:49 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:49 np0005486759.ooo.test podman[49611]: 2025-10-14 08:16:49.88058863 +0000 UTC m=+0.106879733 container init 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, 
config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T12:58:40, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, tcib_managed=true, io.buildah.version=1.33.12)
Oct 14 08:16:49 np0005486759.ooo.test podman[49611]: 2025-10-14 08:16:49.887109648 +0000 UTC m=+0.113400741 container start 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, build-date=2025-07-21T12:58:40, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, 
container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=)
Oct 14 08:16:49 np0005486759.ooo.test podman[49611]: rsyslog
Oct 14 08:16:49 np0005486759.ooo.test systemd[1]: Started rsyslog container.
Oct 14 08:16:49 np0005486759.ooo.test sudo[49630]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:49 np0005486759.ooo.test sudo[49630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:49 np0005486759.ooo.test sudo[49568]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:49 np0005486759.ooo.test sudo[49630]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:49 np0005486759.ooo.test systemd[1]: libpod-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope: Deactivated successfully.
Oct 14 08:16:50 np0005486759.ooo.test podman[49647]: 2025-10-14 08:16:50.03558555 +0000 UTC m=+0.044789444 container died 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T12:58:40, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.openshift.expose-services=, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, config_id=tripleo_step3, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog)
Oct 14 08:16:50 np0005486759.ooo.test podman[49647]: 2025-10-14 08:16:50.061238631 +0000 UTC m=+0.070442495 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, 
io.buildah.version=1.33.12, vcs-type=git, container_name=rsyslog, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, build-date=2025-07-21T12:58:40, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9)
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:16:50 np0005486759.ooo.test podman[49659]: 2025-10-14 08:16:50.119968833 +0000 UTC m=+0.038935713 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, architecture=x86_64, version=17.1.9, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, distribution-scope=public, name=rhosp17/openstack-rsyslog, build-date=2025-07-21T12:58:40, container_name=rsyslog, managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.buildah.version=1.33.12)
Oct 14 08:16:50 np0005486759.ooo.test podman[49659]: rsyslog
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Oct 14 08:16:50 np0005486759.ooo.test sudo[49683]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peleoxxjfurnqmarazbvqjzngdkofvfn ; /usr/bin/python3
Oct 14 08:16:50 np0005486759.ooo.test sudo[49683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:50 np0005486759.ooo.test python3[49685]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:50 np0005486759.ooo.test sudo[49683]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Stopped rsyslog container.
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Starting rsyslog container...
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:50 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:50 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:50 np0005486759.ooo.test podman[49687]: 2025-10-14 08:16:50.48374452 +0000 UTC m=+0.111322457 container init 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, tcib_managed=true, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-rsyslog, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 08:16:50 np0005486759.ooo.test podman[49687]: 2025-10-14 08:16:50.492552716 +0000 UTC m=+0.120130653 container start 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, build-date=2025-07-21T12:58:40, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.12, container_name=rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, 
batch=17.1_20250721.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 08:16:50 np0005486759.ooo.test podman[49687]: rsyslog
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Started rsyslog container.
Oct 14 08:16:50 np0005486759.ooo.test sudo[49706]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:50 np0005486759.ooo.test sudo[49706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:50 np0005486759.ooo.test sudo[49706]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: libpod-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope: Deactivated successfully.
Oct 14 08:16:50 np0005486759.ooo.test podman[49709]: 2025-10-14 08:16:50.642674102 +0000 UTC m=+0.054100144 container died 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-rsyslog, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T12:58:40, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, managed_by=tripleo_ansible, summary=Red 
Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1)
Oct 14 08:16:50 np0005486759.ooo.test podman[49709]: 2025-10-14 08:16:50.664549435 +0000 UTC m=+0.075975447 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T12:58:40, container_name=rsyslog, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:16:50 np0005486759.ooo.test podman[49747]: 2025-10-14 08:16:50.740654425 +0000 UTC m=+0.050391509 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=rsyslog, build-date=2025-07-21T12:58:40, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, batch=17.1_20250721.1, release=1, version=17.1.9, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Oct 14 08:16:50 np0005486759.ooo.test podman[49747]: rsyslog
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Oct 14 08:16:50 np0005486759.ooo.test sudo[49779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxfyafbvtkerdkpskkjjxixxrteerdgu ; /usr/bin/python3
Oct 14 08:16:50 np0005486759.ooo.test sudo[49779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: tmp-crun.l3fU6c.mount: Deactivated successfully.
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99-merged.mount: Deactivated successfully.
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e-userdata-shm.mount: Deactivated successfully.
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Stopped rsyslog container.
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Starting rsyslog container...
Oct 14 08:16:50 np0005486759.ooo.test sudo[49779]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:50 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:50 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:50 np0005486759.ooo.test podman[49782]: 2025-10-14 08:16:50.974103045 +0000 UTC m=+0.101526785 container init 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20250721.1, build-date=2025-07-21T12:58:40, vendor=Red Hat, Inc., container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, 
io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-rsyslog, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, architecture=x86_64)
Oct 14 08:16:50 np0005486759.ooo.test podman[49782]: 2025-10-14 08:16:50.980574003 +0000 UTC m=+0.107997733 container start 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.buildah.version=1.33.12, build-date=2025-07-21T12:58:40, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-rsyslog, distribution-scope=public)
Oct 14 08:16:50 np0005486759.ooo.test podman[49782]: rsyslog
Oct 14 08:16:50 np0005486759.ooo.test systemd[1]: Started rsyslog container.
Oct 14 08:16:50 np0005486759.ooo.test sudo[49810]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:51 np0005486759.ooo.test sudo[49810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:51 np0005486759.ooo.test sudo[49810]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: libpod-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope: Deactivated successfully.
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:16:51 np0005486759.ooo.test podman[49829]: 2025-10-14 08:16:51.151161085 +0000 UTC m=+0.049424524 container died 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, 
batch=17.1_20250721.1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1)
Oct 14 08:16:51 np0005486759.ooo.test sudo[49862]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-taqvwuzfqomqfubajrgehuhiuzfpcqya ; /usr/bin/python3
Oct 14 08:16:51 np0005486759.ooo.test sudo[49862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:51 np0005486759.ooo.test podman[49829]: 2025-10-14 08:16:51.172588436 +0000 UTC m=+0.070851845 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, name=rhosp17/openstack-rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, batch=17.1_20250721.1, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.33.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:16:51 np0005486759.ooo.test podman[49869]: 2025-10-14 08:16:51.258097088 +0000 UTC m=+0.060809377 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20250721.1, build-date=2025-07-21T12:58:40, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167)
Oct 14 08:16:51 np0005486759.ooo.test podman[49869]: rsyslog
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Oct 14 08:16:51 np0005486759.ooo.test podman[49830]: 2025-10-14 08:16:51.245029501 +0000 UTC m=+0.135246543 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, tcib_managed=true, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible)
Oct 14 08:16:51 np0005486759.ooo.test sudo[49862]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:51 np0005486759.ooo.test podman[49830]: 2025-10-14 08:16:51.443973214 +0000 UTC m=+0.334190196 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, release=1, config_id=tripleo_step1, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: Stopped rsyslog container.
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: Starting rsyslog container...
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:51 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:51 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:51 np0005486759.ooo.test podman[49916]: 2025-10-14 08:16:51.569148767 +0000 UTC m=+0.097291666 container init 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, architecture=x86_64, container_name=rsyslog, build-date=2025-07-21T12:58:40, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.33.12, vendor=Red Hat, Inc.)
Oct 14 08:16:51 np0005486759.ooo.test sudo[49946]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hymacogtbcfbiyyafjqjkexjxkymlvau ; /usr/bin/python3
Oct 14 08:16:51 np0005486759.ooo.test podman[49916]: 2025-10-14 08:16:51.576646359 +0000 UTC m=+0.104789258 container start 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.33.12, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.component=openstack-rsyslog-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, vcs-type=git, distribution-scope=public)
Oct 14 08:16:51 np0005486759.ooo.test podman[49916]: rsyslog
Oct 14 08:16:51 np0005486759.ooo.test sudo[49946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: Started rsyslog container.
Oct 14 08:16:51 np0005486759.ooo.test sudo[49950]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:51 np0005486759.ooo.test sudo[49950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:51 np0005486759.ooo.test sudo[49950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: libpod-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope: Deactivated successfully.
Oct 14 08:16:51 np0005486759.ooo.test podman[49954]: 2025-10-14 08:16:51.735941491 +0000 UTC m=+0.052239235 container died 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, container_name=rsyslog, build-date=2025-07-21T12:58:40, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Oct 14 08:16:51 np0005486759.ooo.test podman[49954]: 2025-10-14 08:16:51.757353563 +0000 UTC m=+0.073651277 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-rsyslog, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack 
osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, build-date=2025-07-21T12:58:40, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog)
Oct 14 08:16:51 np0005486759.ooo.test python3[49951]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005486759 step=3 update_config_hash_only=False
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:16:51 np0005486759.ooo.test sudo[49946]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99-merged.mount: Deactivated successfully.
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e-userdata-shm.mount: Deactivated successfully.
Oct 14 08:16:51 np0005486759.ooo.test podman[49967]: 2025-10-14 08:16:51.826910854 +0000 UTC m=+0.043247555 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T12:58:40, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, version=17.1.9, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, name=rhosp17/openstack-rsyslog, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, io.buildah.version=1.33.12, 
com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, release=1, architecture=x86_64)
Oct 14 08:16:51 np0005486759.ooo.test podman[49967]: rsyslog
Oct 14 08:16:51 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: Stopped rsyslog container.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: Starting rsyslog container...
Oct 14 08:16:52 np0005486759.ooo.test sudo[49994]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqucevdflaungtaupecoswqkafsilzda ; /usr/bin/python3
Oct 14 08:16:52 np0005486759.ooo.test sudo[49994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:16:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Oct 14 08:16:52 np0005486759.ooo.test podman[49995]: 2025-10-14 08:16:52.201993372 +0000 UTC m=+0.082139736 container init 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, build-date=2025-07-21T12:58:40, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, container_name=rsyslog, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.9, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Oct 14 08:16:52 np0005486759.ooo.test podman[49995]: 2025-10-14 08:16:52.209005862 +0000 UTC m=+0.089152216 container start 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-07-21T12:58:40, tcib_managed=true, container_name=rsyslog, version=17.1.9, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git)
Oct 14 08:16:52 np0005486759.ooo.test podman[49995]: rsyslog
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: Started rsyslog container.
Oct 14 08:16:52 np0005486759.ooo.test sudo[50015]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:16:52 np0005486759.ooo.test sudo[50015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:16:52 np0005486759.ooo.test sudo[50015]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: libpod-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e.scope: Deactivated successfully.
Oct 14 08:16:52 np0005486759.ooo.test python3[50007]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:52 np0005486759.ooo.test sudo[49994]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:52 np0005486759.ooo.test podman[50018]: 2025-10-14 08:16:52.366295772 +0000 UTC m=+0.060615501 container died 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, container_name=rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, version=17.1.9, name=rhosp17/openstack-rsyslog, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-07-21T12:58:40, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:16:52 np0005486759.ooo.test podman[50018]: 2025-10-14 08:16:52.390783773 +0000 UTC m=+0.085103412 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, batch=17.1_20250721.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=rsyslog, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO 
Team, architecture=x86_64, build-date=2025-07-21T12:58:40, com.redhat.component=openstack-rsyslog-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., version=17.1.9)
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:16:52 np0005486759.ooo.test sudo[50052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhmnqajodnkumdppopdozxlmfsdlkmeo ; /usr/bin/python3
Oct 14 08:16:52 np0005486759.ooo.test sudo[50052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:52 np0005486759.ooo.test podman[50032]: 2025-10-14 08:16:52.477243689 +0000 UTC m=+0.054423993 container cleanup 254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, batch=17.1_20250721.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '09cd203329a42b234cd1e76ba6006819'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-rsyslog/images/17.1.9-1, build-date=2025-07-21T12:58:40, vcs-ref=38a223d7b691af709e0a5f628409462e34eea167, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, architecture=x86_64)
Oct 14 08:16:52 np0005486759.ooo.test podman[50032]: rsyslog
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: Stopped rsyslog container.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: Failed to start rsyslog container.
Oct 14 08:16:52 np0005486759.ooo.test python3[50058]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Oct 14 08:16:52 np0005486759.ooo.test sudo[50052]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99-merged.mount: Deactivated successfully.
Oct 14 08:16:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e-userdata-shm.mount: Deactivated successfully.
Oct 14 08:16:54 np0005486759.ooo.test sudo[50104]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtcqmcecrkvczezysrdfupcgmofdwyfo ; /usr/bin/python3
Oct 14 08:16:54 np0005486759.ooo.test sudo[50104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:54 np0005486759.ooo.test python3[50106]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:54 np0005486759.ooo.test sudo[50104]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:54 np0005486759.ooo.test sudo[50149]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drrjhayqhkzuwrntjwfnygyytmzaqqer ; /usr/bin/python3
Oct 14 08:16:54 np0005486759.ooo.test sudo[50149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:54 np0005486759.ooo.test python3[50151]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429814.2714584-110271-274714798190627/source _original_basename=tmp2fs0y_8h follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:54 np0005486759.ooo.test sudo[50149]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:16:55 np0005486759.ooo.test podman[50179]: 2025-10-14 08:16:55.462173477 +0000 UTC m=+0.090370507 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp 
openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=2)
Oct 14 08:16:55 np0005486759.ooo.test sudo[50227]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twnfvzpgtpfwlrjmcinumetlxgcugajv ; /usr/bin/python3
Oct 14 08:16:55 np0005486759.ooo.test podman[50179]: 2025-10-14 08:16:55.49992877 +0000 UTC m=+0.128125760 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:04:03, 
com.redhat.component=openstack-collectd-container, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:16:55 np0005486759.ooo.test sudo[50227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:55 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:16:55 np0005486759.ooo.test python3[50232]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:55 np0005486759.ooo.test sudo[50227]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:55 np0005486759.ooo.test sudo[50273]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpizzfvrsilxbcuuyadheaswrgwizkxc ; /usr/bin/python3
Oct 14 08:16:55 np0005486759.ooo.test sudo[50273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:56 np0005486759.ooo.test python3[50275]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429815.334667-110338-213606392388308/source _original_basename=tmp9kkoa7jd follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:56 np0005486759.ooo.test sudo[50273]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:56 np0005486759.ooo.test sudo[50335]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afinbfnvgoextwhienhscyekpzhvhccw ; /usr/bin/python3
Oct 14 08:16:56 np0005486759.ooo.test sudo[50335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:16:56 np0005486759.ooo.test podman[50338]: 2025-10-14 08:16:56.527495918 +0000 UTC m=+0.112772695 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, version=17.1.9, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:16:56 np0005486759.ooo.test python3[50337]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:56 np0005486759.ooo.test podman[50338]: 2025-10-14 08:16:56.565430694 +0000 UTC m=+0.150707451 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, description=Red 
Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, version=17.1.9, com.redhat.component=openstack-iscsid-container, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible)
Oct 14 08:16:56 np0005486759.ooo.test sudo[50335]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:56 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:16:56 np0005486759.ooo.test sudo[50397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmecipyjopjeprknoxlinjkdpeeemgzd ; /usr/bin/python3
Oct 14 08:16:56 np0005486759.ooo.test sudo[50397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:56 np0005486759.ooo.test python3[50399]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429816.2467666-110407-115472763857848/source _original_basename=tmphfi0xfvr follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:56 np0005486759.ooo.test sudo[50397]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:57 np0005486759.ooo.test sudo[50459]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csoaifroyfdmzwrlqxgpxodviqvpyqqg ; /usr/bin/python3
Oct 14 08:16:57 np0005486759.ooo.test sudo[50459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:57 np0005486759.ooo.test python3[50461]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:16:57 np0005486759.ooo.test sudo[50459]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:57 np0005486759.ooo.test sudo[50502]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbcjfrnfpvvgvnwwogodzwqtghxdibqf ; /usr/bin/python3
Oct 14 08:16:57 np0005486759.ooo.test sudo[50502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:57 np0005486759.ooo.test python3[50504]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429817.1789894-110430-107482601302156/source _original_basename=tmpdto0feen follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:16:57 np0005486759.ooo.test sudo[50502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:58 np0005486759.ooo.test sudo[50532]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbgwzkdjmeonogbkmjqpubsudxmxvggf ; /usr/bin/python3
Oct 14 08:16:58 np0005486759.ooo.test sudo[50532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:58 np0005486759.ooo.test python3[50534]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 14 08:16:58 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:58 np0005486759.ooo.test systemd-sysv-generator[50562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:58 np0005486759.ooo.test systemd-rc-local-generator[50559]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:58 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:58 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:58 np0005486759.ooo.test systemd-rc-local-generator[50598]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:58 np0005486759.ooo.test systemd-sysv-generator[50601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:58 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:59 np0005486759.ooo.test sudo[50532]: pam_unix(sudo:session): session closed for user root
Oct 14 08:16:59 np0005486759.ooo.test sudo[50621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glemvbupbxjzpjxwvhvtxuvnjxgzbswb ; /usr/bin/python3
Oct 14 08:16:59 np0005486759.ooo.test sudo[50621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:16:59 np0005486759.ooo.test python3[50623]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:16:59 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:59 np0005486759.ooo.test systemd-rc-local-generator[50648]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:59 np0005486759.ooo.test systemd-sysv-generator[50654]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:59 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:16:59 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:16:59 np0005486759.ooo.test systemd-rc-local-generator[50688]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:16:59 np0005486759.ooo.test systemd-sysv-generator[50691]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:16:59 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:00 np0005486759.ooo.test systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Oct 14 08:17:00 np0005486759.ooo.test sudo[50621]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:00 np0005486759.ooo.test sudo[50711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwbnousxwznannimmrvamyyzhhwrppna ; /usr/bin/python3
Oct 14 08:17:00 np0005486759.ooo.test sudo[50711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:00 np0005486759.ooo.test python3[50713]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:17:00 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:00 np0005486759.ooo.test systemd-rc-local-generator[50739]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:00 np0005486759.ooo.test systemd-sysv-generator[50742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:00 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:00 np0005486759.ooo.test sudo[50711]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:00 np0005486759.ooo.test sudo[50796]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmyoiyeqmipmqamwbloijbitlohskhdc ; /usr/bin/python3
Oct 14 08:17:00 np0005486759.ooo.test sudo[50796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:01 np0005486759.ooo.test python3[50798]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:01 np0005486759.ooo.test sudo[50796]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:01 np0005486759.ooo.test sudo[50839]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqahktcizfccefdhsipviimbqpgcdvfv ; /usr/bin/python3
Oct 14 08:17:01 np0005486759.ooo.test sudo[50839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:01 np0005486759.ooo.test python3[50841]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429820.8768296-110456-66680850570534/source _original_basename=tmp3tkgmt2x follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:01 np0005486759.ooo.test sudo[50839]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:01 np0005486759.ooo.test sudo[50869]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pteakcvwllbogxfscqdpfxyztsjvyvfs ; /usr/bin/python3
Oct 14 08:17:01 np0005486759.ooo.test sudo[50869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:01 np0005486759.ooo.test python3[50871]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:01 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:02 np0005486759.ooo.test systemd-sysv-generator[50902]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:02 np0005486759.ooo.test systemd-rc-local-generator[50898]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:02 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:02 np0005486759.ooo.test systemd[1]: Reached target tripleo_nova_libvirt.target.
Oct 14 08:17:02 np0005486759.ooo.test sudo[50869]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:02 np0005486759.ooo.test sudo[50924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blpxrxmbkrxpjgnzlglewmrpcryhrcxw ; /usr/bin/python3
Oct 14 08:17:02 np0005486759.ooo.test sudo[50924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:02 np0005486759.ooo.test python3[50926]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:02 np0005486759.ooo.test sudo[50924]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:03 np0005486759.ooo.test sudo[50974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocmeathkcvhlcewbrtempildjieionry ; /usr/bin/python3
Oct 14 08:17:03 np0005486759.ooo.test sudo[50974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:03 np0005486759.ooo.test sudo[50974]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:03 np0005486759.ooo.test sudo[50992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pagpkshvhqcwkjpaqwczldrasitpsdde ; /usr/bin/python3
Oct 14 08:17:03 np0005486759.ooo.test sudo[50992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:03 np0005486759.ooo.test sudo[50992]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:04 np0005486759.ooo.test sudo[51096]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aneqjtqirgmngenzimjgliptognnxwxe ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429823.6825614-110482-43385083106452/async_wrapper.py 526256081272 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429823.6825614-110482-43385083106452/AnsiballZ_command.py _
Oct 14 08:17:04 np0005486759.ooo.test sudo[51096]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:17:04 np0005486759.ooo.test ansible-async_wrapper.py[51098]: Invoked with 526256081272 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429823.6825614-110482-43385083106452/AnsiballZ_command.py _
Oct 14 08:17:04 np0005486759.ooo.test ansible-async_wrapper.py[51101]: Starting module and watcher
Oct 14 08:17:04 np0005486759.ooo.test ansible-async_wrapper.py[51101]: Start watching 51102 (3600)
Oct 14 08:17:04 np0005486759.ooo.test ansible-async_wrapper.py[51102]: Start module (51102)
Oct 14 08:17:04 np0005486759.ooo.test ansible-async_wrapper.py[51098]: Return async_wrapper task started.
Oct 14 08:17:04 np0005486759.ooo.test sudo[51096]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:04 np0005486759.ooo.test sudo[51117]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aekkevcqckafrtzhpsjnfcuttdxhfvgp ; /usr/bin/python3
Oct 14 08:17:04 np0005486759.ooo.test sudo[51117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:04 np0005486759.ooo.test python3[51119]: ansible-ansible.legacy.async_status Invoked with jid=526256081272.51098 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:17:04 np0005486759.ooo.test sudo[51117]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (file & line not available)
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (file & line not available)
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Oct 14 08:17:08 np0005486759.ooo.test puppet-user[51122]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.23 seconds
Oct 14 08:17:09 np0005486759.ooo.test ansible-async_wrapper.py[51101]: 51102 still running (3600)
Oct 14 08:17:14 np0005486759.ooo.test ansible-async_wrapper.py[51101]: 51102 still running (3595)
Oct 14 08:17:14 np0005486759.ooo.test sudo[51321]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nddpntsqkvpfhsmdkdmakdletvjhjzkd ; /usr/bin/python3
Oct 14 08:17:14 np0005486759.ooo.test sudo[51321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:14 np0005486759.ooo.test python3[51323]: ansible-ansible.legacy.async_status Invoked with jid=526256081272.51098 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:17:14 np0005486759.ooo.test sudo[51321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:17:15 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 08:17:15 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:15 np0005486759.ooo.test systemd-sysv-generator[51393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:15 np0005486759.ooo.test systemd-rc-local-generator[51387]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:15 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:16 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 08:17:16 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 08:17:16 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 08:17:16 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Consumed 1.238s CPU time.
Oct 14 08:17:16 np0005486759.ooo.test systemd[1]: run-rfc0c936593d645eba9e78a4e8b95f87d.service: Deactivated successfully.
Oct 14 08:17:17 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Oct 14 08:17:17 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}a58400a7ca6d2f5924cd86801436f477f35d48294c3d5279e5ba54e7f958cb81'
Oct 14 08:17:17 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Oct 14 08:17:17 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Oct 14 08:17:17 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Oct 14 08:17:17 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Oct 14 08:17:19 np0005486759.ooo.test ansible-async_wrapper.py[51101]: 51102 still running (3590)
Oct 14 08:17:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:17:22 np0005486759.ooo.test systemd[1]: tmp-crun.DZbMcP.mount: Deactivated successfully.
Oct 14 08:17:22 np0005486759.ooo.test podman[52423]: 2025-10-14 08:17:22.481962852 +0000 UTC m=+0.101896105 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, architecture=x86_64, release=1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:17:22 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Oct 14 08:17:22 np0005486759.ooo.test podman[52423]: 2025-10-14 08:17:22.680524844 +0000 UTC m=+0.300458087 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:17:22 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:17:22 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:22 np0005486759.ooo.test systemd-rc-local-generator[52476]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:22 np0005486759.ooo.test systemd-sysv-generator[52480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:22 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:23 np0005486759.ooo.test systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Oct 14 08:17:23 np0005486759.ooo.test snmpd[52493]: Can't find directory of RPM packages
Oct 14 08:17:23 np0005486759.ooo.test snmpd[52493]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Oct 14 08:17:23 np0005486759.ooo.test systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Oct 14 08:17:23 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:23 np0005486759.ooo.test systemd-rc-local-generator[52522]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:23 np0005486759.ooo.test systemd-sysv-generator[52525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:23 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:23 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:23 np0005486759.ooo.test systemd-sysv-generator[52555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:23 np0005486759.ooo.test systemd-rc-local-generator[52547]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:23 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Notice: Applied catalog in 15.23 seconds
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Application:
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:    Initial environment: production
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:    Converged environment: production
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:          Run mode: user
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Changes:
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:             Total: 8
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Events:
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:           Success: 8
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:             Total: 8
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Resources:
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:         Restarted: 1
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:           Changed: 8
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:       Out of sync: 8
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:             Total: 19
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Time:
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:        Filebucket: 0.00
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:          Schedule: 0.00
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:            Augeas: 0.01
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:              File: 0.11
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:    Config retrieval: 0.29
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:           Service: 1.19
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:    Transaction evaluation: 15.19
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:    Catalog application: 15.23
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:          Last run: 1760429843
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:              Exec: 5.06
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:           Package: 8.65
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:             Total: 15.24
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]: Version:
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:            Config: 1760429828
Oct 14 08:17:23 np0005486759.ooo.test puppet-user[51122]:            Puppet: 7.10.0
Oct 14 08:17:23 np0005486759.ooo.test ansible-async_wrapper.py[51102]: Module complete (51102)
Oct 14 08:17:24 np0005486759.ooo.test ansible-async_wrapper.py[51101]: Done in kid B.
Oct 14 08:17:24 np0005486759.ooo.test sudo[52580]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqcsqcglfrgawlrpxtzwfizzfitffatf ; /usr/bin/python3
Oct 14 08:17:24 np0005486759.ooo.test sudo[52580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:25 np0005486759.ooo.test python3[52582]: ansible-ansible.legacy.async_status Invoked with jid=526256081272.51098 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:17:25 np0005486759.ooo.test sudo[52580]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:25 np0005486759.ooo.test sudo[52596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgjrxktewbxvyhfrplmdmxznhvsfdhkl ; /usr/bin/python3
Oct 14 08:17:25 np0005486759.ooo.test sudo[52596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:17:25 np0005486759.ooo.test podman[52598]: 2025-10-14 08:17:25.691175094 +0000 UTC m=+0.085500522 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:17:25 np0005486759.ooo.test podman[52598]: 2025-10-14 08:17:25.706404567 +0000 UTC m=+0.100729985 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, batch=17.1_20250721.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, container_name=collectd, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:17:25 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:17:25 np0005486759.ooo.test python3[52599]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:17:25 np0005486759.ooo.test sudo[52596]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:25 np0005486759.ooo.test sudo[52632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvescflpfawyhyftbwdnrxnmbjdgbkwc ; /usr/bin/python3
Oct 14 08:17:25 np0005486759.ooo.test sudo[52632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:26 np0005486759.ooo.test python3[52634]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:26 np0005486759.ooo.test sudo[52632]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:26 np0005486759.ooo.test sudo[52682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyrwjviqwkexpypoyflffdxcmbrwlxnc ; /usr/bin/python3
Oct 14 08:17:26 np0005486759.ooo.test sudo[52682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:26 np0005486759.ooo.test python3[52684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:26 np0005486759.ooo.test sudo[52682]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:26 np0005486759.ooo.test sudo[52700]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrmikaiazngmlvszygttcrvnckllwxxv ; /usr/bin/python3
Oct 14 08:17:26 np0005486759.ooo.test sudo[52700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:17:26 np0005486759.ooo.test podman[52703]: 2025-10-14 08:17:26.8870423 +0000 UTC m=+0.063135416 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, container_name=iscsid, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-iscsid-container)
Oct 14 08:17:26 np0005486759.ooo.test podman[52703]: 2025-10-14 08:17:26.897296316 +0000 UTC m=+0.073389392 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=1, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.expose-services=)
Oct 14 08:17:26 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:17:26 np0005486759.ooo.test python3[52702]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp0gs63r27 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:17:26 np0005486759.ooo.test sudo[52700]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:27 np0005486759.ooo.test sudo[52749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mupxecudsfulrnbfccwhbkvdmdfmsjxq ; /usr/bin/python3
Oct 14 08:17:27 np0005486759.ooo.test sudo[52749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:27 np0005486759.ooo.test python3[52751]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:27 np0005486759.ooo.test sudo[52749]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:27 np0005486759.ooo.test sudo[52765]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eawwzxdwjcbkcbscqdayoglepmsxeiqn ; /usr/bin/python3
Oct 14 08:17:27 np0005486759.ooo.test sudo[52765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:28 np0005486759.ooo.test sudo[52765]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:28 np0005486759.ooo.test sudo[52852]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifbkekitprtmizcvkmwkqomynlhrjido ; /usr/bin/python3
Oct 14 08:17:28 np0005486759.ooo.test sudo[52852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:28 np0005486759.ooo.test python3[52854]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 08:17:28 np0005486759.ooo.test sudo[52852]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:28 np0005486759.ooo.test sudo[52871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abdgeofiydkzzjlgbtimumqdkyukmtug ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:28 np0005486759.ooo.test sudo[52871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:29 np0005486759.ooo.test python3[52873]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:29 np0005486759.ooo.test sudo[52871]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:29 np0005486759.ooo.test sudo[52887]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pokxvlvnisptvbxowteaebymncvkckth ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:29 np0005486759.ooo.test sudo[52887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:29 np0005486759.ooo.test sudo[52887]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:29 np0005486759.ooo.test sudo[52903]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkbzmdkzgnqrxtoomiwumyebvqcwfnuq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:29 np0005486759.ooo.test sudo[52903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:30 np0005486759.ooo.test python3[52905]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:30 np0005486759.ooo.test sudo[52903]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:30 np0005486759.ooo.test sudo[52953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iouwusvpkbuwrweivhfgdifvidyxvliu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:30 np0005486759.ooo.test sudo[52953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:30 np0005486759.ooo.test python3[52955]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:30 np0005486759.ooo.test sudo[52953]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:30 np0005486759.ooo.test sudo[52971]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pacoeitvlyvdmfpqhnejbrrtyrihklij ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:30 np0005486759.ooo.test sudo[52971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:30 np0005486759.ooo.test python3[52973]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:30 np0005486759.ooo.test sudo[52971]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:31 np0005486759.ooo.test sudo[53033]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfusyjokjggzfxjlrzxcdiccsqlwtdaq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:31 np0005486759.ooo.test sudo[53033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:31 np0005486759.ooo.test python3[53035]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:31 np0005486759.ooo.test sudo[53033]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:31 np0005486759.ooo.test sudo[53051]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrqfuclbnyklpkenfrhusktbygpbyxfn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:31 np0005486759.ooo.test sudo[53051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:31 np0005486759.ooo.test python3[53053]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:31 np0005486759.ooo.test sudo[53051]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:31 np0005486759.ooo.test sudo[53113]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apasvklwhkyalttchzhmtbliwphzqaba ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:31 np0005486759.ooo.test sudo[53113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:32 np0005486759.ooo.test python3[53115]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:32 np0005486759.ooo.test sudo[53113]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:32 np0005486759.ooo.test sudo[53131]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydawvbjyljbvzmaamgmsepwbntedowsi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:32 np0005486759.ooo.test sudo[53131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:32 np0005486759.ooo.test python3[53133]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:32 np0005486759.ooo.test sudo[53131]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:32 np0005486759.ooo.test sudo[53193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgrecnkrzslznjodqwzoazjjbkozcrnz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:32 np0005486759.ooo.test sudo[53193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:32 np0005486759.ooo.test python3[53195]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:32 np0005486759.ooo.test sudo[53193]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:33 np0005486759.ooo.test sudo[53211]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpqzholttucbsxncasanjtkgydporhyh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:33 np0005486759.ooo.test sudo[53211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:33 np0005486759.ooo.test python3[53213]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:33 np0005486759.ooo.test sudo[53211]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:33 np0005486759.ooo.test sudo[53241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnetaatrkkoeuvfoepoieztqlhachuoa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:33 np0005486759.ooo.test sudo[53241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:33 np0005486759.ooo.test python3[53243]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:33 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:33 np0005486759.ooo.test systemd-rc-local-generator[53260]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:33 np0005486759.ooo.test systemd-sysv-generator[53266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:33 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:34 np0005486759.ooo.test sudo[53241]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:34 np0005486759.ooo.test sudo[53326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfhptavwcpvwxivwyjmvnvzrywhpzzqc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:34 np0005486759.ooo.test sudo[53326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:34 np0005486759.ooo.test python3[53328]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:34 np0005486759.ooo.test sudo[53326]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:34 np0005486759.ooo.test sudo[53344]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pucsekndudkbwdnxexyvzuokiaoxiuof ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:34 np0005486759.ooo.test sudo[53344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:34 np0005486759.ooo.test python3[53346]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:34 np0005486759.ooo.test sudo[53344]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:35 np0005486759.ooo.test sudo[53406]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmdxgazdzcoltflrbonplfugdddbttrz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:35 np0005486759.ooo.test sudo[53406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:35 np0005486759.ooo.test python3[53408]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:17:35 np0005486759.ooo.test sudo[53406]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:35 np0005486759.ooo.test sudo[53424]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhqbkvmcjliqfzgpkgwevyvymueldifr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:35 np0005486759.ooo.test sudo[53424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:35 np0005486759.ooo.test python3[53426]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:35 np0005486759.ooo.test sudo[53424]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:35 np0005486759.ooo.test sudo[53454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqffhemyzwuhexccrjxmxupqsuikcquk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:35 np0005486759.ooo.test sudo[53454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:35 np0005486759.ooo.test python3[53456]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:36 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:36 np0005486759.ooo.test systemd-rc-local-generator[53480]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:36 np0005486759.ooo.test systemd-sysv-generator[53483]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:36 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:36 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 08:17:36 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 08:17:36 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 08:17:36 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 08:17:36 np0005486759.ooo.test sudo[53454]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:36 np0005486759.ooo.test sudo[53512]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjvsdfbudypvffhmetdgngwqshkpzylo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:36 np0005486759.ooo.test sudo[53512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:36 np0005486759.ooo.test python3[53514]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 08:17:36 np0005486759.ooo.test sudo[53512]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:37 np0005486759.ooo.test sudo[53528]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxffbnjndsiodvjbqwzrcysuthogjoxg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:37 np0005486759.ooo.test sudo[53528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:37 np0005486759.ooo.test sudo[53528]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:38 np0005486759.ooo.test sudo[53569]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqibwvqajmbbvcnftbjazvezosmtulkx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:38 np0005486759.ooo.test sudo[53569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:38 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 08:17:38 np0005486759.ooo.test podman[53741]: 2025-10-14 08:17:38.861146852 +0000 UTC m=+0.056861043 container create 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, release=1, 
distribution-scope=public, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33)
Oct 14 08:17:38 np0005486759.ooo.test podman[53764]: 2025-10-14 08:17:38.890036803 +0000 UTC m=+0.063093126 container create 582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, 
container_name=configure_cms_options, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1)
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started libpod-conmon-48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.scope.
Oct 14 08:17:38 np0005486759.ooo.test podman[53734]: 2025-10-14 08:17:38.901285929 +0000 UTC m=+0.099266240 container create 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, 
config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, release=1, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52)
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started libpod-conmon-582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607.scope.
Oct 14 08:17:38 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f024985d8551ca70e6842fc9e2330bedf170ac29e8faf8031d1077d2a8fc6f8/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started libpod-conmon-0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.scope.
Oct 14 08:17:38 np0005486759.ooo.test podman[53741]: 2025-10-14 08:17:38.831372465 +0000 UTC m=+0.027086686 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:38 np0005486759.ooo.test podman[53734]: 2025-10-14 08:17:38.842351703 +0000 UTC m=+0.040332034 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:38 np0005486759.ooo.test podman[53764]: 2025-10-14 08:17:38.947338648 +0000 UTC m=+0.120394971 container init 582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=configure_cms_options, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:17:38 np0005486759.ooo.test podman[53741]: 2025-10-14 08:17:38.953362453 +0000 UTC m=+0.149076644 container init 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, architecture=x86_64, container_name=ceilometer_agent_compute)
Oct 14 08:17:38 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ada07e432d0e43eebd550951648a1927a38ab08f9b982361ae15057deb14876d/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:38 np0005486759.ooo.test podman[53777]: 2025-10-14 08:17:38.954000754 +0000 UTC m=+0.112745466 container create 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, vcs-type=git, container_name=nova_migration_target)
Oct 14 08:17:38 np0005486759.ooo.test podman[53764]: 2025-10-14 08:17:38.85619605 +0000 UTC m=+0.029252393 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Oct 14 08:17:38 np0005486759.ooo.test podman[53764]: 2025-10-14 08:17:38.959863934 +0000 UTC m=+0.132920267 container start 582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, release=1, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=configure_cms_options, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:17:38 np0005486759.ooo.test podman[53764]: 2025-10-14 08:17:38.960091021 +0000 UTC m=+0.133147364 container attach 582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, distribution-scope=public, container_name=configure_cms_options, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.9, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12)
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:17:38 np0005486759.ooo.test sudo[53827]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:17:38 np0005486759.ooo.test sudo[53827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:17:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:17:38 np0005486759.ooo.test podman[53734]: 2025-10-14 08:17:38.978015863 +0000 UTC m=+0.175996224 container init 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=1, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, 
distribution-scope=public, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:17:38 np0005486759.ooo.test podman[53759]: 2025-10-14 08:17:38.985133673 +0000 UTC m=+0.157599868 container create 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc.)
Oct 14 08:17:38 np0005486759.ooo.test sudo[53840]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:17:38 np0005486759.ooo.test sudo[53840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:17:39 np0005486759.ooo.test podman[53777]: 2025-10-14 08:17:38.901221187 +0000 UTC m=+0.059965919 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started libpod-conmon-44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.scope.
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started libpod-conmon-7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.scope.
Oct 14 08:17:39 np0005486759.ooo.test podman[53734]: 2025-10-14 08:17:39.023124243 +0000 UTC m=+0.221104554 container start 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-cron, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, 
container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1)
Oct 14 08:17:39 np0005486759.ooo.test podman[53759]: 2025-10-14 08:17:38.926591859 +0000 UTC m=+0.099058024 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Oct 14 08:17:39 np0005486759.ooo.test sudo[53827]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:39 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Oct 14 08:17:39 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3e85029358dd8d56349cc32f50b1a52c41f0f3e6ed9b12e81f053986ea14f71/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:39 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/137c1921cf29a18f5788ee7ca89cb32d77b40e4f2bc3359cd1d75d04c15761c5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:39 np0005486759.ooo.test sudo[53840]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:39 np0005486759.ooo.test crond[53838]: (CRON) STARTUP (1.5.7)
Oct 14 08:17:39 np0005486759.ooo.test crond[53838]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 94% if used.)
Oct 14 08:17:39 np0005486759.ooo.test crond[53838]: (CRON) INFO (running with inotify support)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:17:39 np0005486759.ooo.test podman[53777]: 2025-10-14 08:17:39.058977518 +0000 UTC m=+0.217722250 container init 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, version=17.1.9, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4)
Oct 14 08:17:39 np0005486759.ooo.test sudo[53879]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:17:39 np0005486759.ooo.test sudo[53879]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 08:17:39 np0005486759.ooo.test podman[53741]: 2025-10-14 08:17:39.075038863 +0000 UTC m=+0.270753044 container start 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 08:17:39 np0005486759.ooo.test sudo[53879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:17:39 np0005486759.ooo.test ovs-vsctl[53881]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:ovn-cms-options=enable-chassis-as-gw
Oct 14 08:17:39 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9509e102f1abab83a0acc6d291975c60 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Oct 14 08:17:39 np0005486759.ooo.test sudo[53879]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: libpod-582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607.scope: Deactivated successfully.
Oct 14 08:17:39 np0005486759.ooo.test sshd[53907]: Server listening on 0.0.0.0 port 2022.
Oct 14 08:17:39 np0005486759.ooo.test sshd[53907]: Server listening on :: port 2022.
Oct 14 08:17:39 np0005486759.ooo.test podman[53829]: 2025-10-14 08:17:39.150993994 +0000 UTC m=+0.177005855 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-type=git, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, distribution-scope=public, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:17:39 np0005486759.ooo.test podman[53759]: 2025-10-14 08:17:39.164519751 +0000 UTC m=+0.336985916 container init 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:17:39 np0005486759.ooo.test sudo[53931]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:17:39 np0005486759.ooo.test sudo[53931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:17:39 np0005486759.ooo.test podman[53764]: 2025-10-14 08:17:39.190910904 +0000 UTC m=+0.363967227 container died 582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, container_name=configure_cms_options, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, release=1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:17:39 np0005486759.ooo.test podman[53847]: 2025-10-14 08:17:39.096095362 +0000 UTC m=+0.079241513 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-cron-container, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:17:39 np0005486759.ooo.test sudo[53931]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:39 np0005486759.ooo.test podman[53914]: 2025-10-14 08:17:39.23458832 +0000 UTC m=+0.086226038 container cleanup 582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, config_id=tripleo_step4, release=1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=configure_cms_options, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, 
com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: libpod-conmon-582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607.scope: Deactivated successfully.
Oct 14 08:17:39 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1760428936 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Oct 14 08:17:39 np0005486759.ooo.test podman[53759]: 2025-10-14 08:17:39.247111706 +0000 UTC m=+0.419577881 container start 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, 
distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, vcs-type=git, version=17.1.9, architecture=x86_64)
Oct 14 08:17:39 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9509e102f1abab83a0acc6d291975c60 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Oct 14 08:17:39 np0005486759.ooo.test podman[53829]: 2025-10-14 08:17:39.29171644 +0000 UTC m=+0.317728301 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, io.openshift.expose-services=, release=1, io.buildah.version=1.33.12, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git)
Oct 14 08:17:39 np0005486759.ooo.test podman[53829]: unhealthy
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:17:39 np0005486759.ooo.test podman[53932]: 2025-10-14 08:17:39.308649412 +0000 UTC m=+0.110822546 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, vcs-type=git, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true)
Oct 14 08:17:39 np0005486759.ooo.test podman[53847]: 2025-10-14 08:17:39.382049184 +0000 UTC m=+0.365195355 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
name=rhosp17/openstack-cron, version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 08:17:39 np0005486759.ooo.test podman[53777]: 2025-10-14 08:17:39.386239633 +0000 UTC m=+0.544984345 container start 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, container_name=nova_migration_target, distribution-scope=public, release=1, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:17:39 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b29b30662a12a8864f5ea0f40846b2cc --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 08:17:39 np0005486759.ooo.test podman[53880]: 2025-10-14 08:17:39.404992311 +0000 UTC m=+0.317076623 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, container_name=nova_migration_target, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:17:39 np0005486759.ooo.test podman[53932]: 2025-10-14 08:17:39.440594598 +0000 UTC m=+0.242767752 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:17:39 np0005486759.ooo.test podman[53932]: unhealthy
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:17:39 np0005486759.ooo.test podman[54062]: 2025-10-14 08:17:39.493970213 +0000 UTC m=+0.066909753 container create 15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, release=1, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., vcs-type=git, container_name=setup_ovs_manager, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started libpod-conmon-15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723.scope.
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:39 np0005486759.ooo.test podman[54062]: 2025-10-14 08:17:39.559370179 +0000 UTC m=+0.132309719 container init 15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, container_name=setup_ovs_manager, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:17:39 np0005486759.ooo.test podman[54062]: 2025-10-14 08:17:39.463032689 +0000 UTC m=+0.035972329 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Oct 14 08:17:39 np0005486759.ooo.test podman[54062]: 2025-10-14 08:17:39.566864989 +0000 UTC m=+0.139804529 container start 15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-type=git, version=17.1.9, container_name=setup_ovs_manager, distribution-scope=public, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1)
Oct 14 08:17:39 np0005486759.ooo.test podman[54062]: 2025-10-14 08:17:39.56721672 +0000 UTC m=+0.140156280 container attach 15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, release=1, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:17:39 np0005486759.ooo.test podman[53880]: 2025-10-14 08:17:39.767456941 +0000 UTC m=+0.679541223 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., version=17.1.9, container_name=nova_migration_target, 
io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-7e3b7dbefcbb84080782963e37dee9e7b27d279c0d8fc921be0c707bdde182ef-merged.mount: Deactivated successfully.
Oct 14 08:17:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-582184d85381f2dddd4cd63ea17ef22c4ad72744794b36e989189c33451e3607-userdata-shm.mount: Deactivated successfully.
Oct 14 08:17:40 np0005486759.ooo.test sudo[54119]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsjuie9ow/privsep.sock
Oct 14 08:17:40 np0005486759.ooo.test sudo[54119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:40 np0005486759.ooo.test kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 14 08:17:40 np0005486759.ooo.test sudo[54119]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:41 np0005486759.ooo.test sudo[54151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2oc3945d/privsep.sock
Oct 14 08:17:41 np0005486759.ooo.test sudo[54151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:41 np0005486759.ooo.test sudo[54151]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:42 np0005486759.ooo.test sudo[54242]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj77hby01/privsep.sock
Oct 14 08:17:42 np0005486759.ooo.test sudo[54242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:42 np0005486759.ooo.test ovs-vsctl[54252]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 14 08:17:42 np0005486759.ooo.test sudo[54242]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:42 np0005486759.ooo.test systemd[1]: libpod-15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723.scope: Deactivated successfully.
Oct 14 08:17:42 np0005486759.ooo.test systemd[1]: libpod-15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723.scope: Consumed 3.099s CPU time.
Oct 14 08:17:42 np0005486759.ooo.test podman[54062]: 2025-10-14 08:17:42.755594013 +0000 UTC m=+3.328533593 container died 15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:17:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723-userdata-shm.mount: Deactivated successfully.
Oct 14 08:17:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b38c1a94310de94fdaff9837ece5582692a37d83810e45311fea0b8d6975fc83-merged.mount: Deactivated successfully.
Oct 14 08:17:42 np0005486759.ooo.test podman[54256]: 2025-10-14 08:17:42.83793407 +0000 UTC m=+0.071659600 container cleanup 15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_id=tripleo_step4, vcs-type=git, batch=17.1_20250721.1, container_name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, 
io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1, tcib_managed=true, architecture=x86_64)
Oct 14 08:17:42 np0005486759.ooo.test systemd[1]: libpod-conmon-15bd2829b95f8b14c42525aab97df89189cf652650c3890dd99c4f427a336723.scope: Deactivated successfully.
Oct 14 08:17:42 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1760428936 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1760428936'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Oct 14 08:17:42 np0005486759.ooo.test sudo[54295]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk0rbf1jm/privsep.sock
Oct 14 08:17:42 np0005486759.ooo.test sudo[54295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:43 np0005486759.ooo.test podman[54368]: 2025-10-14 08:17:43.310555764 +0000 UTC m=+0.086874688 container create 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, tcib_managed=true, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible)
Oct 14 08:17:43 np0005486759.ooo.test podman[54369]: 2025-10-14 08:17:43.321219852 +0000 UTC m=+0.092882133 container create c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, release=1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started libpod-conmon-46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.scope.
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started libpod-conmon-c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.scope.
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:43 np0005486759.ooo.test podman[54368]: 2025-10-14 08:17:43.262661619 +0000 UTC m=+0.038980553 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:17:43 np0005486759.ooo.test podman[54369]: 2025-10-14 08:17:43.266332931 +0000 UTC m=+0.037995192 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Oct 14 08:17:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90b1721303d95e8939b4592e60f18afa0e38ea17042ea3b2a8546358a562b24/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f3843305e98ca43fa4870afdb54b5c368e3c8047a8e821c108bd75c91b0d14/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f3843305e98ca43fa4870afdb54b5c368e3c8047a8e821c108bd75c91b0d14/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90b1721303d95e8939b4592e60f18afa0e38ea17042ea3b2a8546358a562b24/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f3843305e98ca43fa4870afdb54b5c368e3c8047a8e821c108bd75c91b0d14/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90b1721303d95e8939b4592e60f18afa0e38ea17042ea3b2a8546358a562b24/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:17:43 np0005486759.ooo.test podman[54368]: 2025-10-14 08:17:43.39641885 +0000 UTC m=+0.172737794 container init 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, tcib_managed=true, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Oct 14 08:17:43 np0005486759.ooo.test sudo[54405]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:17:43 np0005486759.ooo.test sudo[54405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:17:43 np0005486759.ooo.test podman[54368]: 2025-10-14 08:17:43.42760372 +0000 UTC m=+0.203922644 container start 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, release=1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:17:43 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=3fc36489e0095da197228558d2f007a2 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Oct 14 08:17:43 np0005486759.ooo.test sudo[54405]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:17:43 np0005486759.ooo.test podman[54369]: 2025-10-14 08:17:43.507033289 +0000 UTC m=+0.278695580 container init c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, release=1)
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:17:43 np0005486759.ooo.test podman[54369]: 2025-10-14 08:17:43.539108777 +0000 UTC m=+0.310771028 container start c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:17:43 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:17:43 np0005486759.ooo.test python3[53571]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 0.
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 14 08:17:43 np0005486759.ooo.test sudo[54295]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 0...
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:17:43 np0005486759.ooo.test podman[54406]: 2025-10-14 08:17:43.629562135 +0000 UTC m=+0.187676555 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, build-date=2025-07-21T16:28:53, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9)
Oct 14 08:17:43 np0005486759.ooo.test podman[54406]: 2025-10-14 08:17:43.713385057 +0000 UTC m=+0.271499487 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:17:43 np0005486759.ooo.test podman[54406]: unhealthy
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 08:17:43 np0005486759.ooo.test podman[54433]: 2025-10-14 08:17:43.632278089 +0000 UTC m=+0.090584594 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, batch=17.1_20250721.1, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.9, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, io.openshift.expose-services=, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team)
Oct 14 08:17:43 np0005486759.ooo.test podman[54433]: 2025-10-14 08:17:43.763166251 +0000 UTC m=+0.221472766 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, vcs-type=git, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:17:43 np0005486759.ooo.test podman[54433]: unhealthy
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Queued start job for default target Main User Target.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Created slice User Application Slice.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Reached target Paths.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Reached target Timers.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Starting D-Bus User Message Bus Socket...
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Starting Create User's Volatile Files and Directories...
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Finished Create User's Volatile Files and Directories.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Listening on D-Bus User Message Bus Socket.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Reached target Sockets.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Reached target Basic System.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Reached target Main User Target.
Oct 14 08:17:43 np0005486759.ooo.test systemd[54465]: Startup finished in 146ms.
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started User Manager for UID 0.
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: Started Session c9 of User root.
Oct 14 08:17:43 np0005486759.ooo.test sudo[54518]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8qc6bd0z/privsep.sock
Oct 14 08:17:43 np0005486759.ooo.test sudo[54518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:43 np0005486759.ooo.test sudo[53569]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:43 np0005486759.ooo.test systemd[1]: session-c9.scope: Deactivated successfully.
Oct 14 08:17:43 np0005486759.ooo.test kernel: device br-int entered promiscuous mode
Oct 14 08:17:43 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760429863.9114] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Oct 14 08:17:43 np0005486759.ooo.test systemd-udevd[54529]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 08:17:43 np0005486759.ooo.test kernel: device genev_sys_6081 entered promiscuous mode
Oct 14 08:17:43 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760429863.9430] device (genev_sys_6081): carrier: link connected
Oct 14 08:17:43 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760429863.9434] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Oct 14 08:17:44 np0005486759.ooo.test sudo[54549]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydnxgghzqzftghllpwoxfdmalwetidwc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:44 np0005486759.ooo.test sudo[54549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:44 np0005486759.ooo.test python3[54551]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:44 np0005486759.ooo.test sudo[54549]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:44 np0005486759.ooo.test sudo[54566]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulsupqjlkewvnmcexeiybcdbzcriiyyt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:44 np0005486759.ooo.test sudo[54566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:44 np0005486759.ooo.test sudo[54518]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:44 np0005486759.ooo.test python3[54568]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:44 np0005486759.ooo.test sudo[54566]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:44 np0005486759.ooo.test sudo[54585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kckltikgrdyjfowbkmxbrrfoiimjwhkk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:44 np0005486759.ooo.test sudo[54585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:44 np0005486759.ooo.test sudo[54592]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_s5akbsn/privsep.sock
Oct 14 08:17:44 np0005486759.ooo.test sudo[54592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:44 np0005486759.ooo.test python3[54588]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:44 np0005486759.ooo.test sudo[54585]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:44 np0005486759.ooo.test sudo[54607]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnbqotosagsblzltaaenrhnhsmfmlrlp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:44 np0005486759.ooo.test sudo[54607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:45 np0005486759.ooo.test python3[54610]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:45 np0005486759.ooo.test sudo[54607]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:45 np0005486759.ooo.test sudo[54624]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlwfizvzddgxdevswlvydoooqcruvshs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:45 np0005486759.ooo.test sudo[54624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:45 np0005486759.ooo.test sudo[54628]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmpomlcooqi/privsep.sock
Oct 14 08:17:45 np0005486759.ooo.test sudo[54628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Oct 14 08:17:45 np0005486759.ooo.test python3[54626]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:45 np0005486759.ooo.test sudo[54624]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:45 np0005486759.ooo.test sudo[54592]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:45 np0005486759.ooo.test sudo[54646]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sohcstttofjygwmgcnxmkgcnfnmmutke ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:45 np0005486759.ooo.test sudo[54646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:45 np0005486759.ooo.test python3[54650]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:45 np0005486759.ooo.test sudo[54646]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:45 np0005486759.ooo.test sudo[54669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkaxmjattppldsmahrwubdkpergajkhp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:45 np0005486759.ooo.test sudo[54669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:45 np0005486759.ooo.test sudo[54677]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwi8l27ia/privsep.sock
Oct 14 08:17:45 np0005486759.ooo.test sudo[54677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:45 np0005486759.ooo.test python3[54672]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:45 np0005486759.ooo.test sudo[54669]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:45 np0005486759.ooo.test sudo[54693]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzmqzkbprizrgbouhzfltylaykzavszt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:45 np0005486759.ooo.test sudo[54693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:45 np0005486759.ooo.test sudo[54628]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:45 np0005486759.ooo.test python3[54695]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:45 np0005486759.ooo.test sudo[54693]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:46 np0005486759.ooo.test sudo[54711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ditclapustssusrjmmcbepheszqmkiks ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:46 np0005486759.ooo.test sudo[54711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:46 np0005486759.ooo.test python3[54713]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:46 np0005486759.ooo.test sudo[54677]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:46 np0005486759.ooo.test sudo[54711]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:46 np0005486759.ooo.test sudo[54732]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbjcreerletaqtbzinzrfpzbtzyiuuhi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:46 np0005486759.ooo.test sudo[54732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:46 np0005486759.ooo.test python3[54734]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:46 np0005486759.ooo.test sudo[54740]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgasvv7xf/privsep.sock
Oct 14 08:17:46 np0005486759.ooo.test sudo[54740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:46 np0005486759.ooo.test sudo[54732]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:46 np0005486759.ooo.test sudo[54755]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oykncqjscrgbglkzbumceieqwwpmgpsr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:46 np0005486759.ooo.test sudo[54755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:46 np0005486759.ooo.test python3[54757]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:46 np0005486759.ooo.test sudo[54755]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:46 np0005486759.ooo.test sudo[54772]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwwefijyljoqnvcpnzvdfslosfjxfqpd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:46 np0005486759.ooo.test sudo[54772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:46 np0005486759.ooo.test python3[54774]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:17:46 np0005486759.ooo.test sudo[54772]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:47 np0005486759.ooo.test sudo[54740]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:47 np0005486759.ooo.test sudo[54837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwklvtstuxdllpfgeqlysslegniagbtg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:47 np0005486759.ooo.test sudo[54837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:47 np0005486759.ooo.test sudo[54844]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprs16xp1_/privsep.sock
Oct 14 08:17:47 np0005486759.ooo.test sudo[54844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:47 np0005486759.ooo.test python3[54840]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429866.972487-111305-15339296729330/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:47 np0005486759.ooo.test sudo[54837]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:47 np0005486759.ooo.test sudo[54873]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ythfijdjehszqhnckykebayoidautbqe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:47 np0005486759.ooo.test sudo[54873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:47 np0005486759.ooo.test sudo[54844]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:47 np0005486759.ooo.test python3[54875]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429866.972487-111305-15339296729330/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:47 np0005486759.ooo.test sudo[54873]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:48 np0005486759.ooo.test sudo[54897]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfzk74wwc/privsep.sock
Oct 14 08:17:48 np0005486759.ooo.test sudo[54897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:48 np0005486759.ooo.test sudo[54912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwzevqqrdmlmmlharopbrngzhwfoljpc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:48 np0005486759.ooo.test sudo[54912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:48 np0005486759.ooo.test python3[54914]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429866.972487-111305-15339296729330/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:48 np0005486759.ooo.test sudo[54912]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:48 np0005486759.ooo.test sudo[54943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oeaktqglnwkftgttlkaqgsgncpeyhkld ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:48 np0005486759.ooo.test sudo[54943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:48 np0005486759.ooo.test sudo[54897]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:48 np0005486759.ooo.test python3[54945]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429866.972487-111305-15339296729330/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:48 np0005486759.ooo.test sudo[54943]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:48 np0005486759.ooo.test sudo[54953]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfuolxkg3/privsep.sock
Oct 14 08:17:48 np0005486759.ooo.test sudo[54953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:49 np0005486759.ooo.test sudo[54982]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkbkiafqbyigiecmmswaejhghblfdrcq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:49 np0005486759.ooo.test sudo[54982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:49 np0005486759.ooo.test python3[54984]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429866.972487-111305-15339296729330/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:49 np0005486759.ooo.test sudo[54982]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:49 np0005486759.ooo.test sudo[54953]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:49 np0005486759.ooo.test sudo[55014]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkaaotgrhtavoxcfjatbtkbknylvlsoi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:49 np0005486759.ooo.test sudo[55014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:49 np0005486759.ooo.test sudo[55022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps1s1mpks/privsep.sock
Oct 14 08:17:49 np0005486759.ooo.test sudo[55022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:49 np0005486759.ooo.test python3[55017]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429866.972487-111305-15339296729330/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:49 np0005486759.ooo.test sudo[55014]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:49 np0005486759.ooo.test sudo[55038]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evqnpoevfssmcbgdojlmecjioihocpfd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:49 np0005486759.ooo.test sudo[55038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:50 np0005486759.ooo.test python3[55040]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 08:17:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:50 np0005486759.ooo.test systemd-rc-local-generator[55062]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:50 np0005486759.ooo.test systemd-sysv-generator[55065]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:50 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:50 np0005486759.ooo.test sudo[55022]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:50 np0005486759.ooo.test sudo[55038]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:50 np0005486759.ooo.test sudo[55085]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6_hvnuvi/privsep.sock
Oct 14 08:17:50 np0005486759.ooo.test sudo[55085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:50 np0005486759.ooo.test sudo[55102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wknptfengzpajlrfhwkrztnmkipskzen ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:50 np0005486759.ooo.test sudo[55102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:51 np0005486759.ooo.test python3[55106]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:51 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:51 np0005486759.ooo.test systemd-sysv-generator[55134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:51 np0005486759.ooo.test systemd-rc-local-generator[55131]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:51 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:51 np0005486759.ooo.test sudo[55085]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:51 np0005486759.ooo.test systemd[1]: Starting ceilometer_agent_compute container...
Oct 14 08:17:51 np0005486759.ooo.test tripleo-start-podman-container[55151]: Creating additional drop-in dependency for "ceilometer_agent_compute" (48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c)
Oct 14 08:17:51 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:51 np0005486759.ooo.test systemd-sysv-generator[55211]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:51 np0005486759.ooo.test systemd-rc-local-generator[55208]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:51 np0005486759.ooo.test sudo[55219]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppuaalp5i/privsep.sock
Oct 14 08:17:51 np0005486759.ooo.test sudo[55219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:51 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:51 np0005486759.ooo.test systemd[1]: Started ceilometer_agent_compute container.
Oct 14 08:17:51 np0005486759.ooo.test sudo[55102]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:52 np0005486759.ooo.test sudo[55239]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wddhtaspttnrksghochkvgdnlrzaqkmw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:52 np0005486759.ooo.test sudo[55239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:52 np0005486759.ooo.test sudo[55219]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:52 np0005486759.ooo.test python3[55242]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:52 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:52 np0005486759.ooo.test sudo[55253]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppmrp2079/privsep.sock
Oct 14 08:17:52 np0005486759.ooo.test sudo[55253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:52 np0005486759.ooo.test systemd-sysv-generator[55280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:52 np0005486759.ooo.test systemd-rc-local-generator[55275]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:52 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:17:52 np0005486759.ooo.test systemd[1]: Starting ceilometer_agent_ipmi container...
Oct 14 08:17:52 np0005486759.ooo.test systemd[1]: Started ceilometer_agent_ipmi container.
Oct 14 08:17:52 np0005486759.ooo.test sudo[55239]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:52 np0005486759.ooo.test podman[55291]: 2025-10-14 08:17:52.802073133 +0000 UTC m=+0.082157333 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_id=tripleo_step1, release=1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:17:52 np0005486759.ooo.test sudo[55253]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:53 np0005486759.ooo.test podman[55291]: 2025-10-14 08:17:53.019014188 +0000 UTC m=+0.299098358 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Oct 14 08:17:53 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:17:53 np0005486759.ooo.test sudo[55345]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syiinteqxnjeiksrxrmygflpgugcnlki ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:53 np0005486759.ooo.test sudo[55345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:53 np0005486759.ooo.test sudo[55353]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgbzd3bfx/privsep.sock
Oct 14 08:17:53 np0005486759.ooo.test sudo[55353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:53 np0005486759.ooo.test python3[55347]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:53 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:53 np0005486759.ooo.test systemd-rc-local-generator[55384]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:53 np0005486759.ooo.test systemd-sysv-generator[55387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:53 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:53 np0005486759.ooo.test systemd[1]: Starting logrotate_crond container...
Oct 14 08:17:53 np0005486759.ooo.test sudo[55353]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:53 np0005486759.ooo.test systemd[1]: Started logrotate_crond container.
Oct 14 08:17:53 np0005486759.ooo.test sudo[55345]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:54 np0005486759.ooo.test sudo[55428]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnb_jvrto/privsep.sock
Oct 14 08:17:54 np0005486759.ooo.test sudo[55428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:54 np0005486759.ooo.test sudo[55429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhvikwfmjoafuoaatapjyvvklrhfodlz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 0...
Oct 14 08:17:54 np0005486759.ooo.test sudo[55429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Activating special unit Exit the Session...
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Stopped target Main User Target.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Stopped target Basic System.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Stopped target Paths.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Stopped target Sockets.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Stopped target Timers.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Closed D-Bus User Message Bus Socket.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Stopped Create User's Volatile Files and Directories.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Removed slice User Application Slice.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Reached target Shutdown.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Finished Exit the Session.
Oct 14 08:17:54 np0005486759.ooo.test systemd[54465]: Reached target Exit the Session.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: user@0.service: Deactivated successfully.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 0.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 0.
Oct 14 08:17:54 np0005486759.ooo.test python3[55432]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:54 np0005486759.ooo.test systemd-rc-local-generator[55458]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:54 np0005486759.ooo.test systemd-sysv-generator[55462]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Starting nova_migration_target container...
Oct 14 08:17:54 np0005486759.ooo.test sudo[55428]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:54 np0005486759.ooo.test systemd[1]: Started nova_migration_target container.
Oct 14 08:17:54 np0005486759.ooo.test sudo[55429]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:55 np0005486759.ooo.test sudo[55493]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzahl545z/privsep.sock
Oct 14 08:17:55 np0005486759.ooo.test sudo[55493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:55 np0005486759.ooo.test sudo[55508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvbdcfclwylyxmimluxgrgmliphbqdos ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:55 np0005486759.ooo.test sudo[55508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:55 np0005486759.ooo.test python3[55511]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:55 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:55 np0005486759.ooo.test systemd-rc-local-generator[55539]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:55 np0005486759.ooo.test systemd-sysv-generator[55544]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:55 np0005486759.ooo.test sudo[55493]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:55 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:17:55 np0005486759.ooo.test systemd[1]: Starting ovn_controller container...
Oct 14 08:17:55 np0005486759.ooo.test sudo[55580]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5tmkrze6/privsep.sock
Oct 14 08:17:55 np0005486759.ooo.test sudo[55580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:55 np0005486759.ooo.test podman[55555]: 2025-10-14 08:17:55.903151966 +0000 UTC m=+0.093261955 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, release=2, container_name=collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9)
Oct 14 08:17:55 np0005486759.ooo.test podman[55555]: 2025-10-14 08:17:55.912465862 +0000 UTC m=+0.102575851 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, release=2, io.openshift.expose-services=, batch=17.1_20250721.1)
Oct 14 08:17:55 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:17:56 np0005486759.ooo.test tripleo-start-podman-container[55557]: Creating additional drop-in dependency for "ovn_controller" (c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6)
Oct 14 08:17:56 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:56 np0005486759.ooo.test systemd-rc-local-generator[55638]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:56 np0005486759.ooo.test systemd-sysv-generator[55641]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:56 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:56 np0005486759.ooo.test systemd[1]: Started ovn_controller container.
Oct 14 08:17:56 np0005486759.ooo.test sudo[55508]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:56 np0005486759.ooo.test sudo[55580]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:56 np0005486759.ooo.test sudo[55668]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkeadtcjpopiintdjozddahsncmwcbhe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:17:56 np0005486759.ooo.test sudo[55668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:56 np0005486759.ooo.test sudo[55676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplg88lp8v/privsep.sock
Oct 14 08:17:56 np0005486759.ooo.test sudo[55676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:56 np0005486759.ooo.test python3[55671]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:17:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:17:57 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:17:57 np0005486759.ooo.test podman[55680]: 2025-10-14 08:17:57.14714374 +0000 UTC m=+0.111289991 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, release=1, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12)
Oct 14 08:17:57 np0005486759.ooo.test systemd-rc-local-generator[55713]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:17:57 np0005486759.ooo.test systemd-sysv-generator[55716]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:17:57 np0005486759.ooo.test podman[55680]: 2025-10-14 08:17:57.186346518 +0000 UTC m=+0.150492779 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, release=1, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-iscsid-container, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git)
Oct 14 08:17:57 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:17:57 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:17:57 np0005486759.ooo.test systemd[1]: Starting ovn_metadata_agent container...
Oct 14 08:17:57 np0005486759.ooo.test sudo[55676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:57 np0005486759.ooo.test systemd[1]: Started ovn_metadata_agent container.
Oct 14 08:17:57 np0005486759.ooo.test sudo[55668]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:57 np0005486759.ooo.test sudo[55785]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps7gctvy0/privsep.sock
Oct 14 08:17:57 np0005486759.ooo.test sudo[55785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:57 np0005486759.ooo.test sudo[55783]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqqmbrgaywstxltwxohoudbzqvqhvrzj ; /usr/bin/python3
Oct 14 08:17:57 np0005486759.ooo.test sudo[55783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:57 np0005486759.ooo.test python3[55788]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:57 np0005486759.ooo.test sudo[55783]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:58 np0005486759.ooo.test sudo[55785]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:58 np0005486759.ooo.test sudo[55838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghgojfdqiqrlhhmhyelhfyxkmrqkbffr ; /usr/bin/python3
Oct 14 08:17:58 np0005486759.ooo.test sudo[55838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:58 np0005486759.ooo.test sudo[55838]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:58 np0005486759.ooo.test sudo[55859]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpts1idvgl/privsep.sock
Oct 14 08:17:58 np0005486759.ooo.test sudo[55859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:58 np0005486759.ooo.test sudo[55888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnokggnxliuvlxcoywqvpoljjupraoon ; /usr/bin/python3
Oct 14 08:17:58 np0005486759.ooo.test sudo[55888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:58 np0005486759.ooo.test sudo[55888]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:59 np0005486759.ooo.test sudo[55919]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acxhfvigwwpdzbjdldicaetdztfsnzvw ; /usr/bin/python3
Oct 14 08:17:59 np0005486759.ooo.test sudo[55919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:59 np0005486759.ooo.test sudo[55859]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:59 np0005486759.ooo.test python3[55922]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005486759 step=4 update_config_hash_only=False
Oct 14 08:17:59 np0005486759.ooo.test sudo[55919]: pam_unix(sudo:session): session closed for user root
Oct 14 08:17:59 np0005486759.ooo.test sudo[55930]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz1ul5hqc/privsep.sock
Oct 14 08:17:59 np0005486759.ooo.test sudo[55930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:17:59 np0005486759.ooo.test sudo[55945]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqucdthkbxjcjjdmsgilsgjsvbijbssk ; /usr/bin/python3
Oct 14 08:17:59 np0005486759.ooo.test sudo[55945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:17:59 np0005486759.ooo.test python3[55948]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:17:59 np0005486759.ooo.test sudo[55945]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:00 np0005486759.ooo.test sudo[55962]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kooahozxbwygxnpmixoeetjocubftmdj ; /usr/bin/python3
Oct 14 08:18:00 np0005486759.ooo.test sudo[55962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:00 np0005486759.ooo.test sudo[55930]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:00 np0005486759.ooo.test python3[55964]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Oct 14 08:18:00 np0005486759.ooo.test sudo[55962]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:00 np0005486759.ooo.test sudo[55973]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0gqzd8_m/privsep.sock
Oct 14 08:18:00 np0005486759.ooo.test sudo[55973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:00 np0005486759.ooo.test sudo[55973]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:01 np0005486759.ooo.test sudo[56024]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vghbskgwjohdbgyiytupraxxypxbxfpr ; /usr/bin/python3
Oct 14 08:18:01 np0005486759.ooo.test sudo[56024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:01 np0005486759.ooo.test python3[56026]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:01 np0005486759.ooo.test sudo[56033]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp68vochmv/privsep.sock
Oct 14 08:18:01 np0005486759.ooo.test sudo[56024]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:01 np0005486759.ooo.test sudo[56033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:01 np0005486759.ooo.test sudo[56077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yavwuqamoysukbdowyziinrnjhhabyjl ; /usr/bin/python3
Oct 14 08:18:01 np0005486759.ooo.test sudo[56077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:01 np0005486759.ooo.test python3[56079]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429880.9450042-111858-233876067686802/source _original_basename=tmp_0_om7aq follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:01 np0005486759.ooo.test sudo[56077]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:01 np0005486759.ooo.test sudo[56033]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:01 np0005486759.ooo.test sudo[56117]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrrgjonorqbtvnhuglczjpbstgiqglpy ; /usr/bin/python3
Oct 14 08:18:01 np0005486759.ooo.test sudo[56117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:02 np0005486759.ooo.test sudo[56125]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3es1o38h/privsep.sock
Oct 14 08:18:02 np0005486759.ooo.test sudo[56125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:02 np0005486759.ooo.test python3[56119]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:18:02 np0005486759.ooo.test sudo[56117]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:02 np0005486759.ooo.test sudo[56175]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbyvksolqjwwkerjqosdennpulstfnnm ; /usr/bin/python3
Oct 14 08:18:02 np0005486759.ooo.test sudo[56175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:02 np0005486759.ooo.test sudo[56125]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:02 np0005486759.ooo.test sudo[56175]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:02 np0005486759.ooo.test sudo[56196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiyovyqhchlswflwyxmspnybkpkfxrkl ; /usr/bin/python3
Oct 14 08:18:02 np0005486759.ooo.test sudo[56196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:02 np0005486759.ooo.test sudo[56196]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:02 np0005486759.ooo.test sudo[56204]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpedpiug3h/privsep.sock
Oct 14 08:18:02 np0005486759.ooo.test sudo[56204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:03 np0005486759.ooo.test sudo[56308]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaqvujbduaseikxxqzldzcdhrawrmuus ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429883.0257738-111910-145769564457928/async_wrapper.py 501205278267 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429883.0257738-111910-145769564457928/AnsiballZ_command.py _
Oct 14 08:18:03 np0005486759.ooo.test sudo[56308]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:18:03 np0005486759.ooo.test ansible-async_wrapper.py[56310]: Invoked with 501205278267 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429883.0257738-111910-145769564457928/AnsiballZ_command.py _
Oct 14 08:18:03 np0005486759.ooo.test ansible-async_wrapper.py[56314]: Starting module and watcher
Oct 14 08:18:03 np0005486759.ooo.test ansible-async_wrapper.py[56314]: Start watching 56315 (3600)
Oct 14 08:18:03 np0005486759.ooo.test ansible-async_wrapper.py[56315]: Start module (56315)
Oct 14 08:18:03 np0005486759.ooo.test ansible-async_wrapper.py[56310]: Return async_wrapper task started.
Oct 14 08:18:03 np0005486759.ooo.test sudo[56308]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:03 np0005486759.ooo.test sudo[56204]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:03 np0005486759.ooo.test sudo[56332]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlvoqvosrocvxlxqtdlvyfndervbwzhm ; /usr/bin/python3
Oct 14 08:18:03 np0005486759.ooo.test sudo[56332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:03 np0005486759.ooo.test python3[56334]: ansible-ansible.legacy.async_status Invoked with jid=501205278267.56310 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:18:03 np0005486759.ooo.test sudo[56332]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:03 np0005486759.ooo.test sudo[56343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptw2qhkl_/privsep.sock
Oct 14 08:18:03 np0005486759.ooo.test sudo[56343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:04 np0005486759.ooo.test sudo[56343]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:04 np0005486759.ooo.test sudo[56369]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbq_e6n8a/privsep.sock
Oct 14 08:18:04 np0005486759.ooo.test sudo[56369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:05 np0005486759.ooo.test sudo[56369]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:05 np0005486759.ooo.test sudo[56385]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9j5jmi_4/privsep.sock
Oct 14 08:18:05 np0005486759.ooo.test sudo[56385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:06 np0005486759.ooo.test sudo[56385]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:06 np0005486759.ooo.test sudo[56489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprrb7ri2s/privsep.sock
Oct 14 08:18:06 np0005486759.ooo.test sudo[56489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:06 np0005486759.ooo.test sudo[56489]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:07 np0005486759.ooo.test sudo[56503]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfwbnniaj/privsep.sock
Oct 14 08:18:07 np0005486759.ooo.test sudo[56503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (file: /etc/puppet/hiera.yaml)
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: Undefined variable '::deploy_config_name';
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (file & line not available)
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (file & line not available)
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Notice: Compiled catalog for np0005486759.ooo.test in environment production in 0.21 seconds
Oct 14 08:18:07 np0005486759.ooo.test sudo[56503]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Notice: Applied catalog in 0.29 seconds
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Application:
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    Initial environment: production
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    Converged environment: production
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:          Run mode: user
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Changes:
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Events:
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Resources:
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:             Total: 19
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Time:
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:        Filebucket: 0.00
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:           Package: 0.00
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:          Schedule: 0.00
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:              Exec: 0.01
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:            Augeas: 0.01
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:              File: 0.02
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:           Service: 0.07
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    Config retrieval: 0.27
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    Transaction evaluation: 0.28
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:    Catalog application: 0.29
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:          Last run: 1760429887
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:             Total: 0.29
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]: Version:
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:            Config: 1760429887
Oct 14 08:18:07 np0005486759.ooo.test puppet-user[56337]:            Puppet: 7.10.0
Oct 14 08:18:07 np0005486759.ooo.test ansible-async_wrapper.py[56315]: Module complete (56315)
Oct 14 08:18:07 np0005486759.ooo.test sudo[56526]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1u3lq0p8/privsep.sock
Oct 14 08:18:07 np0005486759.ooo.test sudo[56526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:08 np0005486759.ooo.test ansible-async_wrapper.py[56314]: Done in kid B.
Oct 14 08:18:08 np0005486759.ooo.test sudo[56526]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:08 np0005486759.ooo.test sudo[56537]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa6hrcy2o/privsep.sock
Oct 14 08:18:08 np0005486759.ooo.test sudo[56537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:09 np0005486759.ooo.test sudo[56537]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:18:09 np0005486759.ooo.test podman[56543]: 2025-10-14 08:18:09.480050869 +0000 UTC m=+0.109139955 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, build-date=2025-07-21T14:45:33, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Oct 14 08:18:09 np0005486759.ooo.test podman[56543]: 2025-10-14 08:18:09.51126354 +0000 UTC m=+0.140352606 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., release=1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:18:09 np0005486759.ooo.test sudo[56589]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkwrysv_v/privsep.sock
Oct 14 08:18:09 np0005486759.ooo.test sudo[56589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:09 np0005486759.ooo.test podman[56543]: unhealthy
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: tmp-crun.neztwC.mount: Deactivated successfully.
Oct 14 08:18:09 np0005486759.ooo.test podman[56564]: 2025-10-14 08:18:09.576939155 +0000 UTC m=+0.089211420 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step4, 
name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, architecture=x86_64, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 08:18:09 np0005486759.ooo.test podman[56564]: 2025-10-14 08:18:09.589746339 +0000 UTC m=+0.102018584 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, 
batch=17.1_20250721.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:18:09 np0005486759.ooo.test podman[56564]: unhealthy
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:18:09 np0005486759.ooo.test podman[56561]: 2025-10-14 08:18:09.672823849 +0000 UTC m=+0.187976093 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, release=1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, version=17.1.9, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, io.openshift.expose-services=)
Oct 14 08:18:09 np0005486759.ooo.test podman[56561]: 2025-10-14 08:18:09.677807463 +0000 UTC m=+0.192959727 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1, 
name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Oct 14 08:18:09 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:18:10 np0005486759.ooo.test sudo[56589]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:18:10 np0005486759.ooo.test podman[56608]: 2025-10-14 08:18:10.240941437 +0000 UTC m=+0.067242404 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:18:10 np0005486759.ooo.test sudo[56635]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjb8faece/privsep.sock
Oct 14 08:18:10 np0005486759.ooo.test sudo[56635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:10 np0005486759.ooo.test podman[56608]: 2025-10-14 08:18:10.61067907 +0000 UTC m=+0.436980087 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, release=1, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:18:10 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:18:11 np0005486759.ooo.test sudo[56635]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:11 np0005486759.ooo.test sudo[56648]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqxwnkvbh/privsep.sock
Oct 14 08:18:11 np0005486759.ooo.test sudo[56648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:11 np0005486759.ooo.test sudo[56648]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:12 np0005486759.ooo.test sudo[56659]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwmdbdhy_/privsep.sock
Oct 14 08:18:12 np0005486759.ooo.test sudo[56659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:12 np0005486759.ooo.test sudo[56659]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:13 np0005486759.ooo.test sudo[56676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx0wehnph/privsep.sock
Oct 14 08:18:13 np0005486759.ooo.test sudo[56676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:13 np0005486759.ooo.test sudo[56676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:13 np0005486759.ooo.test sudo[56696]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzlfnvnqraiwadkkxrequsipjhucjdky ; /usr/bin/python3
Oct 14 08:18:13 np0005486759.ooo.test sudo[56696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:18:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:18:13 np0005486759.ooo.test podman[56698]: 2025-10-14 08:18:13.914704846 +0000 UTC m=+0.074463955 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12)
Oct 14 08:18:13 np0005486759.ooo.test sudo[56731]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt7oufxyz/privsep.sock
Oct 14 08:18:13 np0005486759.ooo.test sudo[56731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:13 np0005486759.ooo.test podman[56698]: 2025-10-14 08:18:13.957150604 +0000 UTC m=+0.116909733 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1)
Oct 14 08:18:13 np0005486759.ooo.test podman[56700]: 2025-10-14 08:18:13.975610333 +0000 UTC m=+0.132780012 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=)
Oct 14 08:18:13 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:18:13 np0005486759.ooo.test python3[56699]: ansible-ansible.legacy.async_status Invoked with jid=501205278267.56310 mode=status _async_dir=/tmp/.ansible_async
Oct 14 08:18:14 np0005486759.ooo.test sudo[56696]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:14 np0005486759.ooo.test podman[56700]: 2025-10-14 08:18:14.024495979 +0000 UTC m=+0.181665648 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, batch=17.1_20250721.1, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container)
Oct 14 08:18:14 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:18:14 np0005486759.ooo.test sudo[56764]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjvapxvrjnxlmqdhfwrubhttsvhogwpu ; /usr/bin/python3
Oct 14 08:18:14 np0005486759.ooo.test sudo[56764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:14 np0005486759.ooo.test python3[56766]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:18:14 np0005486759.ooo.test sudo[56764]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:14 np0005486759.ooo.test sudo[56731]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:14 np0005486759.ooo.test sudo[56783]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjfhcfifbkzkogbmlzpozbkxlnneskpv ; /usr/bin/python3
Oct 14 08:18:14 np0005486759.ooo.test sudo[56783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:14 np0005486759.ooo.test python3[56785]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:18:14 np0005486759.ooo.test sudo[56792]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8srfm7vz/privsep.sock
Oct 14 08:18:14 np0005486759.ooo.test sudo[56792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:14 np0005486759.ooo.test sudo[56783]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:15 np0005486759.ooo.test sudo[56841]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhecediamlyguhbbsiryvlwqvhysmhnk ; /usr/bin/python3
Oct 14 08:18:15 np0005486759.ooo.test sudo[56841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:15 np0005486759.ooo.test python3[56843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:15 np0005486759.ooo.test sudo[56841]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:15 np0005486759.ooo.test sudo[56860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtmvdhorbbbmyfennejevdrgniuexvsx ; /usr/bin/python3
Oct 14 08:18:15 np0005486759.ooo.test sudo[56860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:15 np0005486759.ooo.test sudo[56792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:15 np0005486759.ooo.test python3[56862]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpd5rd_s12 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 08:18:15 np0005486759.ooo.test sudo[56860]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:15 np0005486759.ooo.test sudo[56890]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm_r5mnq7/privsep.sock
Oct 14 08:18:15 np0005486759.ooo.test sudo[56890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:15 np0005486759.ooo.test sudo[56898]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrvijsjvlojebzxsilfqyzpzmdbrrims ; /usr/bin/python3
Oct 14 08:18:15 np0005486759.ooo.test sudo[56898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:15 np0005486759.ooo.test python3[56901]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:15 np0005486759.ooo.test sudo[56898]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:16 np0005486759.ooo.test sudo[56916]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytfimspcgjadcrjiezodjmkgocenyzlw ; /usr/bin/python3
Oct 14 08:18:16 np0005486759.ooo.test sudo[56916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:16 np0005486759.ooo.test sudo[56890]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:16 np0005486759.ooo.test sudo[56916]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:16 np0005486759.ooo.test sudo[57000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbs6y307d/privsep.sock
Oct 14 08:18:16 np0005486759.ooo.test sudo[57000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:16 np0005486759.ooo.test sudo[57016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnhivoxullnlpopepgwfyrmlgwmxxbfc ; /usr/bin/python3
Oct 14 08:18:16 np0005486759.ooo.test sudo[57016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:17 np0005486759.ooo.test python3[57018]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Oct 14 08:18:17 np0005486759.ooo.test sudo[57016]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:17 np0005486759.ooo.test sudo[57000]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:17 np0005486759.ooo.test sudo[57044]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf7u9fxxo/privsep.sock
Oct 14 08:18:17 np0005486759.ooo.test sudo[57044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:17 np0005486759.ooo.test sudo[57042]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfjviotqdoichdwzhynstsasvmwyniuy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:17 np0005486759.ooo.test sudo[57042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:17 np0005486759.ooo.test python3[57047]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:17 np0005486759.ooo.test sudo[57042]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:17 np0005486759.ooo.test sudo[57062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmmcdfffdwbhpdfqrpigtrvdtccewjpk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:17 np0005486759.ooo.test sudo[57062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:17 np0005486759.ooo.test sudo[57062]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:18 np0005486759.ooo.test sudo[57044]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:18 np0005486759.ooo.test sudo[57085]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idjjbtosjahriqahiabidszbjtxcbrgs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:18 np0005486759.ooo.test sudo[57085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:18 np0005486759.ooo.test python3[57089]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:18:18 np0005486759.ooo.test sudo[57085]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:18 np0005486759.ooo.test sudo[57097]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1i2g3uf6/privsep.sock
Oct 14 08:18:18 np0005486759.ooo.test sudo[57097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:18 np0005486759.ooo.test sudo[57145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwbzunppklyjepdtwporsqrgtqvkalyw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:18 np0005486759.ooo.test sudo[57145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:18 np0005486759.ooo.test python3[57147]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:18 np0005486759.ooo.test sudo[57145]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:18 np0005486759.ooo.test sudo[57163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oydhdzqdrcvcyyvuleansqasbypjkqcx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:18 np0005486759.ooo.test sudo[57163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:18 np0005486759.ooo.test sudo[57097]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:19 np0005486759.ooo.test python3[57165]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:19 np0005486759.ooo.test sudo[57163]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:19 np0005486759.ooo.test sudo[57220]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiw1qg8rq/privsep.sock
Oct 14 08:18:19 np0005486759.ooo.test sudo[57220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:19 np0005486759.ooo.test sudo[57235]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcazdemsslxdeayhviyktguukrjjiyrl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:19 np0005486759.ooo.test sudo[57235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:19 np0005486759.ooo.test python3[57237]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:19 np0005486759.ooo.test sudo[57235]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:19 np0005486759.ooo.test sudo[57254]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbaipbnqjdelwabidqelruvktvgwuojs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:19 np0005486759.ooo.test sudo[57254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:19 np0005486759.ooo.test python3[57256]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:19 np0005486759.ooo.test sudo[57254]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:19 np0005486759.ooo.test sudo[57220]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:20 np0005486759.ooo.test sudo[57321]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzgqtvqnuqizszzqyppgpywguhywgwdo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:20 np0005486759.ooo.test sudo[57321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:20 np0005486759.ooo.test sudo[57327]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpodc58o7_/privsep.sock
Oct 14 08:18:20 np0005486759.ooo.test sudo[57327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:20 np0005486759.ooo.test python3[57325]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:20 np0005486759.ooo.test sudo[57321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:20 np0005486759.ooo.test sudo[57345]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqqjpnrynqqmdrqqfcgijfofcogehlpn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:20 np0005486759.ooo.test sudo[57345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:20 np0005486759.ooo.test python3[57347]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:20 np0005486759.ooo.test sudo[57345]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:20 np0005486759.ooo.test sudo[57327]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:20 np0005486759.ooo.test sudo[57410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azsabgtqtkkbyajomgffvljuxybzwmbw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:20 np0005486759.ooo.test sudo[57410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:20 np0005486759.ooo.test python3[57412]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:20 np0005486759.ooo.test sudo[57410]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:20 np0005486759.ooo.test sudo[57420]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr6ewfqau/privsep.sock
Oct 14 08:18:20 np0005486759.ooo.test sudo[57420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:21 np0005486759.ooo.test sudo[57435]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzqhacibdlglrvjxwamdwfwlifsdpncr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:21 np0005486759.ooo.test sudo[57435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:21 np0005486759.ooo.test python3[57437]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:21 np0005486759.ooo.test sudo[57435]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:21 np0005486759.ooo.test sudo[57466]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qikaukmejkawwoapujdfwxzpkwwysetc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:21 np0005486759.ooo.test sudo[57466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:21 np0005486759.ooo.test sudo[57420]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:21 np0005486759.ooo.test python3[57468]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:18:21 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:18:21 np0005486759.ooo.test systemd-rc-local-generator[57497]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:18:21 np0005486759.ooo.test systemd-sysv-generator[57502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:18:21 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:18:21 np0005486759.ooo.test sudo[57512]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnqe6v0hm/privsep.sock
Oct 14 08:18:21 np0005486759.ooo.test sudo[57512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:21 np0005486759.ooo.test sudo[57466]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:22 np0005486759.ooo.test sudo[57562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqxriwmwkdirvkzavryygiujlvbscctd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:22 np0005486759.ooo.test sudo[57562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:22 np0005486759.ooo.test sudo[57512]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:22 np0005486759.ooo.test python3[57564]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:22 np0005486759.ooo.test sudo[57562]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:22 np0005486759.ooo.test sudo[57583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvsqjbqwsigtxaockmotinrhbemealsi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:22 np0005486759.ooo.test sudo[57583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:22 np0005486759.ooo.test python3[57585]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:22 np0005486759.ooo.test sudo[57583]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:22 np0005486759.ooo.test sudo[57591]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvl38rh94/privsep.sock
Oct 14 08:18:22 np0005486759.ooo.test sudo[57591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:22 np0005486759.ooo.test sudo[57653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-focvdvdiroyjxicbhreptdvaejujpwym ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:22 np0005486759.ooo.test sudo[57653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:23 np0005486759.ooo.test python3[57655]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 14 08:18:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 08:18:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 08:18:23 np0005486759.ooo.test sudo[57653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:23 np0005486759.ooo.test sudo[57673]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-woxmrhgsmdgcofiqgugadfndwgfomoof ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:23 np0005486759.ooo.test sudo[57673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:18:23 np0005486759.ooo.test podman[57676]: 2025-10-14 08:18:23.367533964 +0000 UTC m=+0.087294912 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, tcib_managed=true, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 08:18:23 np0005486759.ooo.test sudo[57591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:23 np0005486759.ooo.test python3[57675]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:23 np0005486759.ooo.test sudo[57673]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:23 np0005486759.ooo.test podman[57676]: 2025-10-14 08:18:23.553168794 +0000 UTC m=+0.272929772 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, version=17.1.9, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:18:23 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:18:23 np0005486759.ooo.test sudo[57741]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxirzlkngxlpfweohiuvufkvcquontzr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:23 np0005486759.ooo.test sudo[57741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:23 np0005486759.ooo.test sudo[57744]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjrzlw3jm/privsep.sock
Oct 14 08:18:23 np0005486759.ooo.test sudo[57744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:23 np0005486759.ooo.test python3[57745]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:18:23 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:18:24 np0005486759.ooo.test systemd-rc-local-generator[57769]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:18:24 np0005486759.ooo.test systemd-sysv-generator[57775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:18:24 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:18:24 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 08:18:24 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 08:18:24 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 08:18:24 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 08:18:24 np0005486759.ooo.test sudo[57741]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:24 np0005486759.ooo.test sudo[57744]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:24 np0005486759.ooo.test sudo[57811]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwfbnv3t6/privsep.sock
Oct 14 08:18:24 np0005486759.ooo.test sudo[57811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:24 np0005486759.ooo.test sudo[57809]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oezcstsntwododybbwehtivcygjwwcix ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:24 np0005486759.ooo.test sudo[57809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:24 np0005486759.ooo.test python3[57814]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Oct 14 08:18:24 np0005486759.ooo.test sudo[57809]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:25 np0005486759.ooo.test sudo[57829]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdrrdjcnzuubyxjwgnrtdrlqqylsyryo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:25 np0005486759.ooo.test sudo[57829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:25 np0005486759.ooo.test sudo[57811]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:25 np0005486759.ooo.test sudo[57867]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ybyszoq/privsep.sock
Oct 14 08:18:25 np0005486759.ooo.test sudo[57867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:25 np0005486759.ooo.test sudo[57829]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:26 np0005486759.ooo.test sudo[57867]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: tmp-crun.eczgtr.mount: Deactivated successfully.
Oct 14 08:18:26 np0005486759.ooo.test podman[57872]: 2025-10-14 08:18:26.141013921 +0000 UTC m=+0.110369083 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, batch=17.1_20250721.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vcs-type=git, build-date=2025-07-21T13:04:03, distribution-scope=public, managed_by=tripleo_ansible, release=2, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12)
Oct 14 08:18:26 np0005486759.ooo.test podman[57872]: 2025-10-14 08:18:26.153763993 +0000 UTC m=+0.123119195 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, release=2, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, tcib_managed=true, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:18:26 np0005486759.ooo.test sudo[57910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdrhthqxkgdalqbbihuurwicyyfpmnyk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:26 np0005486759.ooo.test sudo[57910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:26 np0005486759.ooo.test sudo[57913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj79azf28/privsep.sock
Oct 14 08:18:26 np0005486759.ooo.test sudo[57913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:26 np0005486759.ooo.test python3[57914]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Oct 14 08:18:26 np0005486759.ooo.test podman[57954]: 2025-10-14 08:18:26.773027966 +0000 UTC m=+0.096083071 container create aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, container_name=nova_compute)
Oct 14 08:18:26 np0005486759.ooo.test podman[57954]: 2025-10-14 08:18:26.709303313 +0000 UTC m=+0.032358428 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Started libpod-conmon-aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.scope.
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:18:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda03ea32496994d88a46cb581656ffbbab2c391369247a306314cbff2d505cd/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:18:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda03ea32496994d88a46cb581656ffbbab2c391369247a306314cbff2d505cd/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 08:18:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda03ea32496994d88a46cb581656ffbbab2c391369247a306314cbff2d505cd/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 08:18:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda03ea32496994d88a46cb581656ffbbab2c391369247a306314cbff2d505cd/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 08:18:26 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda03ea32496994d88a46cb581656ffbbab2c391369247a306314cbff2d505cd/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:18:26 np0005486759.ooo.test podman[57954]: 2025-10-14 08:18:26.8565369 +0000 UTC m=+0.179592005 container init aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.9, config_id=tripleo_step5, container_name=nova_compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64)
Oct 14 08:18:26 np0005486759.ooo.test sudo[57976]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:18:26 np0005486759.ooo.test sudo[57913]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:26 np0005486759.ooo.test podman[57954]: 2025-10-14 08:18:26.894575112 +0000 UTC m=+0.217630237 container start aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, architecture=x86_64, config_id=tripleo_step5, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team)
Oct 14 08:18:26 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:18:26 np0005486759.ooo.test python3[57914]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 0.
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 14 08:18:26 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 0...
Oct 14 08:18:26 np0005486759.ooo.test systemd[57998]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:18:26 np0005486759.ooo.test podman[57977]: 2025-10-14 08:18:26.976698233 +0000 UTC m=+0.078329565 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:18:27 np0005486759.ooo.test podman[57977]: 2025-10-14 08:18:27.021987328 +0000 UTC m=+0.123618660 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc.)
Oct 14 08:18:27 np0005486759.ooo.test podman[57977]: unhealthy
Oct 14 08:18:27 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:18:27 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Failed with result 'exit-code'.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Queued start job for default target Main User Target.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Created slice User Application Slice.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Reached target Paths.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Reached target Timers.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Starting D-Bus User Message Bus Socket...
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Starting Create User's Volatile Files and Directories...
Oct 14 08:18:27 np0005486759.ooo.test sudo[57910]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Finished Create User's Volatile Files and Directories.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Listening on D-Bus User Message Bus Socket.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Reached target Sockets.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Reached target Basic System.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Reached target Main User Target.
Oct 14 08:18:27 np0005486759.ooo.test systemd[57998]: Startup finished in 137ms.
Oct 14 08:18:27 np0005486759.ooo.test systemd[1]: Started User Manager for UID 0.
Oct 14 08:18:27 np0005486759.ooo.test systemd[1]: Started Session c10 of User root.
Oct 14 08:18:27 np0005486759.ooo.test sudo[57976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Oct 14 08:18:27 np0005486759.ooo.test sudo[57976]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:27 np0005486759.ooo.test sudo[58049]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp19iosw2m/privsep.sock
Oct 14 08:18:27 np0005486759.ooo.test systemd[1]: session-c10.scope: Deactivated successfully.
Oct 14 08:18:27 np0005486759.ooo.test sudo[58049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:27 np0005486759.ooo.test sudo[58065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfoaidjlknlxmrypgrcozrjyctcnvvli ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:27 np0005486759.ooo.test sudo[58065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:27 np0005486759.ooo.test python3[58067]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:27 np0005486759.ooo.test sudo[58065]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:27 np0005486759.ooo.test sudo[58082]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wunotlgljapyrzopidwanorftxctpipl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:27 np0005486759.ooo.test sudo[58082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:18:27 np0005486759.ooo.test podman[58085]: 2025-10-14 08:18:27.698025021 +0000 UTC m=+0.100851229 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container)
Oct 14 08:18:27 np0005486759.ooo.test podman[58085]: 2025-10-14 08:18:27.710297969 +0000 UTC m=+0.113124177 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, 
managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:18:27 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:18:27 np0005486759.ooo.test python3[58084]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 08:18:27 np0005486759.ooo.test sudo[58082]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:27 np0005486759.ooo.test sudo[58049]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:28 np0005486759.ooo.test sudo[58168]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omrlicgimtdnjctefsxtyufmkdxftvds ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:28 np0005486759.ooo.test sudo[58168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:28 np0005486759.ooo.test sudo[58174]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbk3tprpu/privsep.sock
Oct 14 08:18:28 np0005486759.ooo.test sudo[58174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:28 np0005486759.ooo.test python3[58172]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429907.7914135-112458-160283917961225/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:28 np0005486759.ooo.test sudo[58168]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:28 np0005486759.ooo.test sudo[58190]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppavnjjdfbxkykpaobpddlxshzqtglpo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:28 np0005486759.ooo.test sudo[58190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:28 np0005486759.ooo.test python3[58192]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 08:18:28 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:18:28 np0005486759.ooo.test systemd-rc-local-generator[58221]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:18:28 np0005486759.ooo.test systemd-sysv-generator[58224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:18:28 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:18:28 np0005486759.ooo.test sudo[58174]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:28 np0005486759.ooo.test sudo[58190]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:29 np0005486759.ooo.test sudo[58245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5m878bk7/privsep.sock
Oct 14 08:18:29 np0005486759.ooo.test sudo[58245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:29 np0005486759.ooo.test sudo[58262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udhknnjabnbnovkefuwduplhptfocomp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Oct 14 08:18:29 np0005486759.ooo.test sudo[58262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:29 np0005486759.ooo.test python3[58264]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 08:18:29 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:18:29 np0005486759.ooo.test systemd-rc-local-generator[58285]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:18:29 np0005486759.ooo.test systemd-sysv-generator[58291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:18:29 np0005486759.ooo.test sudo[58245]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:29 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:18:29 np0005486759.ooo.test systemd[1]: Starting nova_compute container...
Oct 14 08:18:30 np0005486759.ooo.test sudo[58334]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpko0f5sez/privsep.sock
Oct 14 08:18:30 np0005486759.ooo.test sudo[58334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:30 np0005486759.ooo.test tripleo-start-podman-container[58331]: Creating additional drop-in dependency for "nova_compute" (aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb)
Oct 14 08:18:30 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 08:18:30 np0005486759.ooo.test systemd-sysv-generator[58406]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 08:18:30 np0005486759.ooo.test systemd-rc-local-generator[58401]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 08:18:30 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 08:18:30 np0005486759.ooo.test systemd[1]: Started nova_compute container.
Oct 14 08:18:30 np0005486759.ooo.test sudo[58262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:30 np0005486759.ooo.test sudo[58334]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:30 np0005486759.ooo.test sudo[58458]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axpnpvevrhwptehqmyzgatokqsgjoznu ; /usr/bin/python3
Oct 14 08:18:30 np0005486759.ooo.test sudo[58458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:30 np0005486759.ooo.test sudo[58466]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3esrzry3/privsep.sock
Oct 14 08:18:30 np0005486759.ooo.test sudo[58466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:30 np0005486759.ooo.test python3[58460]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:30 np0005486759.ooo.test sudo[58458]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:31 np0005486759.ooo.test sudo[58514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijxbpumrsacvfucqertdxiqfrdebvnsd ; /usr/bin/python3
Oct 14 08:18:31 np0005486759.ooo.test sudo[58514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:31 np0005486759.ooo.test sudo[58466]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:31 np0005486759.ooo.test sudo[58514]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:31 np0005486759.ooo.test sudo[58562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsbgfrrafwnmkeqryznmucfyjvgatkfe ; /usr/bin/python3
Oct 14 08:18:31 np0005486759.ooo.test sudo[58562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:31 np0005486759.ooo.test sudo[58568]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgm8w0jec/privsep.sock
Oct 14 08:18:31 np0005486759.ooo.test sudo[58568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:31 np0005486759.ooo.test sudo[58562]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:32 np0005486759.ooo.test sudo[58598]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpvcvantxovhbgdopeqzpdqgdwpfvshk ; /usr/bin/python3
Oct 14 08:18:32 np0005486759.ooo.test sudo[58598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:32 np0005486759.ooo.test python3[58600]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005486759 step=5 update_config_hash_only=False
Oct 14 08:18:32 np0005486759.ooo.test sudo[58598]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:32 np0005486759.ooo.test sudo[58568]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:32 np0005486759.ooo.test sudo[58619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-taetuldgpwrulfsqrolgdlxvnzmxwnts ; /usr/bin/python3
Oct 14 08:18:32 np0005486759.ooo.test sudo[58619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:32 np0005486759.ooo.test sudo[58625]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_hm5vwn5/privsep.sock
Oct 14 08:18:32 np0005486759.ooo.test sudo[58625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:32 np0005486759.ooo.test python3[58623]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:18:32 np0005486759.ooo.test sudo[58619]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:32 np0005486759.ooo.test sudo[58641]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opiynvigrsjycwidpoadtvvprjgvbcca ; /usr/bin/python3
Oct 14 08:18:32 np0005486759.ooo.test sudo[58641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Oct 14 08:18:33 np0005486759.ooo.test python3[58643]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Oct 14 08:18:33 np0005486759.ooo.test sudo[58641]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:33 np0005486759.ooo.test sudo[58625]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:33 np0005486759.ooo.test sudo[58652]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpis5utpe0/privsep.sock
Oct 14 08:18:33 np0005486759.ooo.test sudo[58652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:34 np0005486759.ooo.test sudo[58652]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:34 np0005486759.ooo.test sudo[58669]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdzvxtl1m/privsep.sock
Oct 14 08:18:34 np0005486759.ooo.test sudo[58669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:35 np0005486759.ooo.test sudo[58669]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:35 np0005486759.ooo.test sudo[58680]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmposmm3mdy/privsep.sock
Oct 14 08:18:35 np0005486759.ooo.test sudo[58680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:36 np0005486759.ooo.test sudo[58680]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:36 np0005486759.ooo.test sudo[58691]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdbsrt7d1/privsep.sock
Oct 14 08:18:36 np0005486759.ooo.test sudo[58691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:36 np0005486759.ooo.test sudo[58691]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:37 np0005486759.ooo.test sudo[58702]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxuufdbwq/privsep.sock
Oct 14 08:18:37 np0005486759.ooo.test sudo[58702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 0...
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Activating special unit Exit the Session...
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Stopped target Main User Target.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Stopped target Basic System.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Stopped target Paths.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Stopped target Sockets.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Stopped target Timers.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Closed D-Bus User Message Bus Socket.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Stopped Create User's Volatile Files and Directories.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Removed slice User Application Slice.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Reached target Shutdown.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Finished Exit the Session.
Oct 14 08:18:37 np0005486759.ooo.test systemd[57998]: Reached target Exit the Session.
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: user@0.service: Deactivated successfully.
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 0.
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 08:18:37 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 0.
Oct 14 08:18:37 np0005486759.ooo.test sudo[58702]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:38 np0005486759.ooo.test sudo[58714]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzcaqzems/privsep.sock
Oct 14 08:18:38 np0005486759.ooo.test sudo[58714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:38 np0005486759.ooo.test sudo[58714]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:39 np0005486759.ooo.test sudo[58725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp39irvacf/privsep.sock
Oct 14 08:18:39 np0005486759.ooo.test sudo[58725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:39 np0005486759.ooo.test sudo[58725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: tmp-crun.IxfU0y.mount: Deactivated successfully.
Oct 14 08:18:39 np0005486759.ooo.test podman[58735]: 2025-10-14 08:18:39.85053042 +0000 UTC m=+0.111059522 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step4, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:18:39 np0005486759.ooo.test podman[58735]: 2025-10-14 08:18:39.90139126 +0000 UTC m=+0.161920362 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Oct 14 08:18:39 np0005486759.ooo.test podman[58735]: unhealthy
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:18:39 np0005486759.ooo.test podman[58736]: 2025-10-14 08:18:39.923331865 +0000 UTC m=+0.180035763 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:18:39 np0005486759.ooo.test podman[58736]: 2025-10-14 08:18:39.956409334 +0000 UTC m=+0.213113202 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64)
Oct 14 08:18:39 np0005486759.ooo.test podman[58736]: unhealthy
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:18:39 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:18:39 np0005486759.ooo.test sudo[58788]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8f16ob_b/privsep.sock
Oct 14 08:18:39 np0005486759.ooo.test sudo[58788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:40 np0005486759.ooo.test podman[58734]: 2025-10-14 08:18:40.000865064 +0000 UTC m=+0.260413585 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:18:40 np0005486759.ooo.test podman[58734]: 2025-10-14 08:18:40.03660326 +0000 UTC m=+0.296151751 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git)
Oct 14 08:18:40 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:18:40 np0005486759.ooo.test sudo[58788]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:40 np0005486759.ooo.test systemd[1]: tmp-crun.PLx7Qi.mount: Deactivated successfully.
Oct 14 08:18:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:18:40 np0005486759.ooo.test sudo[58819]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4dm3whdn/privsep.sock
Oct 14 08:18:40 np0005486759.ooo.test sudo[58819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:40 np0005486759.ooo.test systemd[1]: tmp-crun.LhfC9F.mount: Deactivated successfully.
Oct 14 08:18:40 np0005486759.ooo.test podman[58804]: 2025-10-14 08:18:40.93834909 +0000 UTC m=+0.084203737 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:18:41 np0005486759.ooo.test podman[58804]: 2025-10-14 08:18:41.293235794 +0000 UTC m=+0.439090401 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12)
Oct 14 08:18:41 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:18:41 np0005486759.ooo.test sudo[58819]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:41 np0005486759.ooo.test sudo[58841]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppbe01ibd/privsep.sock
Oct 14 08:18:41 np0005486759.ooo.test sudo[58841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:42 np0005486759.ooo.test sudo[58841]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:42 np0005486759.ooo.test sudo[58950]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivqhshrwslobyktfyfgtqgfbmwpzottu ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429921.9277625-112791-104401084028879/AnsiballZ_setup.py
Oct 14 08:18:42 np0005486759.ooo.test sudo[58950]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:18:42 np0005486759.ooo.test sudo[58958]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm_ojrcdn/privsep.sock
Oct 14 08:18:42 np0005486759.ooo.test sudo[58958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:42 np0005486759.ooo.test python3[58952]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 08:18:42 np0005486759.ooo.test sudo[58950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:42 np0005486759.ooo.test sudo[59007]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmavzecxjrrwjxxgzxnhhkwfonpvoyaf ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760429921.9277625-112791-104401084028879/AnsiballZ_dnf.py
Oct 14 08:18:43 np0005486759.ooo.test sudo[59007]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:18:43 np0005486759.ooo.test python3[59009]: ansible-ansible.legacy.dnf Invoked with name=['crudini'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Oct 14 08:18:43 np0005486759.ooo.test sudo[58958]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:43 np0005486759.ooo.test sudo[59019]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwd7_vrg6/privsep.sock
Oct 14 08:18:43 np0005486759.ooo.test sudo[59019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:44 np0005486759.ooo.test sudo[59019]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:18:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:18:44 np0005486759.ooo.test systemd[1]: tmp-crun.vLmTIR.mount: Deactivated successfully.
Oct 14 08:18:44 np0005486759.ooo.test systemd[1]: tmp-crun.O8Iw2d.mount: Deactivated successfully.
Oct 14 08:18:44 np0005486759.ooo.test podman[59027]: 2025-10-14 08:18:44.402200572 +0000 UTC m=+0.108843891 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:18:44 np0005486759.ooo.test podman[59025]: 2025-10-14 08:18:44.369583569 +0000 UTC m=+0.082825173 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:18:44 np0005486759.ooo.test podman[59027]: 2025-10-14 08:18:44.471567805 +0000 UTC m=+0.178211134 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, release=1, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=)
Oct 14 08:18:44 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:18:44 np0005486759.ooo.test podman[59025]: 2025-10-14 08:18:44.531313123 +0000 UTC m=+0.244554757 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:18:44 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:18:44 np0005486759.ooo.test sudo[59077]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf9822gel/privsep.sock
Oct 14 08:18:44 np0005486759.ooo.test sudo[59077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:45 np0005486759.ooo.test sudo[59077]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:45 np0005486759.ooo.test sudo[59094]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyxbx3fat/privsep.sock
Oct 14 08:18:45 np0005486759.ooo.test sudo[59094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:46 np0005486759.ooo.test sudo[59094]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:46 np0005486759.ooo.test sudo[59107]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi50h5ko5/privsep.sock
Oct 14 08:18:46 np0005486759.ooo.test sudo[59107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:47 np0005486759.ooo.test sudo[59107]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:47 np0005486759.ooo.test sudo[59007]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:47 np0005486759.ooo.test sudo[59132]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8_8jfqqn/privsep.sock
Oct 14 08:18:47 np0005486759.ooo.test sudo[59132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:48 np0005486759.ooo.test sudo[59132]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:48 np0005486759.ooo.test sudo[59143]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2duhy6oc/privsep.sock
Oct 14 08:18:48 np0005486759.ooo.test sudo[59143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:48 np0005486759.ooo.test sudo[59143]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:49 np0005486759.ooo.test sudo[59154]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3fsya0vt/privsep.sock
Oct 14 08:18:49 np0005486759.ooo.test sudo[59154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:49 np0005486759.ooo.test sudo[59154]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:50 np0005486759.ooo.test sudo[59165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1rr_d1bd/privsep.sock
Oct 14 08:18:50 np0005486759.ooo.test sudo[59165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:50 np0005486759.ooo.test sudo[59165]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:50 np0005486759.ooo.test sudo[59182]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxhu0io25/privsep.sock
Oct 14 08:18:50 np0005486759.ooo.test sudo[59182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:51 np0005486759.ooo.test sudo[59182]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:51 np0005486759.ooo.test sudo[59193]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb4m5exv2/privsep.sock
Oct 14 08:18:51 np0005486759.ooo.test sudo[59193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:52 np0005486759.ooo.test sudo[59193]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:52 np0005486759.ooo.test sudo[59204]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg_ndluw9/privsep.sock
Oct 14 08:18:52 np0005486759.ooo.test sudo[59204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:53 np0005486759.ooo.test sudo[59204]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:53 np0005486759.ooo.test sudo[59215]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn1xtx7gb/privsep.sock
Oct 14 08:18:53 np0005486759.ooo.test sudo[59215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:18:53 np0005486759.ooo.test systemd[1]: tmp-crun.GuFdbT.mount: Deactivated successfully.
Oct 14 08:18:53 np0005486759.ooo.test podman[59217]: 2025-10-14 08:18:53.821604884 +0000 UTC m=+0.121602968 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1, container_name=metrics_qdr, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:18:54 np0005486759.ooo.test podman[59217]: 2025-10-14 08:18:54.002913196 +0000 UTC m=+0.302911310 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:18:54 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:18:54 np0005486759.ooo.test sudo[59215]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:54 np0005486759.ooo.test sudo[59255]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmac5sq7u/privsep.sock
Oct 14 08:18:54 np0005486759.ooo.test sudo[59255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:55 np0005486759.ooo.test sudo[59255]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:55 np0005486759.ooo.test sudo[59266]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxaxl6hvh/privsep.sock
Oct 14 08:18:55 np0005486759.ooo.test sudo[59266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:56 np0005486759.ooo.test sudo[59266]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:18:56 np0005486759.ooo.test podman[59276]: 2025-10-14 08:18:56.346462051 +0000 UTC m=+0.090345247 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, distribution-scope=public, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:18:56 np0005486759.ooo.test podman[59276]: 2025-10-14 08:18:56.382419304 +0000 UTC m=+0.126302500 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.buildah.version=1.33.12, release=2, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1)
Oct 14 08:18:56 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:18:56 np0005486759.ooo.test sudo[59303]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn9r5smpd/privsep.sock
Oct 14 08:18:56 np0005486759.ooo.test sudo[59303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:57 np0005486759.ooo.test sudo[59303]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:18:57 np0005486759.ooo.test systemd[1]: tmp-crun.ipuRyz.mount: Deactivated successfully.
Oct 14 08:18:57 np0005486759.ooo.test podman[59308]: 2025-10-14 08:18:57.271395517 +0000 UTC m=+0.101996077 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:18:57 np0005486759.ooo.test podman[59308]: 2025-10-14 08:18:57.32575597 +0000 UTC m=+0.156356540 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.9, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:18:57 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:18:57 np0005486759.ooo.test sudo[59339]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_mt4tt7h/privsep.sock
Oct 14 08:18:57 np0005486759.ooo.test sudo[59339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:58 np0005486759.ooo.test sudo[59339]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:18:58 np0005486759.ooo.test podman[59344]: 2025-10-14 08:18:58.113145791 +0000 UTC m=+0.064807555 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.openshift.expose-services=)
Oct 14 08:18:58 np0005486759.ooo.test podman[59344]: 2025-10-14 08:18:58.11986372 +0000 UTC m=+0.071525484 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step3, container_name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Oct 14 08:18:58 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:18:58 np0005486759.ooo.test sudo[59368]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5br627zw/privsep.sock
Oct 14 08:18:58 np0005486759.ooo.test sudo[59368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:58 np0005486759.ooo.test sudo[59368]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:59 np0005486759.ooo.test sudo[59379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps87614qo/privsep.sock
Oct 14 08:18:59 np0005486759.ooo.test sudo[59379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:18:59 np0005486759.ooo.test sudo[59379]: pam_unix(sudo:session): session closed for user root
Oct 14 08:18:59 np0005486759.ooo.test sudo[59390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxa2b8h2m/privsep.sock
Oct 14 08:18:59 np0005486759.ooo.test sudo[59390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:00 np0005486759.ooo.test sudo[59390]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:00 np0005486759.ooo.test sudo[59401]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7vdia2v3/privsep.sock
Oct 14 08:19:00 np0005486759.ooo.test sudo[59401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:01 np0005486759.ooo.test sudo[59401]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:01 np0005486759.ooo.test sudo[59418]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpixdnpt4v/privsep.sock
Oct 14 08:19:01 np0005486759.ooo.test sudo[59418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:02 np0005486759.ooo.test sudo[59418]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:02 np0005486759.ooo.test sudo[59429]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_wedy5r1/privsep.sock
Oct 14 08:19:02 np0005486759.ooo.test sudo[59429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:03 np0005486759.ooo.test sudo[59429]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:03 np0005486759.ooo.test sudo[59440]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbs8pd4xh/privsep.sock
Oct 14 08:19:03 np0005486759.ooo.test sudo[59440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:03 np0005486759.ooo.test sudo[59440]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:04 np0005486759.ooo.test sudo[59451]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfv51621x/privsep.sock
Oct 14 08:19:04 np0005486759.ooo.test sudo[59451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:04 np0005486759.ooo.test sudo[59451]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:05 np0005486759.ooo.test sudo[59462]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf5ksv4n7/privsep.sock
Oct 14 08:19:05 np0005486759.ooo.test sudo[59462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:05 np0005486759.ooo.test sudo[59462]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:05 np0005486759.ooo.test sudo[59473]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplep9o3fs/privsep.sock
Oct 14 08:19:05 np0005486759.ooo.test sudo[59473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:06 np0005486759.ooo.test sudo[59473]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:06 np0005486759.ooo.test sudo[59486]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplv7a7_hf/privsep.sock
Oct 14 08:19:06 np0005486759.ooo.test sudo[59486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:07 np0005486759.ooo.test sudo[59486]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:07 np0005486759.ooo.test sudo[59501]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz9xtu63k/privsep.sock
Oct 14 08:19:07 np0005486759.ooo.test sudo[59501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:08 np0005486759.ooo.test sudo[59501]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:08 np0005486759.ooo.test sudo[59512]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsvrphf8o/privsep.sock
Oct 14 08:19:08 np0005486759.ooo.test sudo[59512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:09 np0005486759.ooo.test sudo[59512]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:09 np0005486759.ooo.test sudo[59523]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg3ecz5io/privsep.sock
Oct 14 08:19:09 np0005486759.ooo.test sudo[59523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:09 np0005486759.ooo.test sudo[59523]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:19:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:19:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:19:10 np0005486759.ooo.test systemd[1]: tmp-crun.58gR7Z.mount: Deactivated successfully.
Oct 14 08:19:10 np0005486759.ooo.test podman[59529]: 2025-10-14 08:19:10.091266813 +0000 UTC m=+0.091661789 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:19:10 np0005486759.ooo.test podman[59529]: 2025-10-14 08:19:10.131784365 +0000 UTC m=+0.132179221 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-compute-container, 
container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T14:45:33, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, distribution-scope=public)
Oct 14 08:19:10 np0005486759.ooo.test podman[59529]: unhealthy
Oct 14 08:19:10 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:19:10 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:19:10 np0005486759.ooo.test podman[59530]: 2025-10-14 08:19:10.144424817 +0000 UTC m=+0.138975273 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, architecture=x86_64, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:19:10 np0005486759.ooo.test podman[59530]: 2025-10-14 08:19:10.189734385 +0000 UTC m=+0.184284831 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=ceilometer_agent_ipmi, version=17.1.9, maintainer=OpenStack TripleO Team)
Oct 14 08:19:10 np0005486759.ooo.test podman[59530]: unhealthy
Oct 14 08:19:10 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:19:10 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:19:10 np0005486759.ooo.test podman[59553]: 2025-10-14 08:19:10.229037048 +0000 UTC m=+0.132448012 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1)
Oct 14 08:19:10 np0005486759.ooo.test podman[59553]: 2025-10-14 08:19:10.237286776 +0000 UTC m=+0.140697730 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, release=1, io.buildah.version=1.33.12, version=17.1.9, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:19:10 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:19:10 np0005486759.ooo.test sudo[59593]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwou7lt7s/privsep.sock
Oct 14 08:19:10 np0005486759.ooo.test sudo[59593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:10 np0005486759.ooo.test sudo[59593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:11 np0005486759.ooo.test sudo[59604]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqkb0x6c3/privsep.sock
Oct 14 08:19:11 np0005486759.ooo.test sudo[59604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:19:11 np0005486759.ooo.test podman[59607]: 2025-10-14 08:19:11.45865485 +0000 UTC m=+0.078605854 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, build-date=2025-07-21T14:48:37, 
com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1)
Oct 14 08:19:11 np0005486759.ooo.test sudo[59604]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:11 np0005486759.ooo.test podman[59607]: 2025-10-14 08:19:11.839058527 +0000 UTC m=+0.459009591 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.33.12, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:19:11 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:19:11 np0005486759.ooo.test sudo[59636]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8kr72fws/privsep.sock
Oct 14 08:19:11 np0005486759.ooo.test sudo[59636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:12 np0005486759.ooo.test sudo[59636]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:12 np0005486759.ooo.test sudo[59653]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm40d256y/privsep.sock
Oct 14 08:19:12 np0005486759.ooo.test sudo[59653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:13 np0005486759.ooo.test sudo[59653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:13 np0005486759.ooo.test sudo[59664]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpov564uio/privsep.sock
Oct 14 08:19:13 np0005486759.ooo.test sudo[59664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:14 np0005486759.ooo.test sudo[59664]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:14 np0005486759.ooo.test sudo[59675]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgjp_zc65/privsep.sock
Oct 14 08:19:14 np0005486759.ooo.test sudo[59675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:19:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:19:14 np0005486759.ooo.test podman[59678]: 2025-10-14 08:19:14.70624391 +0000 UTC m=+0.073879331 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-07-21T13:28:44)
Oct 14 08:19:14 np0005486759.ooo.test systemd[1]: tmp-crun.tplnPH.mount: Deactivated successfully.
Oct 14 08:19:14 np0005486759.ooo.test podman[59677]: 2025-10-14 08:19:14.793173385 +0000 UTC m=+0.159814323 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T16:28:53, release=1, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:19:14 np0005486759.ooo.test podman[59678]: 2025-10-14 08:19:14.80897917 +0000 UTC m=+0.176614551 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, tcib_managed=true)
Oct 14 08:19:14 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:19:14 np0005486759.ooo.test podman[59677]: 2025-10-14 08:19:14.86352154 +0000 UTC m=+0.230162478 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.9, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64)
Oct 14 08:19:14 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:19:15 np0005486759.ooo.test sudo[59675]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:15 np0005486759.ooo.test sudo[59733]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphr9340ke/privsep.sock
Oct 14 08:19:15 np0005486759.ooo.test sudo[59733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:16 np0005486759.ooo.test sudo[59733]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:16 np0005486759.ooo.test sudo[59744]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8qfrqlhy/privsep.sock
Oct 14 08:19:16 np0005486759.ooo.test sudo[59744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:16 np0005486759.ooo.test sudo[59744]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:17 np0005486759.ooo.test sudo[59755]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_0jjkq6e/privsep.sock
Oct 14 08:19:17 np0005486759.ooo.test sudo[59755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:17 np0005486759.ooo.test sudo[59755]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:18 np0005486759.ooo.test sudo[59772]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdkc30egm/privsep.sock
Oct 14 08:19:18 np0005486759.ooo.test sudo[59772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:18 np0005486759.ooo.test sudo[59772]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:18 np0005486759.ooo.test sudo[59783]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpifaz74ye/privsep.sock
Oct 14 08:19:18 np0005486759.ooo.test sudo[59783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:19 np0005486759.ooo.test sudo[59783]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:19 np0005486759.ooo.test sudo[59794]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd6tz0w35/privsep.sock
Oct 14 08:19:19 np0005486759.ooo.test sudo[59794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:20 np0005486759.ooo.test sudo[59794]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:20 np0005486759.ooo.test sudo[59805]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7jwq0lg1/privsep.sock
Oct 14 08:19:20 np0005486759.ooo.test sudo[59805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:21 np0005486759.ooo.test sudo[59805]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:21 np0005486759.ooo.test sudo[59816]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp885bw8bf/privsep.sock
Oct 14 08:19:21 np0005486759.ooo.test sudo[59816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:22 np0005486759.ooo.test sudo[59816]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:22 np0005486759.ooo.test sudo[59827]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp15cyf6rz/privsep.sock
Oct 14 08:19:22 np0005486759.ooo.test sudo[59827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:23 np0005486759.ooo.test sudo[59827]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:23 np0005486759.ooo.test sudo[59844]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4o784yse/privsep.sock
Oct 14 08:19:23 np0005486759.ooo.test sudo[59844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:24 np0005486759.ooo.test sudo[59844]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:19:24 np0005486759.ooo.test systemd[1]: tmp-crun.6A1O4m.mount: Deactivated successfully.
Oct 14 08:19:24 np0005486759.ooo.test podman[59848]: 2025-10-14 08:19:24.276294325 +0000 UTC m=+0.082147680 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, tcib_managed=true, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git)
Oct 14 08:19:24 np0005486759.ooo.test podman[59848]: 2025-10-14 08:19:24.46686663 +0000 UTC m=+0.272720045 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, architecture=x86_64)
Oct 14 08:19:24 np0005486759.ooo.test sudo[59883]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppzvqsjev/privsep.sock
Oct 14 08:19:24 np0005486759.ooo.test sudo[59883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:24 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:19:25 np0005486759.ooo.test sudo[59883]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:25 np0005486759.ooo.test sudo[59895]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsd90r9z0/privsep.sock
Oct 14 08:19:25 np0005486759.ooo.test sudo[59895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:25 np0005486759.ooo.test sudo[59895]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:26 np0005486759.ooo.test sudo[59906]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprq37oi5q/privsep.sock
Oct 14 08:19:26 np0005486759.ooo.test sudo[59906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:26 np0005486759.ooo.test sudo[59906]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:19:26 np0005486759.ooo.test podman[59911]: 2025-10-14 08:19:26.800317175 +0000 UTC m=+0.068420703 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, version=17.1.9, config_id=tripleo_step3, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, tcib_managed=true, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Oct 14 08:19:26 np0005486759.ooo.test podman[59911]: 2025-10-14 08:19:26.814410624 +0000 UTC m=+0.082514142 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, release=2, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 08:19:26 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:19:26 np0005486759.ooo.test sudo[59937]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpswvzqyd6/privsep.sock
Oct 14 08:19:26 np0005486759.ooo.test sudo[59937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:19:27 np0005486759.ooo.test podman[59940]: 2025-10-14 08:19:27.452391182 +0000 UTC m=+0.080098454 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, release=1, architecture=x86_64, config_id=tripleo_step5)
Oct 14 08:19:27 np0005486759.ooo.test podman[59940]: 2025-10-14 08:19:27.481592394 +0000 UTC m=+0.109299626 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:19:27 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:19:27 np0005486759.ooo.test sudo[59937]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:27 np0005486759.ooo.test sudo[59974]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmparx_swr9/privsep.sock
Oct 14 08:19:27 np0005486759.ooo.test sudo[59974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:19:28 np0005486759.ooo.test podman[59977]: 2025-10-14 08:19:28.456054336 +0000 UTC m=+0.082276124 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1)
Oct 14 08:19:28 np0005486759.ooo.test podman[59977]: 2025-10-14 08:19:28.468221373 +0000 UTC m=+0.094443131 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, vcs-type=git, build-date=2025-07-21T13:27:15, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, vendor=Red Hat, Inc., release=1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:19:28 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:19:28 np0005486759.ooo.test sudo[59974]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:28 np0005486759.ooo.test sudo[60010]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnq35dwb8/privsep.sock
Oct 14 08:19:28 np0005486759.ooo.test sudo[60010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:29 np0005486759.ooo.test sudo[60010]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:29 np0005486759.ooo.test sudo[60021]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa2wa4hgl/privsep.sock
Oct 14 08:19:29 np0005486759.ooo.test sudo[60021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:30 np0005486759.ooo.test sudo[60021]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:30 np0005486759.ooo.test sudo[60032]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr2u4axw1/privsep.sock
Oct 14 08:19:30 np0005486759.ooo.test sudo[60032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:31 np0005486759.ooo.test sudo[60032]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:31 np0005486759.ooo.test sudo[60043]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5th1ks0n/privsep.sock
Oct 14 08:19:31 np0005486759.ooo.test sudo[60043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:31 np0005486759.ooo.test sudo[60043]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:32 np0005486759.ooo.test sudo[60054]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl01jkuxj/privsep.sock
Oct 14 08:19:32 np0005486759.ooo.test sudo[60054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:32 np0005486759.ooo.test sudo[60054]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:33 np0005486759.ooo.test sudo[60065]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1d3y1dil/privsep.sock
Oct 14 08:19:33 np0005486759.ooo.test sudo[60065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:33 np0005486759.ooo.test sudo[60065]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:33 np0005486759.ooo.test sudo[60079]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe0t_0_i0/privsep.sock
Oct 14 08:19:33 np0005486759.ooo.test sudo[60079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:34 np0005486759.ooo.test sudo[60079]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:34 np0005486759.ooo.test sudo[60093]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmpl5m9tw/privsep.sock
Oct 14 08:19:34 np0005486759.ooo.test sudo[60093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:35 np0005486759.ooo.test sudo[60093]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:35 np0005486759.ooo.test sudo[60104]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcmotyls5/privsep.sock
Oct 14 08:19:35 np0005486759.ooo.test sudo[60104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:36 np0005486759.ooo.test sudo[60104]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:36 np0005486759.ooo.test sudo[60115]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5hduusir/privsep.sock
Oct 14 08:19:36 np0005486759.ooo.test sudo[60115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:37 np0005486759.ooo.test sudo[60115]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:37 np0005486759.ooo.test sudo[60126]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt35ff5ak/privsep.sock
Oct 14 08:19:37 np0005486759.ooo.test sudo[60126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:38 np0005486759.ooo.test sudo[60126]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:38 np0005486759.ooo.test sudo[60137]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1jg7o59t/privsep.sock
Oct 14 08:19:38 np0005486759.ooo.test sudo[60137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:38 np0005486759.ooo.test sudo[60137]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:39 np0005486759.ooo.test sudo[60148]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwu09obip/privsep.sock
Oct 14 08:19:39 np0005486759.ooo.test sudo[60148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:39 np0005486759.ooo.test sudo[60148]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:40 np0005486759.ooo.test sudo[60165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmjdem2r5/privsep.sock
Oct 14 08:19:40 np0005486759.ooo.test sudo[60165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: tmp-crun.QA80RZ.mount: Deactivated successfully.
Oct 14 08:19:40 np0005486759.ooo.test podman[60168]: 2025-10-14 08:19:40.435361724 +0000 UTC m=+0.068148235 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, container_name=logrotate_crond, name=rhosp17/openstack-cron, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:19:40 np0005486759.ooo.test podman[60170]: 2025-10-14 08:19:40.45225039 +0000 UTC m=+0.081364678 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, tcib_managed=true, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:19:40 np0005486759.ooo.test podman[60169]: 2025-10-14 08:19:40.485840615 +0000 UTC m=+0.117691625 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_id=tripleo_step4, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, vcs-type=git, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public)
Oct 14 08:19:40 np0005486759.ooo.test podman[60170]: 2025-10-14 08:19:40.490487072 +0000 UTC m=+0.119601380 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.9, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:19:40 np0005486759.ooo.test podman[60170]: unhealthy
Oct 14 08:19:40 np0005486759.ooo.test podman[60169]: 2025-10-14 08:19:40.498663026 +0000 UTC m=+0.130514026 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:19:40 np0005486759.ooo.test podman[60169]: unhealthy
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:19:40 np0005486759.ooo.test podman[60168]: 2025-10-14 08:19:40.516738011 +0000 UTC m=+0.149524512 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, release=1, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:19:40 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:19:40 np0005486759.ooo.test sudo[60165]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:40 np0005486759.ooo.test sudo[60234]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaihiowho/privsep.sock
Oct 14 08:19:40 np0005486759.ooo.test sudo[60234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:41 np0005486759.ooo.test sudo[60234]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:41 np0005486759.ooo.test systemd[1]: tmp-crun.saxr7v.mount: Deactivated successfully.
Oct 14 08:19:41 np0005486759.ooo.test sudo[60245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppwr29che/privsep.sock
Oct 14 08:19:41 np0005486759.ooo.test sudo[60245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:42 np0005486759.ooo.test sudo[60245]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:19:42 np0005486759.ooo.test podman[60249]: 2025-10-14 08:19:42.418739031 +0000 UTC m=+0.071648893 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T14:48:37, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12)
Oct 14 08:19:42 np0005486759.ooo.test sudo[60276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx0nsxril/privsep.sock
Oct 14 08:19:42 np0005486759.ooo.test sudo[60276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:42 np0005486759.ooo.test podman[60249]: 2025-10-14 08:19:42.806807956 +0000 UTC m=+0.459717848 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:19:42 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:19:43 np0005486759.ooo.test sudo[60276]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:43 np0005486759.ooo.test sudo[60288]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp39vd7j9/privsep.sock
Oct 14 08:19:43 np0005486759.ooo.test sudo[60288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:44 np0005486759.ooo.test sudo[60288]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:44 np0005486759.ooo.test sudo[60299]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmi4xz0bk/privsep.sock
Oct 14 08:19:44 np0005486759.ooo.test sudo[60299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:45 np0005486759.ooo.test sudo[60299]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:19:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:19:45 np0005486759.ooo.test systemd[1]: tmp-crun.LDVvVC.mount: Deactivated successfully.
Oct 14 08:19:45 np0005486759.ooo.test podman[60312]: 2025-10-14 08:19:45.205375687 +0000 UTC m=+0.074914012 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step4, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:19:45 np0005486759.ooo.test podman[60309]: 2025-10-14 08:19:45.185738089 +0000 UTC m=+0.064587606 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, config_id=tripleo_step4)
Oct 14 08:19:45 np0005486759.ooo.test podman[60312]: 2025-10-14 08:19:45.249507186 +0000 UTC m=+0.119045531 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, tcib_managed=true, release=1, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, container_name=ovn_controller)
Oct 14 08:19:45 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:19:45 np0005486759.ooo.test podman[60309]: 2025-10-14 08:19:45.26517875 +0000 UTC m=+0.144028287 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1)
Oct 14 08:19:45 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:19:45 np0005486759.ooo.test sudo[60364]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkhadkq50/privsep.sock
Oct 14 08:19:45 np0005486759.ooo.test sudo[60364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:45 np0005486759.ooo.test sudo[60364]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:46 np0005486759.ooo.test sudo[60375]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1wnnou00/privsep.sock
Oct 14 08:19:46 np0005486759.ooo.test sudo[60375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:46 np0005486759.ooo.test sudo[60375]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:47 np0005486759.ooo.test sudo[60386]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp96nborhw/privsep.sock
Oct 14 08:19:47 np0005486759.ooo.test sudo[60386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:47 np0005486759.ooo.test sudo[60386]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:47 np0005486759.ooo.test sudo[60397]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq25hrgex/privsep.sock
Oct 14 08:19:47 np0005486759.ooo.test sudo[60397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:48 np0005486759.ooo.test sudo[60397]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:48 np0005486759.ooo.test sudo[60408]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp75xp1i59/privsep.sock
Oct 14 08:19:48 np0005486759.ooo.test sudo[60408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:49 np0005486759.ooo.test sudo[60408]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:49 np0005486759.ooo.test sudo[60419]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_ndtbqc2/privsep.sock
Oct 14 08:19:49 np0005486759.ooo.test sudo[60419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:50 np0005486759.ooo.test sudo[60419]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:50 np0005486759.ooo.test sudo[60436]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwpweduou/privsep.sock
Oct 14 08:19:50 np0005486759.ooo.test sudo[60436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:51 np0005486759.ooo.test sudo[60436]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:51 np0005486759.ooo.test sudo[60447]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphurmngs3/privsep.sock
Oct 14 08:19:51 np0005486759.ooo.test sudo[60447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:52 np0005486759.ooo.test sudo[60447]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:52 np0005486759.ooo.test sudo[60458]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0qq6ryd8/privsep.sock
Oct 14 08:19:52 np0005486759.ooo.test sudo[60458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:52 np0005486759.ooo.test sudo[60458]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:53 np0005486759.ooo.test sudo[60469]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjkpuizj2/privsep.sock
Oct 14 08:19:53 np0005486759.ooo.test sudo[60469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:53 np0005486759.ooo.test sudo[60469]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:54 np0005486759.ooo.test sudo[60480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2rz_unqx/privsep.sock
Oct 14 08:19:54 np0005486759.ooo.test sudo[60480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:54 np0005486759.ooo.test sudo[60480]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:19:54 np0005486759.ooo.test podman[60485]: 2025-10-14 08:19:54.756433651 +0000 UTC m=+0.083771669 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step1)
Oct 14 08:19:54 np0005486759.ooo.test sudo[60520]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnpe3u55p/privsep.sock
Oct 14 08:19:54 np0005486759.ooo.test sudo[60520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:54 np0005486759.ooo.test podman[60485]: 2025-10-14 08:19:54.970732213 +0000 UTC m=+0.298070171 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, managed_by=tripleo_ansible, release=1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, container_name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1)
Oct 14 08:19:54 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:19:55 np0005486759.ooo.test sudo[60520]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:55 np0005486759.ooo.test sudo[60534]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqtd16nyc/privsep.sock
Oct 14 08:19:55 np0005486759.ooo.test sudo[60534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:56 np0005486759.ooo.test sudo[60534]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:56 np0005486759.ooo.test sudo[60548]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphul05a9h/privsep.sock
Oct 14 08:19:56 np0005486759.ooo.test sudo[60548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:57 np0005486759.ooo.test sudo[60548]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:19:57 np0005486759.ooo.test podman[60554]: 2025-10-14 08:19:57.340176437 +0000 UTC m=+0.055749809 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, version=17.1.9, build-date=2025-07-21T13:04:03, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, release=2, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_id=tripleo_step3)
Oct 14 08:19:57 np0005486759.ooo.test podman[60554]: 2025-10-14 08:19:57.349404477 +0000 UTC m=+0.064977909 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=2, com.redhat.component=openstack-collectd-container, distribution-scope=public, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, container_name=collectd, vendor=Red Hat, Inc.)
Oct 14 08:19:57 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:19:57 np0005486759.ooo.test sudo[60577]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv9fw0cz3/privsep.sock
Oct 14 08:19:57 np0005486759.ooo.test sudo[60577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:19:57 np0005486759.ooo.test systemd[1]: tmp-crun.JQaCy7.mount: Deactivated successfully.
Oct 14 08:19:57 np0005486759.ooo.test podman[60579]: 2025-10-14 08:19:57.654425239 +0000 UTC m=+0.079989062 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., config_id=tripleo_step5, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Oct 14 08:19:57 np0005486759.ooo.test podman[60579]: 2025-10-14 08:19:57.705989036 +0000 UTC m=+0.131552929 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Oct 14 08:19:57 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:19:58 np0005486759.ooo.test sudo[60577]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:58 np0005486759.ooo.test sudo[60616]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgjjht9l6/privsep.sock
Oct 14 08:19:58 np0005486759.ooo.test sudo[60616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:19:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:19:58 np0005486759.ooo.test podman[60618]: 2025-10-14 08:19:58.676350245 +0000 UTC m=+0.086742668 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, architecture=x86_64, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:19:58 np0005486759.ooo.test podman[60618]: 2025-10-14 08:19:58.682937986 +0000 UTC m=+0.093330419 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, 
tcib_managed=true, name=rhosp17/openstack-iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container)
Oct 14 08:19:58 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:19:59 np0005486759.ooo.test sudo[60616]: pam_unix(sudo:session): session closed for user root
Oct 14 08:19:59 np0005486759.ooo.test sudo[60647]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdxq9xp1_/privsep.sock
Oct 14 08:19:59 np0005486759.ooo.test sudo[60647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:00 np0005486759.ooo.test sudo[60647]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:00 np0005486759.ooo.test sudo[60658]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpod6xlm_j/privsep.sock
Oct 14 08:20:00 np0005486759.ooo.test sudo[60658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:00 np0005486759.ooo.test sudo[60658]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:01 np0005486759.ooo.test sudo[60674]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd3mncam9/privsep.sock
Oct 14 08:20:01 np0005486759.ooo.test sudo[60674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:01 np0005486759.ooo.test sudo[60674]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:02 np0005486759.ooo.test sudo[60686]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8p0k7bsf/privsep.sock
Oct 14 08:20:02 np0005486759.ooo.test sudo[60686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:02 np0005486759.ooo.test sudo[60686]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:03 np0005486759.ooo.test sudo[60697]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpodyjiyqg/privsep.sock
Oct 14 08:20:03 np0005486759.ooo.test sudo[60697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:03 np0005486759.ooo.test sudo[60697]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:03 np0005486759.ooo.test sudo[60708]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0udr3qoc/privsep.sock
Oct 14 08:20:03 np0005486759.ooo.test sudo[60708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:04 np0005486759.ooo.test sudo[60708]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:04 np0005486759.ooo.test sudo[60719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8zkdputf/privsep.sock
Oct 14 08:20:04 np0005486759.ooo.test sudo[60719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:05 np0005486759.ooo.test sudo[60719]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:05 np0005486759.ooo.test sudo[60730]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmfl8b2ow/privsep.sock
Oct 14 08:20:05 np0005486759.ooo.test sudo[60730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:06 np0005486759.ooo.test sudo[60730]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:06 np0005486759.ooo.test sudo[60741]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp49oe0eaf/privsep.sock
Oct 14 08:20:06 np0005486759.ooo.test sudo[60741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:06 np0005486759.ooo.test sudo[60741]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:07 np0005486759.ooo.test sudo[60758]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxlbsm7_c/privsep.sock
Oct 14 08:20:07 np0005486759.ooo.test sudo[60758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:07 np0005486759.ooo.test sudo[60758]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:08 np0005486759.ooo.test sudo[60769]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6bmu2h1h/privsep.sock
Oct 14 08:20:08 np0005486759.ooo.test sudo[60769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:08 np0005486759.ooo.test sudo[60769]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:08 np0005486759.ooo.test sudo[60780]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy_r0_8s7/privsep.sock
Oct 14 08:20:08 np0005486759.ooo.test sudo[60780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:09 np0005486759.ooo.test sudo[60780]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:09 np0005486759.ooo.test sudo[60791]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwj4383pa/privsep.sock
Oct 14 08:20:09 np0005486759.ooo.test sudo[60791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:10 np0005486759.ooo.test sudo[60791]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:10 np0005486759.ooo.test sudo[60802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsm6vtd2w/privsep.sock
Oct 14 08:20:10 np0005486759.ooo.test sudo[60802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:20:10 np0005486759.ooo.test podman[60804]: 2025-10-14 08:20:10.719713684 +0000 UTC m=+0.068234569 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.33.12, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, com.redhat.component=openstack-cron-container, distribution-scope=public, architecture=x86_64, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c)
Oct 14 08:20:10 np0005486759.ooo.test podman[60804]: 2025-10-14 08:20:10.729216682 +0000 UTC m=+0.077737587 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, name=rhosp17/openstack-cron, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond)
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: tmp-crun.4oumec.mount: Deactivated successfully.
Oct 14 08:20:10 np0005486759.ooo.test podman[60805]: 2025-10-14 08:20:10.800029295 +0000 UTC m=+0.142450365 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, release=1, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO 
Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64)
Oct 14 08:20:10 np0005486759.ooo.test podman[60805]: 2025-10-14 08:20:10.81837422 +0000 UTC m=+0.160795340 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, version=17.1.9)
Oct 14 08:20:10 np0005486759.ooo.test podman[60805]: unhealthy
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:20:10 np0005486759.ooo.test podman[60806]: 2025-10-14 08:20:10.778645619 +0000 UTC m=+0.116160585 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.expose-services=, release=1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64)
Oct 14 08:20:10 np0005486759.ooo.test podman[60806]: 2025-10-14 08:20:10.861228626 +0000 UTC m=+0.198743602 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, build-date=2025-07-21T15:29:47, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 08:20:10 np0005486759.ooo.test podman[60806]: unhealthy
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:20:10 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:20:11 np0005486759.ooo.test sudo[60802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:11 np0005486759.ooo.test sudo[60870]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpay35mbrp/privsep.sock
Oct 14 08:20:11 np0005486759.ooo.test sudo[60870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:12 np0005486759.ooo.test sudo[60870]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:12 np0005486759.ooo.test sudo[60887]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu09pry4y/privsep.sock
Oct 14 08:20:12 np0005486759.ooo.test sudo[60887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:12 np0005486759.ooo.test sudo[60887]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:20:13 np0005486759.ooo.test podman[60891]: 2025-10-14 08:20:13.088792786 +0000 UTC m=+0.079924469 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:20:13 np0005486759.ooo.test sudo[60920]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptu_0jl04/privsep.sock
Oct 14 08:20:13 np0005486759.ooo.test sudo[60920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:13 np0005486759.ooo.test podman[60891]: 2025-10-14 08:20:13.485135068 +0000 UTC m=+0.476266791 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Oct 14 08:20:13 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:20:13 np0005486759.ooo.test sudo[60920]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:14 np0005486759.ooo.test sudo[60933]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmui0yac_/privsep.sock
Oct 14 08:20:14 np0005486759.ooo.test sudo[60933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:14 np0005486759.ooo.test sudo[60933]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:14 np0005486759.ooo.test sudo[60944]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm10f40t_/privsep.sock
Oct 14 08:20:14 np0005486759.ooo.test sudo[60944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:20:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:20:15 np0005486759.ooo.test podman[60947]: 2025-10-14 08:20:15.457264208 +0000 UTC m=+0.081091448 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, io.buildah.version=1.33.12, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 08:20:15 np0005486759.ooo.test podman[60947]: 2025-10-14 08:20:15.515281502 +0000 UTC m=+0.139108742 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, 
com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:20:15 np0005486759.ooo.test systemd[1]: tmp-crun.wzylMz.mount: Deactivated successfully.
Oct 14 08:20:15 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:20:15 np0005486759.ooo.test podman[60948]: 2025-10-14 08:20:15.521339215 +0000 UTC m=+0.140872242 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12)
Oct 14 08:20:15 np0005486759.ooo.test sudo[60944]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:15 np0005486759.ooo.test podman[60948]: 2025-10-14 08:20:15.605456014 +0000 UTC m=+0.224989081 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:20:15 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:20:15 np0005486759.ooo.test sudo[61003]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm07g3nbp/privsep.sock
Oct 14 08:20:15 np0005486759.ooo.test sudo[61003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:16 np0005486759.ooo.test sudo[61003]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:16 np0005486759.ooo.test sudo[61014]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0_zntc6m/privsep.sock
Oct 14 08:20:16 np0005486759.ooo.test sudo[61014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:17 np0005486759.ooo.test sudo[61014]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:17 np0005486759.ooo.test sudo[61030]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxyxttnja/privsep.sock
Oct 14 08:20:17 np0005486759.ooo.test sudo[61030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:18 np0005486759.ooo.test sudo[61030]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:18 np0005486759.ooo.test sudo[61042]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkfr2j48k/privsep.sock
Oct 14 08:20:18 np0005486759.ooo.test sudo[61042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:19 np0005486759.ooo.test sudo[61042]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:19 np0005486759.ooo.test sudo[61053]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpem74x4nv/privsep.sock
Oct 14 08:20:19 np0005486759.ooo.test sudo[61053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:19 np0005486759.ooo.test sudo[61053]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:20 np0005486759.ooo.test sudo[61064]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyx_zpq73/privsep.sock
Oct 14 08:20:20 np0005486759.ooo.test sudo[61064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:20 np0005486759.ooo.test sudo[61064]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:21 np0005486759.ooo.test sudo[61075]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6lvix83d/privsep.sock
Oct 14 08:20:21 np0005486759.ooo.test sudo[61075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:21 np0005486759.ooo.test sudo[61075]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:21 np0005486759.ooo.test sudo[61086]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgutmzofj/privsep.sock
Oct 14 08:20:21 np0005486759.ooo.test sudo[61086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:22 np0005486759.ooo.test sudo[61086]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:22 np0005486759.ooo.test sudo[61099]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2r67erij/privsep.sock
Oct 14 08:20:22 np0005486759.ooo.test sudo[61099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:23 np0005486759.ooo.test sudo[61099]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:23 np0005486759.ooo.test sudo[61114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp80cvmv_3/privsep.sock
Oct 14 08:20:23 np0005486759.ooo.test sudo[61114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:24 np0005486759.ooo.test sudo[61114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:24 np0005486759.ooo.test sudo[61125]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbcqn7_lj/privsep.sock
Oct 14 08:20:24 np0005486759.ooo.test sudo[61125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:25 np0005486759.ooo.test sudo[61125]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:20:25 np0005486759.ooo.test podman[61129]: 2025-10-14 08:20:25.109099419 +0000 UTC m=+0.055639915 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.9, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, distribution-scope=public, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:20:25 np0005486759.ooo.test sudo[61164]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn0wpfn_0/privsep.sock
Oct 14 08:20:25 np0005486759.ooo.test sudo[61164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:25 np0005486759.ooo.test podman[61129]: 2025-10-14 08:20:25.359878034 +0000 UTC m=+0.306418530 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, vcs-type=git, architecture=x86_64)
Oct 14 08:20:25 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:20:25 np0005486759.ooo.test sudo[61164]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:26 np0005486759.ooo.test sudo[61176]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf0yzbr_k/privsep.sock
Oct 14 08:20:26 np0005486759.ooo.test sudo[61176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:26 np0005486759.ooo.test sudo[61176]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:27 np0005486759.ooo.test sudo[61187]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7y6xcqui/privsep.sock
Oct 14 08:20:27 np0005486759.ooo.test sudo[61187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:27 np0005486759.ooo.test sudo[61187]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:20:27 np0005486759.ooo.test podman[61192]: 2025-10-14 08:20:27.802020374 +0000 UTC m=+0.086510150 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, release=2, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd)
Oct 14 08:20:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:20:27 np0005486759.ooo.test podman[61192]: 2025-10-14 08:20:27.823391871 +0000 UTC m=+0.107881717 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team)
Oct 14 08:20:27 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:20:27 np0005486759.ooo.test systemd[1]: tmp-crun.02Qv9A.mount: Deactivated successfully.
Oct 14 08:20:27 np0005486759.ooo.test podman[61210]: 2025-10-14 08:20:27.880547016 +0000 UTC m=+0.063311492 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, architecture=x86_64, release=1, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:20:27 np0005486759.ooo.test podman[61210]: 2025-10-14 08:20:27.92841255 +0000 UTC m=+0.111176976 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, distribution-scope=public, container_name=nova_compute, io.buildah.version=1.33.12, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:20:27 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:20:27 np0005486759.ooo.test sudo[61245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl06v34ri/privsep.sock
Oct 14 08:20:27 np0005486759.ooo.test sudo[61245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:28 np0005486759.ooo.test sudo[61245]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:28 np0005486759.ooo.test sudo[61262]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptd2ut1l7/privsep.sock
Oct 14 08:20:28 np0005486759.ooo.test sudo[61262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:20:28 np0005486759.ooo.test podman[61264]: 2025-10-14 08:20:28.87134475 +0000 UTC m=+0.086644075 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Oct 14 08:20:28 np0005486759.ooo.test podman[61264]: 2025-10-14 08:20:28.878380795 +0000 UTC m=+0.093679440 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3)
Oct 14 08:20:28 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:20:29 np0005486759.ooo.test sudo[61262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:29 np0005486759.ooo.test sudo[61292]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5k18d6kn/privsep.sock
Oct 14 08:20:29 np0005486759.ooo.test sudo[61292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:30 np0005486759.ooo.test sudo[61292]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:30 np0005486759.ooo.test sudo[61303]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpedo64hq2/privsep.sock
Oct 14 08:20:30 np0005486759.ooo.test sudo[61303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:31 np0005486759.ooo.test sudo[61303]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:31 np0005486759.ooo.test sudo[61314]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwaruhu0z/privsep.sock
Oct 14 08:20:31 np0005486759.ooo.test sudo[61314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:31 np0005486759.ooo.test sudo[61314]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:32 np0005486759.ooo.test sudo[61325]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxseijgut/privsep.sock
Oct 14 08:20:32 np0005486759.ooo.test sudo[61325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:32 np0005486759.ooo.test sudo[61325]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:33 np0005486759.ooo.test sudo[61336]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp21hqnz0i/privsep.sock
Oct 14 08:20:33 np0005486759.ooo.test sudo[61336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:33 np0005486759.ooo.test sudo[61336]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:34 np0005486759.ooo.test sudo[61353]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpls88pleu/privsep.sock
Oct 14 08:20:34 np0005486759.ooo.test sudo[61353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:34 np0005486759.ooo.test sudo[61353]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:34 np0005486759.ooo.test sudo[61364]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmkus_n84/privsep.sock
Oct 14 08:20:34 np0005486759.ooo.test sudo[61364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:35 np0005486759.ooo.test sudo[61364]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:35 np0005486759.ooo.test sudo[61375]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsblddnjp/privsep.sock
Oct 14 08:20:35 np0005486759.ooo.test sudo[61375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:36 np0005486759.ooo.test sudo[61375]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:36 np0005486759.ooo.test sudo[61386]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp53eo63t7/privsep.sock
Oct 14 08:20:36 np0005486759.ooo.test sudo[61386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:37 np0005486759.ooo.test sudo[61386]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:37 np0005486759.ooo.test sudo[61397]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpopk044wv/privsep.sock
Oct 14 08:20:37 np0005486759.ooo.test sudo[61397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:38 np0005486759.ooo.test sudo[61397]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:38 np0005486759.ooo.test sudo[61408]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_iypyo0h/privsep.sock
Oct 14 08:20:38 np0005486759.ooo.test sudo[61408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:38 np0005486759.ooo.test sudo[61408]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:39 np0005486759.ooo.test sudo[61425]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4_0s34s7/privsep.sock
Oct 14 08:20:39 np0005486759.ooo.test sudo[61425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:39 np0005486759.ooo.test sudo[61425]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:40 np0005486759.ooo.test sudo[61436]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphutxgho9/privsep.sock
Oct 14 08:20:40 np0005486759.ooo.test sudo[61436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:40 np0005486759.ooo.test sudo[61436]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:40 np0005486759.ooo.test sudo[61447]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsqrbw7e2/privsep.sock
Oct 14 08:20:40 np0005486759.ooo.test sudo[61447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:20:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:20:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:20:41 np0005486759.ooo.test podman[61450]: 2025-10-14 08:20:41.041422784 +0000 UTC m=+0.077672338 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:20:41 np0005486759.ooo.test podman[61450]: 2025-10-14 08:20:41.079325422 +0000 UTC m=+0.115574986 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1)
Oct 14 08:20:41 np0005486759.ooo.test podman[61450]: unhealthy
Oct 14 08:20:41 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:20:41 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:20:41 np0005486759.ooo.test podman[61456]: 2025-10-14 08:20:41.091029399 +0000 UTC m=+0.123115052 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T15:29:47, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:20:41 np0005486759.ooo.test podman[61449]: 2025-10-14 08:20:41.023145104 +0000 UTC m=+0.067474102 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.expose-services=)
Oct 14 08:20:41 np0005486759.ooo.test podman[61449]: 2025-10-14 08:20:41.163406267 +0000 UTC m=+0.207735325 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, container_name=logrotate_crond, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, batch=17.1_20250721.1, tcib_managed=true, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52)
Oct 14 08:20:41 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:20:41 np0005486759.ooo.test podman[61456]: 2025-10-14 08:20:41.180220278 +0000 UTC m=+0.212305941 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:20:41 np0005486759.ooo.test podman[61456]: unhealthy
Oct 14 08:20:41 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:20:41 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:20:41 np0005486759.ooo.test sudo[61447]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:41 np0005486759.ooo.test sudo[61518]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzjk9igmx/privsep.sock
Oct 14 08:20:41 np0005486759.ooo.test sudo[61518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:42 np0005486759.ooo.test sudo[61518]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:42 np0005486759.ooo.test sudo[61529]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpecpa_22b/privsep.sock
Oct 14 08:20:42 np0005486759.ooo.test sudo[61529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:43 np0005486759.ooo.test sudo[61529]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:43 np0005486759.ooo.test sudo[61540]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbosmu2lo/privsep.sock
Oct 14 08:20:43 np0005486759.ooo.test sudo[61540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:20:43 np0005486759.ooo.test podman[61542]: 2025-10-14 08:20:43.620182521 +0000 UTC m=+0.065620020 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, container_name=nova_migration_target, vcs-type=git, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=, batch=17.1_20250721.1, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:20:43 np0005486759.ooo.test podman[61542]: 2025-10-14 08:20:43.969309547 +0000 UTC m=+0.414747086 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, container_name=nova_migration_target, version=17.1.9, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible)
Oct 14 08:20:43 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:20:44 np0005486759.ooo.test sudo[61540]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:44 np0005486759.ooo.test sudo[61576]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmn874kk3/privsep.sock
Oct 14 08:20:44 np0005486759.ooo.test sudo[61576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:44 np0005486759.ooo.test sudo[61576]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:45 np0005486759.ooo.test sudo[61591]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc0lg5sa5/privsep.sock
Oct 14 08:20:45 np0005486759.ooo.test sudo[61591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:45 np0005486759.ooo.test sudo[61591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:20:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:20:45 np0005486759.ooo.test podman[61598]: 2025-10-14 08:20:45.995041313 +0000 UTC m=+0.073372503 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.9, container_name=ovn_controller, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:20:46 np0005486759.ooo.test podman[61598]: 2025-10-14 08:20:46.017095491 +0000 UTC m=+0.095426682 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.9)
Oct 14 08:20:46 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:20:46 np0005486759.ooo.test podman[61595]: 2025-10-14 08:20:46.100041699 +0000 UTC m=+0.177394695 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, vcs-type=git, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:20:46 np0005486759.ooo.test podman[61595]: 2025-10-14 08:20:46.16281497 +0000 UTC m=+0.240167946 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:20:46 np0005486759.ooo.test sudo[61645]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfi55jd_3/privsep.sock
Oct 14 08:20:46 np0005486759.ooo.test sudo[61645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:46 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:20:46 np0005486759.ooo.test sudo[61645]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:47 np0005486759.ooo.test sudo[61658]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplhtwjf_1/privsep.sock
Oct 14 08:20:47 np0005486759.ooo.test sudo[61658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:47 np0005486759.ooo.test sudo[61658]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:47 np0005486759.ooo.test sudo[61669]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfu2694jj/privsep.sock
Oct 14 08:20:47 np0005486759.ooo.test sudo[61669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:48 np0005486759.ooo.test sudo[61669]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:48 np0005486759.ooo.test sudo[61680]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq2anopli/privsep.sock
Oct 14 08:20:48 np0005486759.ooo.test sudo[61680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:49 np0005486759.ooo.test sudo[61680]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:49 np0005486759.ooo.test sudo[61691]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxy5bn9oe/privsep.sock
Oct 14 08:20:49 np0005486759.ooo.test sudo[61691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:50 np0005486759.ooo.test sudo[61691]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:50 np0005486759.ooo.test sudo[61708]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk4dardmn/privsep.sock
Oct 14 08:20:50 np0005486759.ooo.test sudo[61708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:51 np0005486759.ooo.test sudo[61708]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:51 np0005486759.ooo.test sudo[61719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuq4suuku/privsep.sock
Oct 14 08:20:51 np0005486759.ooo.test sudo[61719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:51 np0005486759.ooo.test sudo[61719]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:52 np0005486759.ooo.test sudo[61730]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2z_p7lxu/privsep.sock
Oct 14 08:20:52 np0005486759.ooo.test sudo[61730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:52 np0005486759.ooo.test sudo[61730]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:53 np0005486759.ooo.test sudo[61741]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7ab7h78k/privsep.sock
Oct 14 08:20:53 np0005486759.ooo.test sudo[61741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:53 np0005486759.ooo.test sudo[61741]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:53 np0005486759.ooo.test sudo[61752]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa1nbhejw/privsep.sock
Oct 14 08:20:53 np0005486759.ooo.test sudo[61752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:54 np0005486759.ooo.test sudo[61752]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:54 np0005486759.ooo.test sudo[61763]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1z9p6a4q/privsep.sock
Oct 14 08:20:54 np0005486759.ooo.test sudo[61763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:55 np0005486759.ooo.test sudo[61763]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:55 np0005486759.ooo.test sudo[61780]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvs17vnjw/privsep.sock
Oct 14 08:20:55 np0005486759.ooo.test sudo[61780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:20:55 np0005486759.ooo.test systemd[1]: tmp-crun.Bm7Mag.mount: Deactivated successfully.
Oct 14 08:20:55 np0005486759.ooo.test podman[61781]: 2025-10-14 08:20:55.759121818 +0000 UTC m=+0.105209725 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:20:55 np0005486759.ooo.test podman[61781]: 2025-10-14 08:20:55.951332576 +0000 UTC m=+0.297420403 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9)
Oct 14 08:20:55 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:20:56 np0005486759.ooo.test sudo[61780]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:56 np0005486759.ooo.test sudo[61820]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy7uvsap6/privsep.sock
Oct 14 08:20:56 np0005486759.ooo.test sudo[61820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:57 np0005486759.ooo.test sudo[61820]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:57 np0005486759.ooo.test sudo[61831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7kkv9jk6/privsep.sock
Oct 14 08:20:57 np0005486759.ooo.test sudo[61831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:58 np0005486759.ooo.test sudo[61831]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:20:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:20:58 np0005486759.ooo.test podman[61838]: 2025-10-14 08:20:58.098536126 +0000 UTC m=+0.069907675 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.33.12, version=17.1.9, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03)
Oct 14 08:20:58 np0005486759.ooo.test podman[61838]: 2025-10-14 08:20:58.135376087 +0000 UTC m=+0.106747626 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, vcs-type=git, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03)
Oct 14 08:20:58 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:20:58 np0005486759.ooo.test podman[61835]: 2025-10-14 08:20:58.203917485 +0000 UTC m=+0.177345374 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=nova_compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, version=17.1.9, config_id=tripleo_step5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git)
Oct 14 08:20:58 np0005486759.ooo.test podman[61835]: 2025-10-14 08:20:58.246677397 +0000 UTC m=+0.220105256 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:20:58 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:20:58 np0005486759.ooo.test sudo[61888]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpucnjt5mk/privsep.sock
Oct 14 08:20:58 np0005486759.ooo.test sudo[61888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:58 np0005486759.ooo.test sudo[61888]: pam_unix(sudo:session): session closed for user root
Oct 14 08:20:59 np0005486759.ooo.test sudo[61899]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3ct4cx7i/privsep.sock
Oct 14 08:20:59 np0005486759.ooo.test sudo[61899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:20:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:20:59 np0005486759.ooo.test podman[61900]: 2025-10-14 08:20:59.172433186 +0000 UTC m=+0.065854287 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20250721.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12)
Oct 14 08:20:59 np0005486759.ooo.test podman[61900]: 2025-10-14 08:20:59.180301513 +0000 UTC m=+0.073722654 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, release=1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid)
Oct 14 08:20:59 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:20:59 np0005486759.ooo.test sudo[61899]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:00 np0005486759.ooo.test sudo[61929]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe37fvdbi/privsep.sock
Oct 14 08:21:00 np0005486759.ooo.test sudo[61929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:00 np0005486759.ooo.test sudo[61929]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:01 np0005486759.ooo.test sudo[61946]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplunk0vmk/privsep.sock
Oct 14 08:21:01 np0005486759.ooo.test sudo[61946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:01 np0005486759.ooo.test sudo[61946]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:01 np0005486759.ooo.test sudo[61957]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphxtemtni/privsep.sock
Oct 14 08:21:02 np0005486759.ooo.test sudo[61957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:02 np0005486759.ooo.test sudo[61957]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:02 np0005486759.ooo.test sudo[61968]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1c8bsjxg/privsep.sock
Oct 14 08:21:02 np0005486759.ooo.test sudo[61968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:03 np0005486759.ooo.test sudo[61968]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:03 np0005486759.ooo.test sudo[61979]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqqb2rim5/privsep.sock
Oct 14 08:21:03 np0005486759.ooo.test sudo[61979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:04 np0005486759.ooo.test sudo[61979]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:04 np0005486759.ooo.test sudo[61990]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbi0yz15i/privsep.sock
Oct 14 08:21:04 np0005486759.ooo.test sudo[61990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:05 np0005486759.ooo.test sudo[61990]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:05 np0005486759.ooo.test sudo[62001]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp33egcxsw/privsep.sock
Oct 14 08:21:05 np0005486759.ooo.test sudo[62001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:06 np0005486759.ooo.test sudo[62001]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:06 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:21:06 np0005486759.ooo.test recover_tripleo_nova_virtqemud[62013]: 47951
Oct 14 08:21:06 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:21:06 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:21:06 np0005486759.ooo.test sudo[62020]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4srnyi0g/privsep.sock
Oct 14 08:21:06 np0005486759.ooo.test sudo[62020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:07 np0005486759.ooo.test sudo[62020]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:07 np0005486759.ooo.test sudo[62031]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcbmdeb99/privsep.sock
Oct 14 08:21:07 np0005486759.ooo.test sudo[62031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:07 np0005486759.ooo.test sudo[62031]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:08 np0005486759.ooo.test sudo[62042]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpskili97f/privsep.sock
Oct 14 08:21:08 np0005486759.ooo.test sudo[62042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:08 np0005486759.ooo.test sudo[62042]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:09 np0005486759.ooo.test sudo[62053]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvr69vha5/privsep.sock
Oct 14 08:21:09 np0005486759.ooo.test sudo[62053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:09 np0005486759.ooo.test sudo[62053]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:09 np0005486759.ooo.test sudo[62064]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprr9j86wd/privsep.sock
Oct 14 08:21:09 np0005486759.ooo.test sudo[62064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:10 np0005486759.ooo.test sudo[62064]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:10 np0005486759.ooo.test sudo[62075]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphz_t7lyj/privsep.sock
Oct 14 08:21:10 np0005486759.ooo.test sudo[62075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:21:11 np0005486759.ooo.test sudo[62075]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:11 np0005486759.ooo.test podman[62082]: 2025-10-14 08:21:11.492556475 +0000 UTC m=+0.108203435 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Oct 14 08:21:11 np0005486759.ooo.test podman[62088]: 2025-10-14 08:21:11.534012144 +0000 UTC m=+0.146325391 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:21:11 np0005486759.ooo.test podman[62082]: 2025-10-14 08:21:11.555334818 +0000 UTC m=+0.170981758 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, architecture=x86_64, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., 
io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 08:21:11 np0005486759.ooo.test podman[62082]: unhealthy
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:21:11 np0005486759.ooo.test podman[62088]: 2025-10-14 08:21:11.577452369 +0000 UTC m=+0.189765586 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4)
Oct 14 08:21:11 np0005486759.ooo.test podman[62088]: unhealthy
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:21:11 np0005486759.ooo.test podman[62081]: 2025-10-14 08:21:11.459000416 +0000 UTC m=+0.083414214 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, release=1, architecture=x86_64, build-date=2025-07-21T13:07:52, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible)
Oct 14 08:21:11 np0005486759.ooo.test sudo[62147]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5rwypjia/privsep.sock
Oct 14 08:21:11 np0005486759.ooo.test sudo[62147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:11 np0005486759.ooo.test podman[62081]: 2025-10-14 08:21:11.647668534 +0000 UTC m=+0.272082372 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, 
container_name=logrotate_crond, release=1, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:21:11 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:21:12 np0005486759.ooo.test sudo[62147]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:12 np0005486759.ooo.test sudo[62159]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwwda32vq/privsep.sock
Oct 14 08:21:12 np0005486759.ooo.test sudo[62159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:13 np0005486759.ooo.test sudo[62159]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:13 np0005486759.ooo.test sudo[62170]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqol5rr1y/privsep.sock
Oct 14 08:21:13 np0005486759.ooo.test sudo[62170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:13 np0005486759.ooo.test sudo[62170]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:21:14 np0005486759.ooo.test podman[62174]: 2025-10-14 08:21:14.086029882 +0000 UTC m=+0.071488450 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:21:14 np0005486759.ooo.test sudo[62202]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_5v3g0vh/privsep.sock
Oct 14 08:21:14 np0005486759.ooo.test sudo[62202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:14 np0005486759.ooo.test podman[62174]: 2025-10-14 08:21:14.442023711 +0000 UTC m=+0.427482269 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.expose-services=, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:21:14 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:21:14 np0005486759.ooo.test sudo[62202]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:15 np0005486759.ooo.test sudo[62213]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9rxd518l/privsep.sock
Oct 14 08:21:15 np0005486759.ooo.test sudo[62213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:15 np0005486759.ooo.test sudo[62213]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:15 np0005486759.ooo.test sudo[62224]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt0_le6o5/privsep.sock
Oct 14 08:21:15 np0005486759.ooo.test sudo[62224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:21:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:21:16 np0005486759.ooo.test podman[62227]: 2025-10-14 08:21:16.435722479 +0000 UTC m=+0.069548803 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Oct 14 08:21:16 np0005486759.ooo.test podman[62228]: 2025-10-14 08:21:16.452057254 +0000 UTC m=+0.080123453 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.buildah.version=1.33.12, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:21:16 np0005486759.ooo.test podman[62228]: 2025-10-14 08:21:16.494692772 +0000 UTC m=+0.122758961 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, vcs-type=git, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, version=17.1.9, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public)
Oct 14 08:21:16 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:21:16 np0005486759.ooo.test sudo[62224]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:16 np0005486759.ooo.test podman[62227]: 2025-10-14 08:21:16.546855693 +0000 UTC m=+0.180682027 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, release=1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, container_name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.expose-services=)
Oct 14 08:21:16 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:21:16 np0005486759.ooo.test sudo[62283]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9x_ytbkt/privsep.sock
Oct 14 08:21:16 np0005486759.ooo.test sudo[62283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:17 np0005486759.ooo.test sudo[62283]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:17 np0005486759.ooo.test sudo[62298]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpssa36uvm/privsep.sock
Oct 14 08:21:17 np0005486759.ooo.test sudo[62298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:18 np0005486759.ooo.test sudo[62298]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:18 np0005486759.ooo.test sudo[62309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptk30rfrk/privsep.sock
Oct 14 08:21:18 np0005486759.ooo.test sudo[62309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:19 np0005486759.ooo.test sudo[62309]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:19 np0005486759.ooo.test sudo[62320]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9wopht1f/privsep.sock
Oct 14 08:21:19 np0005486759.ooo.test sudo[62320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:19 np0005486759.ooo.test sudo[62320]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:20 np0005486759.ooo.test sudo[62331]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz4y41v26/privsep.sock
Oct 14 08:21:20 np0005486759.ooo.test sudo[62331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:20 np0005486759.ooo.test sudo[62331]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:21 np0005486759.ooo.test sudo[62342]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8oi9q6s6/privsep.sock
Oct 14 08:21:21 np0005486759.ooo.test sudo[62342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:21 np0005486759.ooo.test sudo[62342]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:22 np0005486759.ooo.test sudo[62353]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpee812wa1/privsep.sock
Oct 14 08:21:22 np0005486759.ooo.test sudo[62353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:22 np0005486759.ooo.test sudo[62353]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:22 np0005486759.ooo.test sudo[62370]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpowi2ei1b/privsep.sock
Oct 14 08:21:22 np0005486759.ooo.test sudo[62370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:23 np0005486759.ooo.test sudo[62370]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:23 np0005486759.ooo.test sudo[62381]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5fos6hy4/privsep.sock
Oct 14 08:21:23 np0005486759.ooo.test sudo[62381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:24 np0005486759.ooo.test sudo[62381]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:24 np0005486759.ooo.test sudo[62392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw5yyovwf/privsep.sock
Oct 14 08:21:24 np0005486759.ooo.test sudo[62392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:25 np0005486759.ooo.test sudo[62392]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:25 np0005486759.ooo.test sudo[62403]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk7r64ymf/privsep.sock
Oct 14 08:21:25 np0005486759.ooo.test sudo[62403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:26 np0005486759.ooo.test sudo[62403]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:21:26 np0005486759.ooo.test podman[62409]: 2025-10-14 08:21:26.266273411 +0000 UTC m=+0.075880068 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, release=1, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 08:21:26 np0005486759.ooo.test sudo[62443]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5tjixhfh/privsep.sock
Oct 14 08:21:26 np0005486759.ooo.test sudo[62443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:26 np0005486759.ooo.test podman[62409]: 2025-10-14 08:21:26.482266816 +0000 UTC m=+0.291873403 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-qdrouterd, release=1, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Oct 14 08:21:26 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:21:26 np0005486759.ooo.test sudo[62443]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:27 np0005486759.ooo.test sudo[62454]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd2lmdbqd/privsep.sock
Oct 14 08:21:27 np0005486759.ooo.test sudo[62454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:27 np0005486759.ooo.test sudo[62454]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:28 np0005486759.ooo.test sudo[62471]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn_88wxp2/privsep.sock
Oct 14 08:21:28 np0005486759.ooo.test sudo[62471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:21:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:21:28 np0005486759.ooo.test podman[62475]: 2025-10-14 08:21:28.469812694 +0000 UTC m=+0.087867075 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp 
osp openstack osp-17.1, batch=17.1_20250721.1, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=2, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, container_name=collectd)
Oct 14 08:21:28 np0005486759.ooo.test podman[62475]: 2025-10-14 08:21:28.501852343 +0000 UTC m=+0.119906684 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=2, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:21:28 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:21:28 np0005486759.ooo.test systemd[1]: tmp-crun.YjHjUP.mount: Deactivated successfully.
Oct 14 08:21:28 np0005486759.ooo.test podman[62474]: 2025-10-14 08:21:28.596399754 +0000 UTC m=+0.219112463 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:21:28 np0005486759.ooo.test podman[62474]: 2025-10-14 08:21:28.64605963 +0000 UTC m=+0.268772379 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, release=1, container_name=nova_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:21:28 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:21:28 np0005486759.ooo.test sudo[62471]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:29 np0005486759.ooo.test sudo[62528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp50lzk6lb/privsep.sock
Oct 14 08:21:29 np0005486759.ooo.test sudo[62528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:21:29 np0005486759.ooo.test podman[62531]: 2025-10-14 08:21:29.434586599 +0000 UTC m=+0.063374603 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible)
Oct 14 08:21:29 np0005486759.ooo.test podman[62531]: 2025-10-14 08:21:29.47141166 +0000 UTC m=+0.100199684 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:21:29 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:21:29 np0005486759.ooo.test sudo[62528]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:29 np0005486759.ooo.test sudo[62558]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaq6enl2b/privsep.sock
Oct 14 08:21:29 np0005486759.ooo.test sudo[62558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:30 np0005486759.ooo.test sudo[62558]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:30 np0005486759.ooo.test sudo[62569]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyh7mz04u/privsep.sock
Oct 14 08:21:30 np0005486759.ooo.test sudo[62569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:31 np0005486759.ooo.test sudo[62569]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:31 np0005486759.ooo.test sudo[62580]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppgxenl9q/privsep.sock
Oct 14 08:21:31 np0005486759.ooo.test sudo[62580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:32 np0005486759.ooo.test sudo[62580]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:32 np0005486759.ooo.test sudo[62591]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpukf37nda/privsep.sock
Oct 14 08:21:32 np0005486759.ooo.test sudo[62591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:33 np0005486759.ooo.test sudo[62591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:33 np0005486759.ooo.test sudo[62607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzjkqux95/privsep.sock
Oct 14 08:21:33 np0005486759.ooo.test sudo[62607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:33 np0005486759.ooo.test sudo[62607]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:34 np0005486759.ooo.test sudo[62619]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7lx0rveu/privsep.sock
Oct 14 08:21:34 np0005486759.ooo.test sudo[62619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:34 np0005486759.ooo.test sudo[62619]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:34 np0005486759.ooo.test sudo[62630]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphj5xlvlk/privsep.sock
Oct 14 08:21:34 np0005486759.ooo.test sudo[62630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:35 np0005486759.ooo.test sudo[62630]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:35 np0005486759.ooo.test sudo[62641]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzvvg90hh/privsep.sock
Oct 14 08:21:35 np0005486759.ooo.test sudo[62641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:36 np0005486759.ooo.test sudo[62641]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:36 np0005486759.ooo.test sudo[62652]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_c5a8i5h/privsep.sock
Oct 14 08:21:36 np0005486759.ooo.test sudo[62652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:37 np0005486759.ooo.test sudo[62652]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:37 np0005486759.ooo.test sudo[62663]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpabmds9qk/privsep.sock
Oct 14 08:21:37 np0005486759.ooo.test sudo[62663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:38 np0005486759.ooo.test sudo[62663]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:38 np0005486759.ooo.test sudo[62674]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpogo38ixr/privsep.sock
Oct 14 08:21:38 np0005486759.ooo.test sudo[62674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:38 np0005486759.ooo.test sudo[62674]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:39 np0005486759.ooo.test sudo[62691]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps1t018qa/privsep.sock
Oct 14 08:21:39 np0005486759.ooo.test sudo[62691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:39 np0005486759.ooo.test sudo[62691]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:40 np0005486759.ooo.test sudo[62702]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvfocfjto/privsep.sock
Oct 14 08:21:40 np0005486759.ooo.test sudo[62702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:40 np0005486759.ooo.test sudo[62702]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:40 np0005486759.ooo.test sudo[62713]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpspmagyz9/privsep.sock
Oct 14 08:21:40 np0005486759.ooo.test sudo[62713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:41 np0005486759.ooo.test sudo[62713]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:41 np0005486759.ooo.test sudo[62724]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplmbce892/privsep.sock
Oct 14 08:21:41 np0005486759.ooo.test sudo[62724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:21:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:21:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:21:41 np0005486759.ooo.test podman[62726]: 2025-10-14 08:21:41.913574113 +0000 UTC m=+0.093174505 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-type=git, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 
cron, container_name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, distribution-scope=public)
Oct 14 08:21:41 np0005486759.ooo.test podman[62726]: 2025-10-14 08:21:41.943264612 +0000 UTC m=+0.122864974 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, 
com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, config_id=tripleo_step4, distribution-scope=public, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, container_name=logrotate_crond, io.buildah.version=1.33.12, managed_by=tripleo_ansible)
Oct 14 08:21:41 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:21:41 np0005486759.ooo.test podman[62727]: 2025-10-14 08:21:41.955046562 +0000 UTC m=+0.132141479 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-07-21T14:45:33, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:21:41 np0005486759.ooo.test podman[62727]: 2025-10-14 08:21:41.963677405 +0000 UTC m=+0.140772362 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, version=17.1.9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, tcib_managed=true, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vendor=Red Hat, Inc.)
Oct 14 08:21:41 np0005486759.ooo.test podman[62727]: unhealthy
Oct 14 08:21:41 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:21:41 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:21:42 np0005486759.ooo.test podman[62728]: 2025-10-14 08:21:42.010587038 +0000 UTC m=+0.183421810 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git)
Oct 14 08:21:42 np0005486759.ooo.test podman[62728]: 2025-10-14 08:21:42.023221028 +0000 UTC m=+0.196055780 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, distribution-scope=public, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12)
Oct 14 08:21:42 np0005486759.ooo.test podman[62728]: unhealthy
Oct 14 08:21:42 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:21:42 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:21:42 np0005486759.ooo.test sudo[62724]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:42 np0005486759.ooo.test sudo[62791]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7c427_sz/privsep.sock
Oct 14 08:21:42 np0005486759.ooo.test sudo[62791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:43 np0005486759.ooo.test sudo[62791]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:43 np0005486759.ooo.test sudo[62802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvftrqmpg/privsep.sock
Oct 14 08:21:43 np0005486759.ooo.test sudo[62802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:44 np0005486759.ooo.test sudo[62802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:44 np0005486759.ooo.test sudo[62819]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3erd1g7w/privsep.sock
Oct 14 08:21:44 np0005486759.ooo.test sudo[62819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:45 np0005486759.ooo.test sudo[62819]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:21:45 np0005486759.ooo.test podman[62823]: 2025-10-14 08:21:45.125505986 +0000 UTC m=+0.057568464 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:21:45 np0005486759.ooo.test sudo[62853]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdpdzuwl8/privsep.sock
Oct 14 08:21:45 np0005486759.ooo.test sudo[62853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:45 np0005486759.ooo.test podman[62823]: 2025-10-14 08:21:45.512786247 +0000 UTC m=+0.444848665 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, vcs-type=git, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T14:48:37, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.9, 
com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1)
Oct 14 08:21:45 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:21:45 np0005486759.ooo.test sudo[62853]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:46 np0005486759.ooo.test sudo[62865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyfnm6se5/privsep.sock
Oct 14 08:21:46 np0005486759.ooo.test sudo[62865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:46 np0005486759.ooo.test sudo[62865]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:21:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:21:46 np0005486759.ooo.test podman[62871]: 2025-10-14 08:21:46.915827685 +0000 UTC m=+0.068047983 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., release=1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, maintainer=OpenStack TripleO Team)
Oct 14 08:21:46 np0005486759.ooo.test podman[62871]: 2025-10-14 08:21:46.948719612 +0000 UTC m=+0.100939890 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53)
Oct 14 08:21:46 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:21:47 np0005486759.ooo.test podman[62872]: 2025-10-14 08:21:47.022412327 +0000 UTC m=+0.170102147 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, build-date=2025-07-21T13:28:44)
Oct 14 08:21:47 np0005486759.ooo.test podman[62872]: 2025-10-14 08:21:47.043367418 +0000 UTC m=+0.191057348 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, container_name=ovn_controller)
Oct 14 08:21:47 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:21:47 np0005486759.ooo.test sudo[62923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa1liavbf/privsep.sock
Oct 14 08:21:47 np0005486759.ooo.test sudo[62923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:47 np0005486759.ooo.test sudo[62923]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:47 np0005486759.ooo.test sudo[62934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2zoru1rf/privsep.sock
Oct 14 08:21:47 np0005486759.ooo.test sudo[62934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:48 np0005486759.ooo.test sudo[62934]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:48 np0005486759.ooo.test sudo[62945]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpslmpw761/privsep.sock
Oct 14 08:21:48 np0005486759.ooo.test sudo[62945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:49 np0005486759.ooo.test sudo[62945]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:49 np0005486759.ooo.test sudo[62961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgwajs4v5/privsep.sock
Oct 14 08:21:49 np0005486759.ooo.test sudo[62961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:50 np0005486759.ooo.test sudo[62961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:50 np0005486759.ooo.test sudo[62973]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppernrsoj/privsep.sock
Oct 14 08:21:50 np0005486759.ooo.test sudo[62973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:51 np0005486759.ooo.test sudo[62973]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:51 np0005486759.ooo.test sudo[62984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp07eq2bko/privsep.sock
Oct 14 08:21:51 np0005486759.ooo.test sudo[62984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:51 np0005486759.ooo.test sudo[62984]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:52 np0005486759.ooo.test sudo[62995]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpakafr59s/privsep.sock
Oct 14 08:21:52 np0005486759.ooo.test sudo[62995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:52 np0005486759.ooo.test sudo[62995]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:53 np0005486759.ooo.test sudo[63006]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ksertm8/privsep.sock
Oct 14 08:21:53 np0005486759.ooo.test sudo[63006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:53 np0005486759.ooo.test sudo[63006]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:53 np0005486759.ooo.test sudo[63017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkgv8zxuq/privsep.sock
Oct 14 08:21:53 np0005486759.ooo.test sudo[63017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:54 np0005486759.ooo.test sudo[63017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:54 np0005486759.ooo.test sudo[63030]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfimpysn5/privsep.sock
Oct 14 08:21:54 np0005486759.ooo.test sudo[63030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:55 np0005486759.ooo.test sudo[63030]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:55 np0005486759.ooo.test sudo[63045]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp09x_9ch8/privsep.sock
Oct 14 08:21:55 np0005486759.ooo.test sudo[63045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:56 np0005486759.ooo.test sudo[63045]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:56 np0005486759.ooo.test sudo[63056]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppehy39y0/privsep.sock
Oct 14 08:21:56 np0005486759.ooo.test sudo[63056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:21:56 np0005486759.ooo.test podman[63058]: 2025-10-14 08:21:56.623588142 +0000 UTC m=+0.067958801 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, vcs-type=git, version=17.1.9, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, batch=17.1_20250721.1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 08:21:56 np0005486759.ooo.test podman[63058]: 2025-10-14 08:21:56.829847664 +0000 UTC m=+0.274218303 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vcs-type=git, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible)
Oct 14 08:21:56 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:21:57 np0005486759.ooo.test sudo[63056]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:57 np0005486759.ooo.test sudo[63095]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4h8youvb/privsep.sock
Oct 14 08:21:57 np0005486759.ooo.test sudo[63095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:58 np0005486759.ooo.test sudo[63095]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:58 np0005486759.ooo.test sudo[63106]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi9o0iky2/privsep.sock
Oct 14 08:21:58 np0005486759.ooo.test sudo[63106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:59 np0005486759.ooo.test sudo[63106]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:21:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:21:59 np0005486759.ooo.test podman[63113]: 2025-10-14 08:21:59.131814052 +0000 UTC m=+0.090311055 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:21:59 np0005486759.ooo.test podman[63113]: 2025-10-14 08:21:59.139668983 +0000 UTC m=+0.098165976 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-07-21T13:04:03, version=17.1.9, managed_by=tripleo_ansible, release=2, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:21:59 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:21:59 np0005486759.ooo.test podman[63112]: 2025-10-14 08:21:59.105043202 +0000 UTC m=+0.070177428 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, version=17.1.9, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:21:59 np0005486759.ooo.test podman[63112]: 2025-10-14 08:21:59.184283287 +0000 UTC m=+0.149417483 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, architecture=x86_64, io.openshift.expose-services=, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9)
Oct 14 08:21:59 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:21:59 np0005486759.ooo.test sudo[63161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpntz4cm0l/privsep.sock
Oct 14 08:21:59 np0005486759.ooo.test sudo[63161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:21:59 np0005486759.ooo.test sudo[63161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:21:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:21:59 np0005486759.ooo.test systemd[1]: tmp-crun.zow3Zk.mount: Deactivated successfully.
Oct 14 08:21:59 np0005486759.ooo.test podman[63166]: 2025-10-14 08:21:59.906052866 +0000 UTC m=+0.073867531 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, container_name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, release=1)
Oct 14 08:21:59 np0005486759.ooo.test podman[63166]: 2025-10-14 08:21:59.916440504 +0000 UTC m=+0.084255129 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, 
build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.9)
Oct 14 08:21:59 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:22:00 np0005486759.ooo.test sudo[63190]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw5q3fsxt/privsep.sock
Oct 14 08:22:00 np0005486759.ooo.test sudo[63190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:00 np0005486759.ooo.test sudo[63190]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:00 np0005486759.ooo.test sudo[63207]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqqii2jvv/privsep.sock
Oct 14 08:22:00 np0005486759.ooo.test sudo[63207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:01 np0005486759.ooo.test sudo[63207]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:01 np0005486759.ooo.test sudo[63218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr6ssp7iq/privsep.sock
Oct 14 08:22:01 np0005486759.ooo.test sudo[63218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:02 np0005486759.ooo.test sudo[63218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:02 np0005486759.ooo.test sudo[63229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9js5qpnn/privsep.sock
Oct 14 08:22:02 np0005486759.ooo.test sudo[63229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:03 np0005486759.ooo.test sudo[63229]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:03 np0005486759.ooo.test sudo[63240]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptnhra8a7/privsep.sock
Oct 14 08:22:03 np0005486759.ooo.test sudo[63240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:04 np0005486759.ooo.test sudo[63240]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:04 np0005486759.ooo.test sudo[63251]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprg226ndy/privsep.sock
Oct 14 08:22:04 np0005486759.ooo.test sudo[63251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:04 np0005486759.ooo.test sudo[63251]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:05 np0005486759.ooo.test sudo[63262]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4v1sqrjp/privsep.sock
Oct 14 08:22:05 np0005486759.ooo.test sudo[63262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:05 np0005486759.ooo.test sudo[63262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:06 np0005486759.ooo.test sudo[63279]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt6w6_y9u/privsep.sock
Oct 14 08:22:06 np0005486759.ooo.test sudo[63279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:06 np0005486759.ooo.test sudo[63279]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:06 np0005486759.ooo.test sudo[63290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp33trnnee/privsep.sock
Oct 14 08:22:06 np0005486759.ooo.test sudo[63290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:07 np0005486759.ooo.test sudo[63290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:07 np0005486759.ooo.test sudo[63301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpch5onn6k/privsep.sock
Oct 14 08:22:07 np0005486759.ooo.test sudo[63301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:08 np0005486759.ooo.test sudo[63301]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:08 np0005486759.ooo.test sudo[63312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_elqkha5/privsep.sock
Oct 14 08:22:08 np0005486759.ooo.test sudo[63312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:09 np0005486759.ooo.test sudo[63312]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:09 np0005486759.ooo.test sudo[63323]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpndadx8y6/privsep.sock
Oct 14 08:22:09 np0005486759.ooo.test sudo[63323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:10 np0005486759.ooo.test sudo[63323]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:10 np0005486759.ooo.test sudo[63334]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph54xr6r5/privsep.sock
Oct 14 08:22:10 np0005486759.ooo.test sudo[63334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:10 np0005486759.ooo.test sudo[63334]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:11 np0005486759.ooo.test sudo[63350]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpipitkzea/privsep.sock
Oct 14 08:22:11 np0005486759.ooo.test sudo[63350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:11 np0005486759.ooo.test sudo[63350]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:12 np0005486759.ooo.test sudo[63362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqmj523nu/privsep.sock
Oct 14 08:22:12 np0005486759.ooo.test sudo[63362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:22:12 np0005486759.ooo.test podman[63364]: 2025-10-14 08:22:12.22355786 +0000 UTC m=+0.078140433 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container)
Oct 14 08:22:12 np0005486759.ooo.test podman[63366]: 2025-10-14 08:22:12.233117592 +0000 UTC m=+0.078090700 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, 
name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:22:12 np0005486759.ooo.test podman[63366]: 2025-10-14 08:22:12.269596188 +0000 UTC m=+0.114569326 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, release=1, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:22:12 np0005486759.ooo.test podman[63366]: unhealthy
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:22:12 np0005486759.ooo.test podman[63364]: 2025-10-14 08:22:12.306344423 +0000 UTC m=+0.160927066 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, distribution-scope=public, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.expose-services=, release=1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12)
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:22:12 np0005486759.ooo.test podman[63365]: 2025-10-14 08:22:12.2725601 +0000 UTC m=+0.122211572 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:22:12 np0005486759.ooo.test podman[63365]: 2025-10-14 08:22:12.357477278 +0000 UTC m=+0.207128740 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, config_id=tripleo_step4, release=1, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:22:12 np0005486759.ooo.test podman[63365]: unhealthy
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:22:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:22:12 np0005486759.ooo.test sudo[63362]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:12 np0005486759.ooo.test sudo[63434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphfozlg62/privsep.sock
Oct 14 08:22:12 np0005486759.ooo.test sudo[63434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:13 np0005486759.ooo.test sudo[63434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:13 np0005486759.ooo.test sudo[63445]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7lj4w945/privsep.sock
Oct 14 08:22:13 np0005486759.ooo.test sudo[63445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:14 np0005486759.ooo.test sudo[63445]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:14 np0005486759.ooo.test sudo[63456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp82d1fj5e/privsep.sock
Oct 14 08:22:14 np0005486759.ooo.test sudo[63456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:15 np0005486759.ooo.test sudo[63456]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:15 np0005486759.ooo.test sudo[63467]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzfn5khi4/privsep.sock
Oct 14 08:22:15 np0005486759.ooo.test sudo[63467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:22:15 np0005486759.ooo.test podman[63469]: 2025-10-14 08:22:15.775343305 +0000 UTC m=+0.081940538 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.9, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 08:22:16 np0005486759.ooo.test podman[63469]: 2025-10-14 08:22:16.152374713 +0000 UTC m=+0.458971996 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, release=1, config_id=tripleo_step4, version=17.1.9, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:22:16 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:22:16 np0005486759.ooo.test sudo[63467]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:16 np0005486759.ooo.test sudo[63505]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj13v_zhn/privsep.sock
Oct 14 08:22:16 np0005486759.ooo.test sudo[63505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:17 np0005486759.ooo.test sudo[63505]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:22:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:22:17 np0005486759.ooo.test podman[63513]: 2025-10-14 08:22:17.209400482 +0000 UTC m=+0.056199121 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, container_name=ovn_metadata_agent, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:22:17 np0005486759.ooo.test podman[63513]: 2025-10-14 08:22:17.239588696 +0000 UTC m=+0.086387395 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.9, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:22:17 np0005486759.ooo.test podman[63514]: 2025-10-14 08:22:17.254028468 +0000 UTC m=+0.091538333 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Oct 14 08:22:17 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:22:17 np0005486759.ooo.test podman[63514]: 2025-10-14 08:22:17.276535596 +0000 UTC m=+0.114045381 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, release=1, io.buildah.version=1.33.12, vendor=Red Hat, Inc.)
Oct 14 08:22:17 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:22:17 np0005486759.ooo.test sudo[63566]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi56srymj/privsep.sock
Oct 14 08:22:17 np0005486759.ooo.test sudo[63566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:17 np0005486759.ooo.test sudo[63566]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:17 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:22:18 np0005486759.ooo.test recover_tripleo_nova_virtqemud[63573]: 47951
Oct 14 08:22:18 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:22:18 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:22:18 np0005486759.ooo.test sudo[63579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt7tvlhmc/privsep.sock
Oct 14 08:22:18 np0005486759.ooo.test sudo[63579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:18 np0005486759.ooo.test sudo[63579]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:19 np0005486759.ooo.test sudo[63590]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzh9seuo0/privsep.sock
Oct 14 08:22:19 np0005486759.ooo.test sudo[63590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:19 np0005486759.ooo.test sudo[63590]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:19 np0005486759.ooo.test sudo[63601]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7s3kovhd/privsep.sock
Oct 14 08:22:19 np0005486759.ooo.test sudo[63601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:20 np0005486759.ooo.test sudo[63601]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:20 np0005486759.ooo.test sudo[63612]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpem2p4eeu/privsep.sock
Oct 14 08:22:20 np0005486759.ooo.test sudo[63612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:21 np0005486759.ooo.test sudo[63612]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:21 np0005486759.ooo.test sudo[63623]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7khlsk0x/privsep.sock
Oct 14 08:22:21 np0005486759.ooo.test sudo[63623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:22 np0005486759.ooo.test sudo[63623]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:22 np0005486759.ooo.test sudo[63640]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0zxtk7op/privsep.sock
Oct 14 08:22:22 np0005486759.ooo.test sudo[63640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:23 np0005486759.ooo.test sudo[63640]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:23 np0005486759.ooo.test sudo[63651]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpekebkqpw/privsep.sock
Oct 14 08:22:23 np0005486759.ooo.test sudo[63651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:23 np0005486759.ooo.test sudo[63651]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:24 np0005486759.ooo.test sudo[63662]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0jua_iqv/privsep.sock
Oct 14 08:22:24 np0005486759.ooo.test sudo[63662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:24 np0005486759.ooo.test sudo[63662]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:25 np0005486759.ooo.test sudo[63673]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3veqpbbh/privsep.sock
Oct 14 08:22:25 np0005486759.ooo.test sudo[63673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:25 np0005486759.ooo.test sudo[63673]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:26 np0005486759.ooo.test sudo[63684]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjrv58_96/privsep.sock
Oct 14 08:22:26 np0005486759.ooo.test sudo[63684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:26 np0005486759.ooo.test sudo[63684]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:26 np0005486759.ooo.test sudo[63695]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3h3omseq/privsep.sock
Oct 14 08:22:26 np0005486759.ooo.test sudo[63695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:22:26 np0005486759.ooo.test podman[63697]: 2025-10-14 08:22:26.980057734 +0000 UTC m=+0.077173433 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, distribution-scope=public)
Oct 14 08:22:27 np0005486759.ooo.test podman[63697]: 2025-10-14 08:22:27.178431134 +0000 UTC m=+0.275546803 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:22:27 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:22:27 np0005486759.ooo.test sudo[63695]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:27 np0005486759.ooo.test sudo[63739]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe4blob01/privsep.sock
Oct 14 08:22:27 np0005486759.ooo.test sudo[63739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:28 np0005486759.ooo.test sudo[63739]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:28 np0005486759.ooo.test sudo[63750]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj52uw6k4/privsep.sock
Oct 14 08:22:28 np0005486759.ooo.test sudo[63750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:29 np0005486759.ooo.test sudo[63750]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:22:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:22:29 np0005486759.ooo.test systemd[1]: tmp-crun.qcSNM9.mount: Deactivated successfully.
Oct 14 08:22:29 np0005486759.ooo.test podman[63756]: 2025-10-14 08:22:29.32903767 +0000 UTC m=+0.097977950 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, config_id=tripleo_step5, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, architecture=x86_64, container_name=nova_compute, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:22:29 np0005486759.ooo.test podman[63757]: 2025-10-14 08:22:29.375144052 +0000 UTC m=+0.140733219 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, container_name=collectd, name=rhosp17/openstack-collectd, release=2, build-date=2025-07-21T13:04:03, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Oct 14 08:22:29 np0005486759.ooo.test podman[63757]: 2025-10-14 08:22:29.382211957 +0000 UTC m=+0.147801214 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=collectd, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container)
Oct 14 08:22:29 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:22:29 np0005486759.ooo.test podman[63756]: 2025-10-14 08:22:29.433509407 +0000 UTC m=+0.202449647 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20250721.1, release=1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64)
Oct 14 08:22:29 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:22:29 np0005486759.ooo.test sudo[63808]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyacto6ah/privsep.sock
Oct 14 08:22:29 np0005486759.ooo.test sudo[63808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:30 np0005486759.ooo.test sudo[63808]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:22:30 np0005486759.ooo.test podman[63813]: 2025-10-14 08:22:30.165018893 +0000 UTC m=+0.076793201 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64)
Oct 14 08:22:30 np0005486759.ooo.test podman[63813]: 2025-10-14 08:22:30.171715659 +0000 UTC m=+0.083490017 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, batch=17.1_20250721.1, container_name=iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc.)
Oct 14 08:22:30 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:22:30 np0005486759.ooo.test systemd[1]: tmp-crun.b8b0Ve.mount: Deactivated successfully.
Oct 14 08:22:30 np0005486759.ooo.test sudo[63838]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnlqr_af9/privsep.sock
Oct 14 08:22:30 np0005486759.ooo.test sudo[63838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:31 np0005486759.ooo.test sudo[63838]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:31 np0005486759.ooo.test sudo[63849]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvszzjj2r/privsep.sock
Oct 14 08:22:31 np0005486759.ooo.test sudo[63849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:31 np0005486759.ooo.test sudo[63849]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:32 np0005486759.ooo.test sudo[63860]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcjtyfrll/privsep.sock
Oct 14 08:22:32 np0005486759.ooo.test sudo[63860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:32 np0005486759.ooo.test sudo[63860]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:33 np0005486759.ooo.test sudo[63877]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa9ib82_0/privsep.sock
Oct 14 08:22:33 np0005486759.ooo.test sudo[63877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:33 np0005486759.ooo.test sudo[63877]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:33 np0005486759.ooo.test sudo[63888]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppj5b5ubm/privsep.sock
Oct 14 08:22:33 np0005486759.ooo.test sudo[63888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:34 np0005486759.ooo.test sudo[63888]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:34 np0005486759.ooo.test sudo[63899]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxxjti861/privsep.sock
Oct 14 08:22:34 np0005486759.ooo.test sudo[63899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:35 np0005486759.ooo.test sudo[63899]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:35 np0005486759.ooo.test sudo[63910]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4_7ak53n/privsep.sock
Oct 14 08:22:35 np0005486759.ooo.test sudo[63910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:36 np0005486759.ooo.test sudo[63910]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:36 np0005486759.ooo.test sudo[63921]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsisct2eu/privsep.sock
Oct 14 08:22:36 np0005486759.ooo.test sudo[63921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:37 np0005486759.ooo.test sudo[63921]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:37 np0005486759.ooo.test sudo[63932]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwzhfbnss/privsep.sock
Oct 14 08:22:37 np0005486759.ooo.test sudo[63932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:37 np0005486759.ooo.test sudo[63932]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:38 np0005486759.ooo.test sudo[63948]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqr0tfebi/privsep.sock
Oct 14 08:22:38 np0005486759.ooo.test sudo[63948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:38 np0005486759.ooo.test sudo[63948]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:39 np0005486759.ooo.test sudo[63960]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6p_acw4z/privsep.sock
Oct 14 08:22:39 np0005486759.ooo.test sudo[63960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:39 np0005486759.ooo.test sudo[63960]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:39 np0005486759.ooo.test sudo[63971]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkbxbfpub/privsep.sock
Oct 14 08:22:39 np0005486759.ooo.test sudo[63971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:40 np0005486759.ooo.test sudo[63971]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:40 np0005486759.ooo.test sudo[63982]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq1q0rabc/privsep.sock
Oct 14 08:22:40 np0005486759.ooo.test sudo[63982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:41 np0005486759.ooo.test sudo[63982]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:41 np0005486759.ooo.test sudo[63993]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpowp7lgfe/privsep.sock
Oct 14 08:22:41 np0005486759.ooo.test sudo[63993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:42 np0005486759.ooo.test sudo[63993]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:22:42 np0005486759.ooo.test podman[64001]: 2025-10-14 08:22:42.456426079 +0000 UTC m=+0.081782884 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, architecture=x86_64, vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:22:42 np0005486759.ooo.test podman[64001]: 2025-10-14 08:22:42.467614631 +0000 UTC m=+0.092971426 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, 
managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:22:42 np0005486759.ooo.test podman[64001]: unhealthy
Oct 14 08:22:42 np0005486759.ooo.test sudo[64036]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5i20fumn/privsep.sock
Oct 14 08:22:42 np0005486759.ooo.test sudo[64036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:22:42 np0005486759.ooo.test podman[63999]: 2025-10-14 08:22:42.501107767 +0000 UTC m=+0.128459153 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:22:42 np0005486759.ooo.test podman[64032]: 2025-10-14 08:22:42.528447223 +0000 UTC m=+0.058737478 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, version=17.1.9, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Oct 14 08:22:42 np0005486759.ooo.test podman[64032]: 2025-10-14 08:22:42.535473568 +0000 UTC m=+0.065763803 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:22:42 np0005486759.ooo.test podman[64032]: unhealthy
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:22:42 np0005486759.ooo.test podman[63999]: 2025-10-14 08:22:42.562387082 +0000 UTC m=+0.189738498 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:22:42 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:22:43 np0005486759.ooo.test sudo[64036]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:43 np0005486759.ooo.test sudo[64076]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiid4lu3m/privsep.sock
Oct 14 08:22:43 np0005486759.ooo.test sudo[64076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:43 np0005486759.ooo.test sudo[64076]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:44 np0005486759.ooo.test sudo[64091]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpta4u9hja/privsep.sock
Oct 14 08:22:44 np0005486759.ooo.test sudo[64091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:44 np0005486759.ooo.test sudo[64091]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:45 np0005486759.ooo.test sudo[64102]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk07r6jef/privsep.sock
Oct 14 08:22:45 np0005486759.ooo.test sudo[64102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:45 np0005486759.ooo.test sudo[64102]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:45 np0005486759.ooo.test sudo[64113]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqj2g53cc/privsep.sock
Oct 14 08:22:45 np0005486759.ooo.test sudo[64113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:22:46 np0005486759.ooo.test systemd[1]: tmp-crun.eFYSiS.mount: Deactivated successfully.
Oct 14 08:22:46 np0005486759.ooo.test podman[64116]: 2025-10-14 08:22:46.48641546 +0000 UTC m=+0.100758445 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, version=17.1.9, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, tcib_managed=true, build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12)
Oct 14 08:22:46 np0005486759.ooo.test sudo[64113]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:46 np0005486759.ooo.test sudo[64147]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf9yeyh_u/privsep.sock
Oct 14 08:22:46 np0005486759.ooo.test sudo[64147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:46 np0005486759.ooo.test podman[64116]: 2025-10-14 08:22:46.886453121 +0000 UTC m=+0.500796036 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, 
build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:22:46 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:22:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:22:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:22:47 np0005486759.ooo.test sudo[64147]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:47 np0005486759.ooo.test systemd[1]: tmp-crun.vTIJQw.mount: Deactivated successfully.
Oct 14 08:22:47 np0005486759.ooo.test podman[64151]: 2025-10-14 08:22:47.460385065 +0000 UTC m=+0.089791578 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:22:47 np0005486759.ooo.test podman[64151]: 2025-10-14 08:22:47.506769395 +0000 UTC m=+0.136175908 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, release=1)
Oct 14 08:22:47 np0005486759.ooo.test systemd[1]: tmp-crun.Ep96u2.mount: Deactivated successfully.
Oct 14 08:22:47 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:22:47 np0005486759.ooo.test podman[64152]: 2025-10-14 08:22:47.539038442 +0000 UTC m=+0.163431892 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, release=1, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64)
Oct 14 08:22:47 np0005486759.ooo.test podman[64152]: 2025-10-14 08:22:47.563179201 +0000 UTC m=+0.187572712 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, build-date=2025-07-21T13:28:44, version=17.1.9, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:22:47 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:22:47 np0005486759.ooo.test sudo[64206]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbkluec7w/privsep.sock
Oct 14 08:22:47 np0005486759.ooo.test sudo[64206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:48 np0005486759.ooo.test sudo[64206]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:48 np0005486759.ooo.test sudo[64217]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg6pb4puc/privsep.sock
Oct 14 08:22:48 np0005486759.ooo.test sudo[64217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:49 np0005486759.ooo.test sudo[64217]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:49 np0005486759.ooo.test sudo[64234]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprrgb216m/privsep.sock
Oct 14 08:22:49 np0005486759.ooo.test sudo[64234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:50 np0005486759.ooo.test sudo[64234]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:50 np0005486759.ooo.test sudo[64245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt4dxs67r/privsep.sock
Oct 14 08:22:50 np0005486759.ooo.test sudo[64245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:51 np0005486759.ooo.test sudo[64245]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:51 np0005486759.ooo.test sudo[64256]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpem9l4_cc/privsep.sock
Oct 14 08:22:51 np0005486759.ooo.test sudo[64256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:51 np0005486759.ooo.test sudo[64256]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:52 np0005486759.ooo.test sudo[64267]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp09eifitj/privsep.sock
Oct 14 08:22:52 np0005486759.ooo.test sudo[64267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:52 np0005486759.ooo.test sudo[64267]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:53 np0005486759.ooo.test sudo[64278]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqncc3ey7/privsep.sock
Oct 14 08:22:53 np0005486759.ooo.test sudo[64278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:53 np0005486759.ooo.test sudo[64278]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:53 np0005486759.ooo.test sudo[64289]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaubf80vi/privsep.sock
Oct 14 08:22:53 np0005486759.ooo.test sudo[64289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:54 np0005486759.ooo.test sudo[64289]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:54 np0005486759.ooo.test sudo[64306]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6zq808fl/privsep.sock
Oct 14 08:22:54 np0005486759.ooo.test sudo[64306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:55 np0005486759.ooo.test sudo[64306]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:55 np0005486759.ooo.test sudo[64317]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9l7vtbq5/privsep.sock
Oct 14 08:22:55 np0005486759.ooo.test sudo[64317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:56 np0005486759.ooo.test sudo[64317]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:56 np0005486759.ooo.test sudo[64328]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf84wwus4/privsep.sock
Oct 14 08:22:56 np0005486759.ooo.test sudo[64328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:57 np0005486759.ooo.test sudo[64328]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:57 np0005486759.ooo.test sudo[64339]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjrjsruuw/privsep.sock
Oct 14 08:22:57 np0005486759.ooo.test sudo[64339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:22:57 np0005486759.ooo.test podman[64341]: 2025-10-14 08:22:57.412976471 +0000 UTC m=+0.057795066 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, container_name=metrics_qdr, release=1, build-date=2025-07-21T13:07:59, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:22:57 np0005486759.ooo.test podman[64341]: 2025-10-14 08:22:57.599737404 +0000 UTC m=+0.244556039 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:07:59, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, container_name=metrics_qdr, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd)
Oct 14 08:22:57 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:22:57 np0005486759.ooo.test sudo[64339]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:58 np0005486759.ooo.test sudo[64378]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpghi4a36h/privsep.sock
Oct 14 08:22:58 np0005486759.ooo.test sudo[64378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:58 np0005486759.ooo.test sudo[64378]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:59 np0005486759.ooo.test sudo[64389]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn_rxk8z4/privsep.sock
Oct 14 08:22:59 np0005486759.ooo.test sudo[64389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:22:59 np0005486759.ooo.test sudo[64389]: pam_unix(sudo:session): session closed for user root
Oct 14 08:22:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:22:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:22:59 np0005486759.ooo.test podman[64396]: 2025-10-14 08:22:59.936073962 +0000 UTC m=+0.077317275 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, release=1, config_id=tripleo_step5, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Oct 14 08:22:59 np0005486759.ooo.test podman[64401]: 2025-10-14 08:22:59.955650692 +0000 UTC m=+0.088998295 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, distribution-scope=public, io.buildah.version=1.33.12, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:22:59 np0005486759.ooo.test podman[64401]: 2025-10-14 08:22:59.988249922 +0000 UTC m=+0.121597525 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, container_name=collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, 
vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Oct 14 08:22:59 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:23:00 np0005486759.ooo.test podman[64396]: 2025-10-14 08:23:00.008372259 +0000 UTC m=+0.149615572 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, container_name=nova_compute)
Oct 14 08:23:00 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:23:00 np0005486759.ooo.test sudo[64450]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvsnzr0iz/privsep.sock
Oct 14 08:23:00 np0005486759.ooo.test sudo[64450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:23:00 np0005486759.ooo.test podman[64453]: 2025-10-14 08:23:00.435569288 +0000 UTC m=+0.065472119 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, release=1, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12)
Oct 14 08:23:00 np0005486759.ooo.test podman[64453]: 2025-10-14 08:23:00.44420486 +0000 UTC m=+0.074107731 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, version=17.1.9, container_name=iscsid)
Oct 14 08:23:00 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:23:00 np0005486759.ooo.test sudo[64450]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:00 np0005486759.ooo.test sudo[64480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp164lrr4o/privsep.sock
Oct 14 08:23:00 np0005486759.ooo.test sudo[64480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:01 np0005486759.ooo.test sudo[64480]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:01 np0005486759.ooo.test sudo[64491]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppuy3zy77/privsep.sock
Oct 14 08:23:01 np0005486759.ooo.test sudo[64491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:02 np0005486759.ooo.test sudo[64491]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:02 np0005486759.ooo.test sudo[64502]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf_3lch05/privsep.sock
Oct 14 08:23:02 np0005486759.ooo.test sudo[64502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:03 np0005486759.ooo.test sudo[64502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:03 np0005486759.ooo.test sudo[64513]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp91qmqpw6/privsep.sock
Oct 14 08:23:03 np0005486759.ooo.test sudo[64513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:04 np0005486759.ooo.test sudo[64513]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:04 np0005486759.ooo.test sudo[64524]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpecbc_huq/privsep.sock
Oct 14 08:23:04 np0005486759.ooo.test sudo[64524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:04 np0005486759.ooo.test sudo[64524]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:05 np0005486759.ooo.test sudo[64538]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmr7_0e4f/privsep.sock
Oct 14 08:23:05 np0005486759.ooo.test sudo[64538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:05 np0005486759.ooo.test sudo[64538]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:06 np0005486759.ooo.test sudo[64552]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4apxa8qi/privsep.sock
Oct 14 08:23:06 np0005486759.ooo.test sudo[64552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:06 np0005486759.ooo.test sudo[64552]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:06 np0005486759.ooo.test sudo[64563]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiig092ie/privsep.sock
Oct 14 08:23:06 np0005486759.ooo.test sudo[64563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:07 np0005486759.ooo.test sudo[64563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:07 np0005486759.ooo.test sudo[64574]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp60lk8_vi/privsep.sock
Oct 14 08:23:07 np0005486759.ooo.test sudo[64574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:08 np0005486759.ooo.test sudo[64574]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:08 np0005486759.ooo.test sudo[64585]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps63g21zo/privsep.sock
Oct 14 08:23:08 np0005486759.ooo.test sudo[64585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:09 np0005486759.ooo.test sudo[64585]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:09 np0005486759.ooo.test sudo[64596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4lcgj_tz/privsep.sock
Oct 14 08:23:09 np0005486759.ooo.test sudo[64596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:10 np0005486759.ooo.test sudo[64596]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:10 np0005486759.ooo.test sudo[64607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3llpz7f0/privsep.sock
Oct 14 08:23:10 np0005486759.ooo.test sudo[64607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:11 np0005486759.ooo.test sudo[64607]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:11 np0005486759.ooo.test sudo[64624]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplkxfvequ/privsep.sock
Oct 14 08:23:11 np0005486759.ooo.test sudo[64624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:11 np0005486759.ooo.test sudo[64624]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:12 np0005486759.ooo.test sudo[64635]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpotpcix7o/privsep.sock
Oct 14 08:23:12 np0005486759.ooo.test sudo[64635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:12 np0005486759.ooo.test sudo[64635]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:23:12 np0005486759.ooo.test podman[64642]: 2025-10-14 08:23:12.825017165 +0000 UTC m=+0.089576582 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_id=tripleo_step4, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:23:12 np0005486759.ooo.test podman[64639]: 2025-10-14 08:23:12.793297459 +0000 UTC m=+0.062047819 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:23:12 np0005486759.ooo.test podman[64643]: 2025-10-14 08:23:12.855655417 +0000 UTC m=+0.120219584 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:23:12 np0005486759.ooo.test podman[64643]: 2025-10-14 08:23:12.873359513 +0000 UTC m=+0.137923650 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1)
Oct 14 08:23:12 np0005486759.ooo.test podman[64639]: 2025-10-14 08:23:12.88216392 +0000 UTC m=+0.150914240 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:23:12 np0005486759.ooo.test podman[64643]: unhealthy
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:23:12 np0005486759.ooo.test podman[64642]: 2025-10-14 08:23:12.961339577 +0000 UTC m=+0.225898994 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, release=1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4)
Oct 14 08:23:12 np0005486759.ooo.test podman[64642]: unhealthy
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:23:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:23:12 np0005486759.ooo.test sudo[64703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd70ozd8i/privsep.sock
Oct 14 08:23:12 np0005486759.ooo.test sudo[64703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:13 np0005486759.ooo.test sudo[64703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:13 np0005486759.ooo.test sudo[64714]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6m2n_fo0/privsep.sock
Oct 14 08:23:13 np0005486759.ooo.test sudo[64714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:14 np0005486759.ooo.test sudo[64714]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:14 np0005486759.ooo.test sudo[64725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp79cigwpc/privsep.sock
Oct 14 08:23:14 np0005486759.ooo.test sudo[64725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:15 np0005486759.ooo.test sudo[64725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:15 np0005486759.ooo.test sudo[64736]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps5nodz6y/privsep.sock
Oct 14 08:23:15 np0005486759.ooo.test sudo[64736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:16 np0005486759.ooo.test sudo[64736]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:16 np0005486759.ooo.test sudo[64753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa5dcvysa/privsep.sock
Oct 14 08:23:16 np0005486759.ooo.test sudo[64753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:17 np0005486759.ooo.test sudo[64753]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:23:17 np0005486759.ooo.test podman[64757]: 2025-10-14 08:23:17.258235692 +0000 UTC m=+0.066476459 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:23:17 np0005486759.ooo.test sudo[64788]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjkwnr3aa/privsep.sock
Oct 14 08:23:17 np0005486759.ooo.test sudo[64788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:23:17 np0005486759.ooo.test podman[64757]: 2025-10-14 08:23:17.675916754 +0000 UTC m=+0.484157511 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, release=1, tcib_managed=true, version=17.1.9, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:23:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:23:17 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:23:17 np0005486759.ooo.test podman[64791]: 2025-10-14 08:23:17.766207055 +0000 UTC m=+0.081274699 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:23:17 np0005486759.ooo.test systemd[1]: tmp-crun.G1klQv.mount: Deactivated successfully.
Oct 14 08:23:17 np0005486759.ooo.test podman[64791]: 2025-10-14 08:23:17.832451776 +0000 UTC m=+0.147519390 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 14 08:23:17 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:23:17 np0005486759.ooo.test podman[64792]: 2025-10-14 08:23:17.833135426 +0000 UTC m=+0.143762330 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, container_name=ovn_controller, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44)
Oct 14 08:23:17 np0005486759.ooo.test podman[64792]: 2025-10-14 08:23:17.913594771 +0000 UTC m=+0.224221695 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vendor=Red Hat, Inc., release=1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller)
Oct 14 08:23:17 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:23:18 np0005486759.ooo.test sudo[64788]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:18 np0005486759.ooo.test sudo[64846]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp01jxa1fc/privsep.sock
Oct 14 08:23:18 np0005486759.ooo.test sudo[64846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:18 np0005486759.ooo.test sudo[64846]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:19 np0005486759.ooo.test sudo[64857]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_117m9mi/privsep.sock
Oct 14 08:23:19 np0005486759.ooo.test sudo[64857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:19 np0005486759.ooo.test sudo[64857]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:19 np0005486759.ooo.test sudo[64868]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptp_ziyj1/privsep.sock
Oct 14 08:23:19 np0005486759.ooo.test sudo[64868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:20 np0005486759.ooo.test sudo[64868]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:20 np0005486759.ooo.test sudo[64879]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1yij7ezr/privsep.sock
Oct 14 08:23:20 np0005486759.ooo.test sudo[64879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:21 np0005486759.ooo.test sudo[64879]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:21 np0005486759.ooo.test sudo[64896]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp56nuce1j/privsep.sock
Oct 14 08:23:21 np0005486759.ooo.test sudo[64896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:22 np0005486759.ooo.test sudo[64896]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:22 np0005486759.ooo.test sudo[64907]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbgkvu6g2/privsep.sock
Oct 14 08:23:22 np0005486759.ooo.test sudo[64907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:23 np0005486759.ooo.test sudo[64907]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:23 np0005486759.ooo.test sudo[64918]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0hk2uthd/privsep.sock
Oct 14 08:23:23 np0005486759.ooo.test sudo[64918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:24 np0005486759.ooo.test sudo[64918]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:24 np0005486759.ooo.test sudo[64929]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5kkbv2pr/privsep.sock
Oct 14 08:23:24 np0005486759.ooo.test sudo[64929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:24 np0005486759.ooo.test sudo[64929]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:25 np0005486759.ooo.test sudo[64940]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv7ipi97j/privsep.sock
Oct 14 08:23:25 np0005486759.ooo.test sudo[64940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:25 np0005486759.ooo.test sudo[64940]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:26 np0005486759.ooo.test sudo[64951]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7ye_poie/privsep.sock
Oct 14 08:23:26 np0005486759.ooo.test sudo[64951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:26 np0005486759.ooo.test sudo[64951]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:26 np0005486759.ooo.test sudo[64968]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd9iqn6qy/privsep.sock
Oct 14 08:23:26 np0005486759.ooo.test sudo[64968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:27 np0005486759.ooo.test sudo[64968]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:27 np0005486759.ooo.test sudo[64979]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoev1lhvd/privsep.sock
Oct 14 08:23:27 np0005486759.ooo.test sudo[64979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:23:27 np0005486759.ooo.test podman[64981]: 2025-10-14 08:23:27.909696917 +0000 UTC m=+0.088869451 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1, batch=17.1_20250721.1, vcs-type=git, managed_by=tripleo_ansible, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, version=17.1.9)
Oct 14 08:23:28 np0005486759.ooo.test podman[64981]: 2025-10-14 08:23:28.108422698 +0000 UTC m=+0.287595222 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Oct 14 08:23:28 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:23:28 np0005486759.ooo.test sudo[64979]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:28 np0005486759.ooo.test sudo[65020]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpppsvfjce/privsep.sock
Oct 14 08:23:28 np0005486759.ooo.test sudo[65020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:29 np0005486759.ooo.test sudo[65020]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:29 np0005486759.ooo.test sudo[65031]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsds1lsga/privsep.sock
Oct 14 08:23:29 np0005486759.ooo.test sudo[65031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:30 np0005486759.ooo.test sudo[65031]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:23:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:23:30 np0005486759.ooo.test podman[65035]: 2025-10-14 08:23:30.302255583 +0000 UTC m=+0.092895418 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:23:30 np0005486759.ooo.test podman[65038]: 2025-10-14 08:23:30.353351922 +0000 UTC m=+0.141084163 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.33.12)
Oct 14 08:23:30 np0005486759.ooo.test podman[65038]: 2025-10-14 08:23:30.390423493 +0000 UTC m=+0.178155714 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=2, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, distribution-scope=public, 
io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., vcs-type=git, container_name=collectd, managed_by=tripleo_ansible)
Oct 14 08:23:30 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:23:30 np0005486759.ooo.test podman[65035]: 2025-10-14 08:23:30.408748326 +0000 UTC m=+0.199388121 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible)
Oct 14 08:23:30 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:23:30 np0005486759.ooo.test sudo[65086]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpung5qbma/privsep.sock
Oct 14 08:23:30 np0005486759.ooo.test sudo[65086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:23:30 np0005486759.ooo.test podman[65088]: 2025-10-14 08:23:30.631132648 +0000 UTC m=+0.081784245 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, name=rhosp17/openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.buildah.version=1.33.12, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:27:15)
Oct 14 08:23:30 np0005486759.ooo.test podman[65088]: 2025-10-14 08:23:30.644245349 +0000 UTC m=+0.094896976 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-iscsid)
Oct 14 08:23:30 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:23:31 np0005486759.ooo.test sudo[65086]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:31 np0005486759.ooo.test systemd[1]: tmp-crun.jpGOGV.mount: Deactivated successfully.
Oct 14 08:23:31 np0005486759.ooo.test sudo[65115]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnimaz2t0/privsep.sock
Oct 14 08:23:31 np0005486759.ooo.test sudo[65115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:32 np0005486759.ooo.test sudo[65115]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:32 np0005486759.ooo.test sudo[65131]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzg25mje4/privsep.sock
Oct 14 08:23:32 np0005486759.ooo.test sudo[65131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:32 np0005486759.ooo.test sudo[65131]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:33 np0005486759.ooo.test sudo[65143]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr65glgdh/privsep.sock
Oct 14 08:23:33 np0005486759.ooo.test sudo[65143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:33 np0005486759.ooo.test sudo[65143]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:34 np0005486759.ooo.test sudo[65154]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpihyqlxoe/privsep.sock
Oct 14 08:23:34 np0005486759.ooo.test sudo[65154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:34 np0005486759.ooo.test sudo[65154]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:35 np0005486759.ooo.test sudo[65165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplea8zu4l/privsep.sock
Oct 14 08:23:35 np0005486759.ooo.test sudo[65165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:35 np0005486759.ooo.test sudo[65165]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:35 np0005486759.ooo.test sudo[65176]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1o9udcbn/privsep.sock
Oct 14 08:23:35 np0005486759.ooo.test sudo[65176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:36 np0005486759.ooo.test sudo[65176]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:36 np0005486759.ooo.test sudo[65187]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1uzkva1j/privsep.sock
Oct 14 08:23:36 np0005486759.ooo.test sudo[65187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:37 np0005486759.ooo.test sudo[65187]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:37 np0005486759.ooo.test sudo[65201]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6jss5uag/privsep.sock
Oct 14 08:23:37 np0005486759.ooo.test sudo[65201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:38 np0005486759.ooo.test sudo[65201]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:38 np0005486759.ooo.test sudo[65215]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplkenf4rs/privsep.sock
Oct 14 08:23:38 np0005486759.ooo.test sudo[65215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:39 np0005486759.ooo.test sudo[65215]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:39 np0005486759.ooo.test sudo[65226]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8olgi3uw/privsep.sock
Oct 14 08:23:39 np0005486759.ooo.test sudo[65226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:39 np0005486759.ooo.test sudo[65226]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:40 np0005486759.ooo.test sudo[65237]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnrpkj0xw/privsep.sock
Oct 14 08:23:40 np0005486759.ooo.test sudo[65237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:40 np0005486759.ooo.test sudo[65237]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:41 np0005486759.ooo.test sudo[65248]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm9u7z3_m/privsep.sock
Oct 14 08:23:41 np0005486759.ooo.test sudo[65248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:41 np0005486759.ooo.test sudo[65248]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:41 np0005486759.ooo.test sudo[65259]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqdzilctc/privsep.sock
Oct 14 08:23:41 np0005486759.ooo.test sudo[65259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:42 np0005486759.ooo.test sudo[65259]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:42 np0005486759.ooo.test sudo[65270]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsv8vlatn/privsep.sock
Oct 14 08:23:42 np0005486759.ooo.test sudo[65270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:43 np0005486759.ooo.test sudo[65270]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: tmp-crun.eT1jZv.mount: Deactivated successfully.
Oct 14 08:23:43 np0005486759.ooo.test podman[65281]: 2025-10-14 08:23:43.379936716 +0000 UTC m=+0.079153409 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, architecture=x86_64, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, vcs-type=git, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20250721.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:23:43 np0005486759.ooo.test podman[65283]: 2025-10-14 08:23:43.360452568 +0000 UTC m=+0.064654295 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.9, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1)
Oct 14 08:23:43 np0005486759.ooo.test podman[65284]: 2025-10-14 08:23:43.419157119 +0000 UTC m=+0.121244355 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, build-date=2025-07-21T15:29:47, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:23:43 np0005486759.ooo.test podman[65283]: 2025-10-14 08:23:43.439432589 +0000 UTC m=+0.143634296 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, architecture=x86_64, 
managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true)
Oct 14 08:23:43 np0005486759.ooo.test podman[65283]: unhealthy
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:23:43 np0005486759.ooo.test podman[65284]: 2025-10-14 08:23:43.459277968 +0000 UTC m=+0.161365194 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, release=1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9)
Oct 14 08:23:43 np0005486759.ooo.test podman[65284]: unhealthy
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:23:43 np0005486759.ooo.test podman[65281]: 2025-10-14 08:23:43.514005163 +0000 UTC m=+0.213221826 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, summary=Red Hat 
OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:23:43 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:23:43 np0005486759.ooo.test sudo[65342]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsxnn1l86/privsep.sock
Oct 14 08:23:43 np0005486759.ooo.test sudo[65342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:44 np0005486759.ooo.test sudo[65342]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:44 np0005486759.ooo.test sudo[65353]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphm0n96kt/privsep.sock
Oct 14 08:23:44 np0005486759.ooo.test sudo[65353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:44 np0005486759.ooo.test systemd[1]: tmp-crun.8vSTMQ.mount: Deactivated successfully.
Oct 14 08:23:44 np0005486759.ooo.test sudo[65353]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:45 np0005486759.ooo.test sudo[65364]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppi7_tnjt/privsep.sock
Oct 14 08:23:45 np0005486759.ooo.test sudo[65364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:45 np0005486759.ooo.test sudo[65364]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:45 np0005486759.ooo.test sudo[65375]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi6yexl4g/privsep.sock
Oct 14 08:23:45 np0005486759.ooo.test sudo[65375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:46 np0005486759.ooo.test sudo[65375]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:46 np0005486759.ooo.test sudo[65386]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6gq8zdoy/privsep.sock
Oct 14 08:23:46 np0005486759.ooo.test sudo[65386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:47 np0005486759.ooo.test sudo[65386]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:47 np0005486759.ooo.test sudo[65397]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq7dj85zg/privsep.sock
Oct 14 08:23:47 np0005486759.ooo.test sudo[65397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:48 np0005486759.ooo.test sudo[65397]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:23:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:23:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:23:48 np0005486759.ooo.test systemd[1]: tmp-crun.LWBgtN.mount: Deactivated successfully.
Oct 14 08:23:48 np0005486759.ooo.test podman[65403]: 2025-10-14 08:23:48.248398587 +0000 UTC m=+0.078175439 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.9)
Oct 14 08:23:48 np0005486759.ooo.test podman[65404]: 2025-10-14 08:23:48.302262356 +0000 UTC m=+0.128582228 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_id=tripleo_step4, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:23:48 np0005486759.ooo.test podman[65405]: 2025-10-14 08:23:48.346699712 +0000 UTC m=+0.170576023 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:23:48 np0005486759.ooo.test podman[65404]: 2025-10-14 08:23:48.359824323 +0000 UTC m=+0.186144175 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:23:48 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:23:48 np0005486759.ooo.test sudo[65478]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp86kst770/privsep.sock
Oct 14 08:23:48 np0005486759.ooo.test sudo[65478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:48 np0005486759.ooo.test podman[65405]: 2025-10-14 08:23:48.391453486 +0000 UTC m=+0.215329807 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:23:48 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:23:48 np0005486759.ooo.test podman[65403]: 2025-10-14 08:23:48.637218297 +0000 UTC m=+0.466995149 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1, tcib_managed=true, io.buildah.version=1.33.12, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, version=17.1.9, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Oct 14 08:23:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:23:48 np0005486759.ooo.test sudo[65478]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:49 np0005486759.ooo.test sudo[65494]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6vrckfuw/privsep.sock
Oct 14 08:23:49 np0005486759.ooo.test sudo[65494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:49 np0005486759.ooo.test sudo[65494]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:50 np0005486759.ooo.test sudo[65505]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjwvbg5pv/privsep.sock
Oct 14 08:23:50 np0005486759.ooo.test sudo[65505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:50 np0005486759.ooo.test sudo[65505]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:50 np0005486759.ooo.test sudo[65516]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp479z_ob4/privsep.sock
Oct 14 08:23:50 np0005486759.ooo.test sudo[65516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:51 np0005486759.ooo.test sudo[65516]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:51 np0005486759.ooo.test sudo[65527]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm0l4c4mi/privsep.sock
Oct 14 08:23:51 np0005486759.ooo.test sudo[65527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:52 np0005486759.ooo.test sudo[65527]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:52 np0005486759.ooo.test sudo[65538]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6g4qa8fd/privsep.sock
Oct 14 08:23:52 np0005486759.ooo.test sudo[65538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:53 np0005486759.ooo.test sudo[65538]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:53 np0005486759.ooo.test sudo[65549]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy_035amw/privsep.sock
Oct 14 08:23:53 np0005486759.ooo.test sudo[65549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:54 np0005486759.ooo.test sudo[65549]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:54 np0005486759.ooo.test sudo[65566]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphhf3t4c8/privsep.sock
Oct 14 08:23:54 np0005486759.ooo.test sudo[65566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:55 np0005486759.ooo.test sudo[65566]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:55 np0005486759.ooo.test sudo[65577]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoeveo60h/privsep.sock
Oct 14 08:23:55 np0005486759.ooo.test sudo[65577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:55 np0005486759.ooo.test sudo[65577]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:56 np0005486759.ooo.test sudo[65588]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp__pm2unu/privsep.sock
Oct 14 08:23:56 np0005486759.ooo.test sudo[65588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:56 np0005486759.ooo.test sudo[65588]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:57 np0005486759.ooo.test sudo[65599]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpom9chkvv/privsep.sock
Oct 14 08:23:57 np0005486759.ooo.test sudo[65599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:57 np0005486759.ooo.test sudo[65599]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:57 np0005486759.ooo.test sudo[65610]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzintcytw/privsep.sock
Oct 14 08:23:57 np0005486759.ooo.test sudo[65610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:23:58 np0005486759.ooo.test podman[65613]: 2025-10-14 08:23:58.448649689 +0000 UTC m=+0.081397153 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Oct 14 08:23:58 np0005486759.ooo.test sudo[65610]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:58 np0005486759.ooo.test podman[65613]: 2025-10-14 08:23:58.646910942 +0000 UTC m=+0.279658316 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9)
Oct 14 08:23:58 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:23:58 np0005486759.ooo.test sudo[65649]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp31ag5h4y/privsep.sock
Oct 14 08:23:58 np0005486759.ooo.test sudo[65649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:23:59 np0005486759.ooo.test sudo[65649]: pam_unix(sudo:session): session closed for user root
Oct 14 08:23:59 np0005486759.ooo.test sudo[65666]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoycciwl2/privsep.sock
Oct 14 08:23:59 np0005486759.ooo.test sudo[65666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:00 np0005486759.ooo.test sudo[65666]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:00 np0005486759.ooo.test sudo[65677]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn2m863bi/privsep.sock
Oct 14 08:24:00 np0005486759.ooo.test sudo[65677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:24:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:24:00 np0005486759.ooo.test podman[65680]: 2025-10-14 08:24:00.579935401 +0000 UTC m=+0.058893116 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, vendor=Red Hat, Inc., release=2, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Oct 14 08:24:00 np0005486759.ooo.test podman[65679]: 2025-10-14 08:24:00.591476039 +0000 UTC m=+0.069248747 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, vendor=Red Hat, Inc., version=17.1.9, container_name=nova_compute, architecture=x86_64)
Oct 14 08:24:00 np0005486759.ooo.test podman[65679]: 2025-10-14 08:24:00.614327247 +0000 UTC m=+0.092099865 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., version=17.1.9, container_name=nova_compute, architecture=x86_64, build-date=2025-07-21T14:48:37, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, config_id=tripleo_step5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Oct 14 08:24:00 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:24:00 np0005486759.ooo.test podman[65680]: 2025-10-14 08:24:00.665180723 +0000 UTC m=+0.144138498 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=2, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.9)
Oct 14 08:24:00 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:24:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:24:00 np0005486759.ooo.test podman[65727]: 2025-10-14 08:24:00.757849774 +0000 UTC m=+0.064108468 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, release=1, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Oct 14 08:24:00 np0005486759.ooo.test podman[65727]: 2025-10-14 08:24:00.767292056 +0000 UTC m=+0.073550710 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:24:00 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:24:01 np0005486759.ooo.test sudo[65677]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:01 np0005486759.ooo.test sudo[65755]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoqv1w3eo/privsep.sock
Oct 14 08:24:01 np0005486759.ooo.test sudo[65755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:01 np0005486759.ooo.test sudo[65755]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:02 np0005486759.ooo.test sudo[65766]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5yk_b4gu/privsep.sock
Oct 14 08:24:02 np0005486759.ooo.test sudo[65766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:02 np0005486759.ooo.test sudo[65766]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:03 np0005486759.ooo.test sudo[65777]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp66pn4ixy/privsep.sock
Oct 14 08:24:03 np0005486759.ooo.test sudo[65777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:03 np0005486759.ooo.test sudo[65777]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:03 np0005486759.ooo.test sudo[65788]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpod93ry5e/privsep.sock
Oct 14 08:24:03 np0005486759.ooo.test sudo[65788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:04 np0005486759.ooo.test sudo[65788]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:04 np0005486759.ooo.test sudo[65804]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu4tjrlz_/privsep.sock
Oct 14 08:24:04 np0005486759.ooo.test sudo[65804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:05 np0005486759.ooo.test sudo[65804]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:05 np0005486759.ooo.test sudo[65816]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaazcufcg/privsep.sock
Oct 14 08:24:05 np0005486759.ooo.test sudo[65816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:06 np0005486759.ooo.test sudo[65816]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:06 np0005486759.ooo.test sudo[65827]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpus06afxp/privsep.sock
Oct 14 08:24:06 np0005486759.ooo.test sudo[65827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:07 np0005486759.ooo.test sudo[65827]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:07 np0005486759.ooo.test sudo[65838]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpubtzbocw/privsep.sock
Oct 14 08:24:07 np0005486759.ooo.test sudo[65838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:07 np0005486759.ooo.test sudo[65838]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:08 np0005486759.ooo.test sudo[65849]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp945gjnfb/privsep.sock
Oct 14 08:24:08 np0005486759.ooo.test sudo[65849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:08 np0005486759.ooo.test sudo[65849]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:09 np0005486759.ooo.test sudo[65860]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfxrfzhfe/privsep.sock
Oct 14 08:24:09 np0005486759.ooo.test sudo[65860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:09 np0005486759.ooo.test sudo[65860]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:09 np0005486759.ooo.test sudo[65873]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4lfv37yi/privsep.sock
Oct 14 08:24:09 np0005486759.ooo.test sudo[65873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:10 np0005486759.ooo.test sudo[65873]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:10 np0005486759.ooo.test sudo[65888]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9v7231hd/privsep.sock
Oct 14 08:24:10 np0005486759.ooo.test sudo[65888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:11 np0005486759.ooo.test sudo[65888]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:11 np0005486759.ooo.test sudo[65899]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3jl_5hi1/privsep.sock
Oct 14 08:24:11 np0005486759.ooo.test sudo[65899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:12 np0005486759.ooo.test sudo[65899]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:12 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:24:12 np0005486759.ooo.test recover_tripleo_nova_virtqemud[65906]: 47951
Oct 14 08:24:12 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:24:12 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:24:12 np0005486759.ooo.test sudo[65913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsu03msgx/privsep.sock
Oct 14 08:24:12 np0005486759.ooo.test sudo[65913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:13 np0005486759.ooo.test sudo[65913]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:13 np0005486759.ooo.test sudo[65924]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuyldz74t/privsep.sock
Oct 14 08:24:13 np0005486759.ooo.test sudo[65924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:13 np0005486759.ooo.test sudo[65924]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:24:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:24:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:24:14 np0005486759.ooo.test systemd[1]: tmp-crun.oDGJo5.mount: Deactivated successfully.
Oct 14 08:24:14 np0005486759.ooo.test podman[65931]: 2025-10-14 08:24:14.045872172 +0000 UTC m=+0.057508393 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:24:14 np0005486759.ooo.test podman[65931]: 2025-10-14 08:24:14.05191174 +0000 UTC m=+0.063547971 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, distribution-scope=public, release=1, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:24:14 np0005486759.ooo.test podman[65931]: unhealthy
Oct 14 08:24:14 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:24:14 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:24:14 np0005486759.ooo.test podman[65930]: 2025-10-14 08:24:14.095271633 +0000 UTC m=+0.110191445 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, io.buildah.version=1.33.12, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, managed_by=tripleo_ansible, version=17.1.9, release=1, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4)
Oct 14 08:24:14 np0005486759.ooo.test podman[65930]: 2025-10-14 08:24:14.12323416 +0000 UTC m=+0.138154012 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, 
name=rhosp17/openstack-cron, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, maintainer=OpenStack TripleO Team)
Oct 14 08:24:14 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:24:14 np0005486759.ooo.test sudo[65984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwl51l0rz/privsep.sock
Oct 14 08:24:14 np0005486759.ooo.test sudo[65984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:14 np0005486759.ooo.test podman[65937]: 2025-10-14 08:24:14.207186871 +0000 UTC m=+0.215423346 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, version=17.1.9, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Oct 14 08:24:14 np0005486759.ooo.test podman[65937]: 2025-10-14 08:24:14.248038277 +0000 UTC m=+0.256274802 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, 
container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Oct 14 08:24:14 np0005486759.ooo.test podman[65937]: unhealthy
Oct 14 08:24:14 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:24:14 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:24:14 np0005486759.ooo.test sudo[65984]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:15 np0005486759.ooo.test sudo[66005]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplmn0wihh/privsep.sock
Oct 14 08:24:15 np0005486759.ooo.test sudo[66005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:15 np0005486759.ooo.test sudo[66005]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:15 np0005486759.ooo.test sudo[66022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3kf3sv_h/privsep.sock
Oct 14 08:24:15 np0005486759.ooo.test sudo[66022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:16 np0005486759.ooo.test sudo[66022]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:16 np0005486759.ooo.test sudo[66033]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq7weqo3w/privsep.sock
Oct 14 08:24:16 np0005486759.ooo.test sudo[66033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:17 np0005486759.ooo.test sudo[66033]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:17 np0005486759.ooo.test sudo[66044]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprjryuv3n/privsep.sock
Oct 14 08:24:17 np0005486759.ooo.test sudo[66044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:18 np0005486759.ooo.test sudo[66044]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:18 np0005486759.ooo.test sudo[66055]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3vzpmh5h/privsep.sock
Oct 14 08:24:18 np0005486759.ooo.test sudo[66055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:24:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:24:18 np0005486759.ooo.test systemd[1]: tmp-crun.QtLHfh.mount: Deactivated successfully.
Oct 14 08:24:18 np0005486759.ooo.test systemd[1]: tmp-crun.nDsGwO.mount: Deactivated successfully.
Oct 14 08:24:18 np0005486759.ooo.test podman[66057]: 2025-10-14 08:24:18.621666003 +0000 UTC m=+0.098846134 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:24:18 np0005486759.ooo.test podman[66058]: 2025-10-14 08:24:18.585551924 +0000 UTC m=+0.065407377 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.9, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller)
Oct 14 08:24:18 np0005486759.ooo.test podman[66058]: 2025-10-14 08:24:18.667254576 +0000 UTC m=+0.147110019 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, io.openshift.expose-services=, batch=17.1_20250721.1)
Oct 14 08:24:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:24:18 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:24:18 np0005486759.ooo.test podman[66057]: 2025-10-14 08:24:18.681487077 +0000 UTC m=+0.158667158 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:24:18 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:24:18 np0005486759.ooo.test podman[66108]: 2025-10-14 08:24:18.749581196 +0000 UTC m=+0.072437714 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, version=17.1.9, vcs-type=git, release=1, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Oct 14 08:24:19 np0005486759.ooo.test sudo[66055]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:19 np0005486759.ooo.test podman[66108]: 2025-10-14 08:24:19.092158723 +0000 UTC m=+0.415015191 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, 
managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Oct 14 08:24:19 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:24:19 np0005486759.ooo.test sudo[66138]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_9jq6wu6/privsep.sock
Oct 14 08:24:19 np0005486759.ooo.test sudo[66138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:19 np0005486759.ooo.test sudo[66138]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:20 np0005486759.ooo.test sudo[66149]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbbil6wem/privsep.sock
Oct 14 08:24:20 np0005486759.ooo.test sudo[66149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:20 np0005486759.ooo.test sudo[66149]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:21 np0005486759.ooo.test sudo[66165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxgzp3n_l/privsep.sock
Oct 14 08:24:21 np0005486759.ooo.test sudo[66165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:21 np0005486759.ooo.test sudo[66165]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:21 np0005486759.ooo.test sudo[66177]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpob568uhj/privsep.sock
Oct 14 08:24:21 np0005486759.ooo.test sudo[66177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:22 np0005486759.ooo.test sudo[66177]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:22 np0005486759.ooo.test sudo[66188]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn2bjw7c1/privsep.sock
Oct 14 08:24:22 np0005486759.ooo.test sudo[66188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:23 np0005486759.ooo.test sudo[66188]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:23 np0005486759.ooo.test sudo[66199]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfd952alg/privsep.sock
Oct 14 08:24:23 np0005486759.ooo.test sudo[66199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:24 np0005486759.ooo.test sudo[66199]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:24 np0005486759.ooo.test sudo[66210]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1nwsrmg6/privsep.sock
Oct 14 08:24:24 np0005486759.ooo.test sudo[66210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:25 np0005486759.ooo.test sudo[66210]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:25 np0005486759.ooo.test sudo[66221]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb171b_5w/privsep.sock
Oct 14 08:24:25 np0005486759.ooo.test sudo[66221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:25 np0005486759.ooo.test sudo[66221]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:26 np0005486759.ooo.test sudo[66232]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpetbajqc1/privsep.sock
Oct 14 08:24:26 np0005486759.ooo.test sudo[66232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:26 np0005486759.ooo.test sudo[66232]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:27 np0005486759.ooo.test sudo[66249]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy224tua_/privsep.sock
Oct 14 08:24:27 np0005486759.ooo.test sudo[66249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:27 np0005486759.ooo.test sudo[66249]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:27 np0005486759.ooo.test sudo[66260]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqat0k8fm/privsep.sock
Oct 14 08:24:27 np0005486759.ooo.test sudo[66260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:28 np0005486759.ooo.test sudo[66260]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:28 np0005486759.ooo.test sudo[66271]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcyrdf6v5/privsep.sock
Oct 14 08:24:28 np0005486759.ooo.test sudo[66271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:29 np0005486759.ooo.test sudo[66271]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:24:29 np0005486759.ooo.test systemd[1]: tmp-crun.2dfjhc.mount: Deactivated successfully.
Oct 14 08:24:29 np0005486759.ooo.test podman[66277]: 2025-10-14 08:24:29.291398267 +0000 UTC m=+0.070589478 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, release=1, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, 
vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:24:29 np0005486759.ooo.test sudo[66309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjg8a4asc/privsep.sock
Oct 14 08:24:29 np0005486759.ooo.test sudo[66309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:29 np0005486759.ooo.test podman[66277]: 2025-10-14 08:24:29.484386108 +0000 UTC m=+0.263577329 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, version=17.1.9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, release=1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, io.buildah.version=1.33.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:24:29 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:24:30 np0005486759.ooo.test sudo[66309]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:30 np0005486759.ooo.test sudo[66320]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplqh58cmq/privsep.sock
Oct 14 08:24:30 np0005486759.ooo.test sudo[66320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:31 np0005486759.ooo.test sudo[66320]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: tmp-crun.Pao29J.mount: Deactivated successfully.
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: tmp-crun.i6RBO2.mount: Deactivated successfully.
Oct 14 08:24:31 np0005486759.ooo.test podman[66328]: 2025-10-14 08:24:31.171536749 +0000 UTC m=+0.119052001 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, release=2, 
vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9)
Oct 14 08:24:31 np0005486759.ooo.test podman[66325]: 2025-10-14 08:24:31.125289285 +0000 UTC m=+0.081419634 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:24:31 np0005486759.ooo.test podman[66327]: 2025-10-14 08:24:31.19935164 +0000 UTC m=+0.149050480 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step5, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1)
Oct 14 08:24:31 np0005486759.ooo.test podman[66328]: 2025-10-14 08:24:31.206715878 +0000 UTC m=+0.154231100 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:24:31 np0005486759.ooo.test podman[66327]: 2025-10-14 08:24:31.252147616 +0000 UTC m=+0.201846526 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:24:31 np0005486759.ooo.test podman[66325]: 2025-10-14 08:24:31.258686929 +0000 UTC m=+0.214817288 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red 
Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step3)
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:24:31 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:24:31 np0005486759.ooo.test sudo[66396]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqawyjpw3/privsep.sock
Oct 14 08:24:31 np0005486759.ooo.test sudo[66396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:31 np0005486759.ooo.test sudo[66396]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:32 np0005486759.ooo.test sudo[66413]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo21eo2al/privsep.sock
Oct 14 08:24:32 np0005486759.ooo.test sudo[66413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:32 np0005486759.ooo.test sudo[66413]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:33 np0005486759.ooo.test sudo[66424]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_cyfd4mz/privsep.sock
Oct 14 08:24:33 np0005486759.ooo.test sudo[66424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:33 np0005486759.ooo.test sudo[66424]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:34 np0005486759.ooo.test sudo[66435]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2kep_t0l/privsep.sock
Oct 14 08:24:34 np0005486759.ooo.test sudo[66435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:34 np0005486759.ooo.test sudo[66435]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:34 np0005486759.ooo.test sudo[66446]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx7lvz2ce/privsep.sock
Oct 14 08:24:34 np0005486759.ooo.test sudo[66446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:35 np0005486759.ooo.test sudo[66446]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:35 np0005486759.ooo.test sudo[66457]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsw0auvjk/privsep.sock
Oct 14 08:24:35 np0005486759.ooo.test sudo[66457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:36 np0005486759.ooo.test sudo[66457]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:36 np0005486759.ooo.test sudo[66468]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsgpb90bd/privsep.sock
Oct 14 08:24:36 np0005486759.ooo.test sudo[66468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:37 np0005486759.ooo.test sudo[66468]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:37 np0005486759.ooo.test sudo[66485]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe0gi37vi/privsep.sock
Oct 14 08:24:37 np0005486759.ooo.test sudo[66485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:37 np0005486759.ooo.test sudo[66485]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:38 np0005486759.ooo.test sudo[66496]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplh5t084g/privsep.sock
Oct 14 08:24:38 np0005486759.ooo.test sudo[66496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:38 np0005486759.ooo.test sudo[66496]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:39 np0005486759.ooo.test sudo[66507]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_zcv4enr/privsep.sock
Oct 14 08:24:39 np0005486759.ooo.test sudo[66507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:39 np0005486759.ooo.test sudo[66507]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:40 np0005486759.ooo.test sudo[66518]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqx5jgpqr/privsep.sock
Oct 14 08:24:40 np0005486759.ooo.test sudo[66518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:40 np0005486759.ooo.test sudo[66518]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:40 np0005486759.ooo.test sudo[66529]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzpic6cis/privsep.sock
Oct 14 08:24:40 np0005486759.ooo.test sudo[66529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:41 np0005486759.ooo.test sudo[66529]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:41 np0005486759.ooo.test sudo[66540]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5zjm79m2/privsep.sock
Oct 14 08:24:41 np0005486759.ooo.test sudo[66540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:42 np0005486759.ooo.test sudo[66540]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:42 np0005486759.ooo.test sudo[66553]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph9glruy9/privsep.sock
Oct 14 08:24:42 np0005486759.ooo.test sudo[66553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:43 np0005486759.ooo.test sudo[66553]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:43 np0005486759.ooo.test sudo[66568]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpikeu1j0k/privsep.sock
Oct 14 08:24:43 np0005486759.ooo.test sudo[66568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:43 np0005486759.ooo.test sudo[66568]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:44 np0005486759.ooo.test sudo[66579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb1joj0t1/privsep.sock
Oct 14 08:24:44 np0005486759.ooo.test sudo[66579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:24:44 np0005486759.ooo.test podman[66581]: 2025-10-14 08:24:44.371028073 +0000 UTC m=+0.094395126 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, build-date=2025-07-21T13:07:52)
Oct 14 08:24:44 np0005486759.ooo.test podman[66581]: 2025-10-14 08:24:44.406316487 +0000 UTC m=+0.129683520 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, version=17.1.9, vendor=Red Hat, Inc., vcs-type=git, container_name=logrotate_crond, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, 
io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1)
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: tmp-crun.i2E0DX.mount: Deactivated successfully.
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:24:44 np0005486759.ooo.test podman[66582]: 2025-10-14 08:24:44.436447461 +0000 UTC m=+0.156174171 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33)
Oct 14 08:24:44 np0005486759.ooo.test podman[66583]: 2025-10-14 08:24:44.469577227 +0000 UTC m=+0.182350172 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T15:29:47, release=1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 08:24:44 np0005486759.ooo.test podman[66582]: 2025-10-14 08:24:44.479208255 +0000 UTC m=+0.198934935 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack 
osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, version=17.1.9, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:24:44 np0005486759.ooo.test podman[66582]: unhealthy
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:24:44 np0005486759.ooo.test podman[66583]: 2025-10-14 08:24:44.507335837 +0000 UTC m=+0.220108782 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.9, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1)
Oct 14 08:24:44 np0005486759.ooo.test podman[66583]: unhealthy
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:24:44 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:24:44 np0005486759.ooo.test sudo[66579]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:45 np0005486759.ooo.test sudo[66650]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpabbqej2g/privsep.sock
Oct 14 08:24:45 np0005486759.ooo.test sudo[66650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:45 np0005486759.ooo.test systemd[1]: tmp-crun.dGm0mo.mount: Deactivated successfully.
Oct 14 08:24:45 np0005486759.ooo.test sudo[66650]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:45 np0005486759.ooo.test sudo[66661]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqfz3ic5r/privsep.sock
Oct 14 08:24:45 np0005486759.ooo.test sudo[66661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:46 np0005486759.ooo.test sudo[66661]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:46 np0005486759.ooo.test sudo[66672]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq892ga_y/privsep.sock
Oct 14 08:24:46 np0005486759.ooo.test sudo[66672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:47 np0005486759.ooo.test sudo[66672]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:47 np0005486759.ooo.test sudo[66683]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxkb1td5s/privsep.sock
Oct 14 08:24:47 np0005486759.ooo.test sudo[66683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:48 np0005486759.ooo.test sudo[66683]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:48 np0005486759.ooo.test sudo[66700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuj_g8t2c/privsep.sock
Oct 14 08:24:48 np0005486759.ooo.test sudo[66700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:49 np0005486759.ooo.test sudo[66700]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:24:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:24:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:24:49 np0005486759.ooo.test podman[66708]: 2025-10-14 08:24:49.228298106 +0000 UTC m=+0.077474281 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1, build-date=2025-07-21T13:28:44, distribution-scope=public, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1)
Oct 14 08:24:49 np0005486759.ooo.test podman[66705]: 2025-10-14 08:24:49.28650887 +0000 UTC m=+0.137357217 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1, distribution-scope=public, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1)
Oct 14 08:24:49 np0005486759.ooo.test systemd[1]: tmp-crun.iOGsyD.mount: Deactivated successfully.
Oct 14 08:24:49 np0005486759.ooo.test podman[66707]: 2025-10-14 08:24:49.349942696 +0000 UTC m=+0.198252095 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, version=17.1.9, config_id=tripleo_step4, release=1, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:24:49 np0005486759.ooo.test podman[66708]: 2025-10-14 08:24:49.373732553 +0000 UTC m=+0.222908678 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, release=1, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:24:49 np0005486759.ooo.test sudo[66777]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkxlfqv3o/privsep.sock
Oct 14 08:24:49 np0005486759.ooo.test sudo[66777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:24:49 np0005486759.ooo.test podman[66707]: 2025-10-14 08:24:49.407249621 +0000 UTC m=+0.255558990 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step4, release=1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:24:49 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:24:49 np0005486759.ooo.test podman[66705]: 2025-10-14 08:24:49.639355874 +0000 UTC m=+0.490204481 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, 
com.redhat.component=openstack-nova-compute-container, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:24:49 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:24:49 np0005486759.ooo.test sudo[66777]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:50 np0005486759.ooo.test sudo[66792]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkwlmauc7/privsep.sock
Oct 14 08:24:50 np0005486759.ooo.test sudo[66792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:50 np0005486759.ooo.test sudo[66792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:50 np0005486759.ooo.test sudo[66803]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpef4_ih7k/privsep.sock
Oct 14 08:24:50 np0005486759.ooo.test sudo[66803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:51 np0005486759.ooo.test sudo[66803]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:51 np0005486759.ooo.test sudo[66814]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpycn4ygom/privsep.sock
Oct 14 08:24:51 np0005486759.ooo.test sudo[66814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:52 np0005486759.ooo.test sudo[66814]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:52 np0005486759.ooo.test sudo[66825]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppomd330s/privsep.sock
Oct 14 08:24:52 np0005486759.ooo.test sudo[66825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:53 np0005486759.ooo.test sudo[66825]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:53 np0005486759.ooo.test sudo[66842]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9ts8a_ym/privsep.sock
Oct 14 08:24:53 np0005486759.ooo.test sudo[66842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:54 np0005486759.ooo.test sudo[66842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:54 np0005486759.ooo.test sudo[66853]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsrhv_1qb/privsep.sock
Oct 14 08:24:54 np0005486759.ooo.test sudo[66853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:55 np0005486759.ooo.test sudo[66853]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:55 np0005486759.ooo.test sudo[66864]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn79e8kko/privsep.sock
Oct 14 08:24:55 np0005486759.ooo.test sudo[66864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:55 np0005486759.ooo.test sudo[66864]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:56 np0005486759.ooo.test sudo[66875]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc_kue_bd/privsep.sock
Oct 14 08:24:56 np0005486759.ooo.test sudo[66875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:56 np0005486759.ooo.test sudo[66875]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:57 np0005486759.ooo.test sudo[66886]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmtzw3p1g/privsep.sock
Oct 14 08:24:57 np0005486759.ooo.test sudo[66886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:57 np0005486759.ooo.test sudo[66886]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:58 np0005486759.ooo.test sudo[66897]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1lvfg4ae/privsep.sock
Oct 14 08:24:58 np0005486759.ooo.test sudo[66897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:58 np0005486759.ooo.test sudo[66897]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:58 np0005486759.ooo.test sudo[66913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphdv81efl/privsep.sock
Oct 14 08:24:58 np0005486759.ooo.test sudo[66913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:59 np0005486759.ooo.test sudo[66913]: pam_unix(sudo:session): session closed for user root
Oct 14 08:24:59 np0005486759.ooo.test sudo[66925]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6uohtuv8/privsep.sock
Oct 14 08:24:59 np0005486759.ooo.test sudo[66925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:24:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:24:59 np0005486759.ooo.test podman[66927]: 2025-10-14 08:24:59.826104088 +0000 UTC m=+0.075894668 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=)
Oct 14 08:25:00 np0005486759.ooo.test podman[66927]: 2025-10-14 08:25:00.003299343 +0000 UTC m=+0.253089953 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1)
Oct 14 08:25:00 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:25:00 np0005486759.ooo.test sudo[66925]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:00 np0005486759.ooo.test sudo[66964]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvltn_xvm/privsep.sock
Oct 14 08:25:00 np0005486759.ooo.test sudo[66964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:01 np0005486759.ooo.test sudo[66964]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:01 np0005486759.ooo.test sudo[66975]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_1xq2ubw/privsep.sock
Oct 14 08:25:01 np0005486759.ooo.test sudo[66975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:25:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:25:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:25:01 np0005486759.ooo.test podman[66979]: 2025-10-14 08:25:01.428125196 +0000 UTC m=+0.059962203 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-type=git, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.buildah.version=1.33.12)
Oct 14 08:25:01 np0005486759.ooo.test systemd[1]: tmp-crun.8bABOn.mount: Deactivated successfully.
Oct 14 08:25:01 np0005486759.ooo.test podman[66977]: 2025-10-14 08:25:01.460015147 +0000 UTC m=+0.091830244 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, maintainer=OpenStack TripleO Team)
Oct 14 08:25:01 np0005486759.ooo.test podman[66977]: 2025-10-14 08:25:01.492309359 +0000 UTC m=+0.124124526 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1)
Oct 14 08:25:01 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:25:01 np0005486759.ooo.test podman[66979]: 2025-10-14 08:25:01.50874178 +0000 UTC m=+0.140578777 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, release=2, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, container_name=collectd)
Oct 14 08:25:01 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:25:01 np0005486759.ooo.test podman[66978]: 2025-10-14 08:25:01.557844144 +0000 UTC m=+0.183820659 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:25:01 np0005486759.ooo.test podman[66978]: 2025-10-14 08:25:01.575355599 +0000 UTC m=+0.201332084 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git)
Oct 14 08:25:01 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:25:01 np0005486759.ooo.test sudo[66975]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:02 np0005486759.ooo.test sudo[67050]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmido0rng/privsep.sock
Oct 14 08:25:02 np0005486759.ooo.test sudo[67050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:02 np0005486759.ooo.test sudo[67050]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:03 np0005486759.ooo.test sudo[67061]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8s0k2k6r/privsep.sock
Oct 14 08:25:03 np0005486759.ooo.test sudo[67061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:03 np0005486759.ooo.test sudo[67061]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:03 np0005486759.ooo.test sudo[67072]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbuo57a8f/privsep.sock
Oct 14 08:25:03 np0005486759.ooo.test sudo[67072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:04 np0005486759.ooo.test sudo[67072]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:04 np0005486759.ooo.test sudo[67089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnssqcqor/privsep.sock
Oct 14 08:25:04 np0005486759.ooo.test sudo[67089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:05 np0005486759.ooo.test sudo[67089]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:05 np0005486759.ooo.test sudo[67100]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz8aux6cy/privsep.sock
Oct 14 08:25:05 np0005486759.ooo.test sudo[67100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:06 np0005486759.ooo.test sudo[67100]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:06 np0005486759.ooo.test sudo[67111]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyahwtmrm/privsep.sock
Oct 14 08:25:06 np0005486759.ooo.test sudo[67111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:07 np0005486759.ooo.test sudo[67111]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:07 np0005486759.ooo.test sudo[67122]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg4e5pwz6/privsep.sock
Oct 14 08:25:07 np0005486759.ooo.test sudo[67122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:07 np0005486759.ooo.test sudo[67122]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:08 np0005486759.ooo.test sudo[67133]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm8ak_n29/privsep.sock
Oct 14 08:25:08 np0005486759.ooo.test sudo[67133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:08 np0005486759.ooo.test sudo[67133]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:09 np0005486759.ooo.test sudo[67144]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn81a99ga/privsep.sock
Oct 14 08:25:09 np0005486759.ooo.test sudo[67144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:09 np0005486759.ooo.test sudo[67144]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:10 np0005486759.ooo.test sudo[67161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpag8bdf1_/privsep.sock
Oct 14 08:25:10 np0005486759.ooo.test sudo[67161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:10 np0005486759.ooo.test sudo[67161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:10 np0005486759.ooo.test sudo[67172]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplt12mrm9/privsep.sock
Oct 14 08:25:10 np0005486759.ooo.test sudo[67172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:11 np0005486759.ooo.test sudo[67172]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:11 np0005486759.ooo.test sudo[67183]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnbarw_j4/privsep.sock
Oct 14 08:25:11 np0005486759.ooo.test sudo[67183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:12 np0005486759.ooo.test sudo[67183]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:12 np0005486759.ooo.test sudo[67194]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfinv4kex/privsep.sock
Oct 14 08:25:12 np0005486759.ooo.test sudo[67194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:13 np0005486759.ooo.test sudo[67194]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:13 np0005486759.ooo.test sudo[67205]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1yu2f6gu/privsep.sock
Oct 14 08:25:13 np0005486759.ooo.test sudo[67205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:13 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:25:13 np0005486759.ooo.test recover_tripleo_nova_virtqemud[67208]: 47951
Oct 14 08:25:13 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:25:13 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:25:14 np0005486759.ooo.test sudo[67205]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:14 np0005486759.ooo.test sudo[67218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2s2qex8d/privsep.sock
Oct 14 08:25:14 np0005486759.ooo.test sudo[67218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:15 np0005486759.ooo.test sudo[67218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:25:15 np0005486759.ooo.test podman[67229]: 2025-10-14 08:25:15.151031606 +0000 UTC m=+0.086135036 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, release=1, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.9, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:07:52, architecture=x86_64)
Oct 14 08:25:15 np0005486759.ooo.test podman[67229]: 2025-10-14 08:25:15.187322073 +0000 UTC m=+0.122425523 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, release=1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:07:52)
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: tmp-crun.xwrPOp.mount: Deactivated successfully.
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:25:15 np0005486759.ooo.test podman[67231]: 2025-10-14 08:25:15.204738584 +0000 UTC m=+0.136689657 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1)
Oct 14 08:25:15 np0005486759.ooo.test podman[67230]: 2025-10-14 08:25:15.246689197 +0000 UTC m=+0.181791497 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, version=17.1.9, distribution-scope=public, maintainer=OpenStack TripleO Team)
Oct 14 08:25:15 np0005486759.ooo.test podman[67230]: 2025-10-14 08:25:15.256887104 +0000 UTC m=+0.191989394 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_compute, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:25:15 np0005486759.ooo.test podman[67230]: unhealthy
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:25:15 np0005486759.ooo.test podman[67231]: 2025-10-14 08:25:15.270693702 +0000 UTC m=+0.202644755 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Oct 14 08:25:15 np0005486759.ooo.test sudo[67290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppk6dauzf/privsep.sock
Oct 14 08:25:15 np0005486759.ooo.test sudo[67290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:15 np0005486759.ooo.test podman[67231]: unhealthy
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:25:15 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:25:15 np0005486759.ooo.test sudo[67290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:16 np0005486759.ooo.test sudo[67301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj0qh9x2g/privsep.sock
Oct 14 08:25:16 np0005486759.ooo.test sudo[67301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:16 np0005486759.ooo.test sudo[67301]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:17 np0005486759.ooo.test sudo[67312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg8nnwjra/privsep.sock
Oct 14 08:25:17 np0005486759.ooo.test sudo[67312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:17 np0005486759.ooo.test sudo[67312]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:17 np0005486759.ooo.test sudo[67323]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpje6m3jyc/privsep.sock
Oct 14 08:25:17 np0005486759.ooo.test sudo[67323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:18 np0005486759.ooo.test sudo[67323]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:18 np0005486759.ooo.test sudo[67334]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_sd3re30/privsep.sock
Oct 14 08:25:18 np0005486759.ooo.test sudo[67334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:19 np0005486759.ooo.test sudo[67334]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:25:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:25:19 np0005486759.ooo.test systemd[1]: tmp-crun.SHEnWa.mount: Deactivated successfully.
Oct 14 08:25:19 np0005486759.ooo.test podman[67339]: 2025-10-14 08:25:19.543198493 +0000 UTC m=+0.090170163 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red 
Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-07-21T16:28:53, release=1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:25:19 np0005486759.ooo.test podman[67339]: 2025-10-14 08:25:19.576364412 +0000 UTC m=+0.123336132 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 08:25:19 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:25:19 np0005486759.ooo.test podman[67341]: 2025-10-14 08:25:19.586943321 +0000 UTC m=+0.134079906 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, tcib_managed=true, release=1, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Oct 14 08:25:19 np0005486759.ooo.test podman[67341]: 2025-10-14 08:25:19.667749701 +0000 UTC m=+0.214886306 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., container_name=ovn_controller, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:25:19 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:25:19 np0005486759.ooo.test sudo[67390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc44vdzma/privsep.sock
Oct 14 08:25:19 np0005486759.ooo.test sudo[67390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:25:19 np0005486759.ooo.test podman[67391]: 2025-10-14 08:25:19.795424477 +0000 UTC m=+0.068151709 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
version=17.1.9, io.buildah.version=1.33.12, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:25:20 np0005486759.ooo.test podman[67391]: 2025-10-14 08:25:20.158019688 +0000 UTC m=+0.430746950 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.33.12, container_name=nova_migration_target, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.component=openstack-nova-compute-container, version=17.1.9, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:25:20 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:25:20 np0005486759.ooo.test sudo[67390]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:20 np0005486759.ooo.test systemd[1]: tmp-crun.EU1iiZ.mount: Deactivated successfully.
Oct 14 08:25:20 np0005486759.ooo.test sudo[67432]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu826qlfk/privsep.sock
Oct 14 08:25:20 np0005486759.ooo.test sudo[67432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:21 np0005486759.ooo.test sudo[67432]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:21 np0005486759.ooo.test sudo[67443]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpipw8f84t/privsep.sock
Oct 14 08:25:21 np0005486759.ooo.test sudo[67443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:22 np0005486759.ooo.test sudo[67443]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:22 np0005486759.ooo.test sudo[67454]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnofvqxoj/privsep.sock
Oct 14 08:25:22 np0005486759.ooo.test sudo[67454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:22 np0005486759.ooo.test sudo[67454]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:23 np0005486759.ooo.test sudo[67465]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_9_v10ye/privsep.sock
Oct 14 08:25:23 np0005486759.ooo.test sudo[67465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:23 np0005486759.ooo.test sudo[67465]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:23 np0005486759.ooo.test sudo[67476]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqugqilq8/privsep.sock
Oct 14 08:25:23 np0005486759.ooo.test sudo[67476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:24 np0005486759.ooo.test sudo[67476]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:24 np0005486759.ooo.test sudo[67487]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa82geg7m/privsep.sock
Oct 14 08:25:24 np0005486759.ooo.test sudo[67487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:25 np0005486759.ooo.test sudo[67487]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:25 np0005486759.ooo.test sudo[67503]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4hg6azb1/privsep.sock
Oct 14 08:25:25 np0005486759.ooo.test sudo[67503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:26 np0005486759.ooo.test sudo[67503]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:26 np0005486759.ooo.test sudo[67515]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplthj_3dh/privsep.sock
Oct 14 08:25:26 np0005486759.ooo.test sudo[67515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:27 np0005486759.ooo.test sudo[67515]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:27 np0005486759.ooo.test sudo[67526]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps17y8rgz/privsep.sock
Oct 14 08:25:27 np0005486759.ooo.test sudo[67526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:28 np0005486759.ooo.test sudo[67526]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:28 np0005486759.ooo.test sudo[67537]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ak7n94v/privsep.sock
Oct 14 08:25:28 np0005486759.ooo.test sudo[67537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:29 np0005486759.ooo.test sudo[67537]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:29 np0005486759.ooo.test sudo[67548]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjx_1gltv/privsep.sock
Oct 14 08:25:29 np0005486759.ooo.test sudo[67548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:29 np0005486759.ooo.test sudo[67548]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:30 np0005486759.ooo.test sudo[67559]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc4s8r3zo/privsep.sock
Oct 14 08:25:30 np0005486759.ooo.test sudo[67559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:25:30 np0005486759.ooo.test podman[67561]: 2025-10-14 08:25:30.241573873 +0000 UTC m=+0.070805911 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, release=1, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:25:30 np0005486759.ooo.test podman[67561]: 2025-10-14 08:25:30.416337861 +0000 UTC m=+0.245569909 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=)
Oct 14 08:25:30 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:25:30 np0005486759.ooo.test sudo[67559]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:31 np0005486759.ooo.test sudo[67601]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp70ak0ecn/privsep.sock
Oct 14 08:25:31 np0005486759.ooo.test sudo[67601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:31 np0005486759.ooo.test sudo[67601]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:25:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:25:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:25:31 np0005486759.ooo.test podman[67613]: 2025-10-14 08:25:31.710055953 +0000 UTC m=+0.066758775 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step3, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, version=17.1.9, distribution-scope=public, container_name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, maintainer=OpenStack TripleO Team)
Oct 14 08:25:31 np0005486759.ooo.test podman[67612]: 2025-10-14 08:25:31.720098004 +0000 UTC m=+0.075433834 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, config_id=tripleo_step5, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.9, tcib_managed=true)
Oct 14 08:25:31 np0005486759.ooo.test podman[67612]: 2025-10-14 08:25:31.745243675 +0000 UTC m=+0.100579515 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=nova_compute, name=rhosp17/openstack-nova-compute)
Oct 14 08:25:31 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:25:31 np0005486759.ooo.test podman[67611]: 2025-10-14 08:25:31.769114697 +0000 UTC m=+0.130087372 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, container_name=iscsid, version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:27:15, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible)
Oct 14 08:25:31 np0005486759.ooo.test podman[67611]: 2025-10-14 08:25:31.782309407 +0000 UTC m=+0.143282042 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, 
io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:25:31 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:25:31 np0005486759.ooo.test podman[67613]: 2025-10-14 08:25:31.801197473 +0000 UTC m=+0.157900325 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, architecture=x86_64, distribution-scope=public, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., batch=17.1_20250721.1, 
container_name=collectd, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:25:31 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:25:31 np0005486759.ooo.test sudo[67681]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy4gycpsc/privsep.sock
Oct 14 08:25:31 np0005486759.ooo.test sudo[67681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:32 np0005486759.ooo.test sudo[67681]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:32 np0005486759.ooo.test sudo[67692]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp__smpv4_/privsep.sock
Oct 14 08:25:32 np0005486759.ooo.test sudo[67692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:32 np0005486759.ooo.test systemd[1]: tmp-crun.LD1xro.mount: Deactivated successfully.
Oct 14 08:25:33 np0005486759.ooo.test sudo[67692]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:33 np0005486759.ooo.test sudo[67703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoo38swgt/privsep.sock
Oct 14 08:25:33 np0005486759.ooo.test sudo[67703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:34 np0005486759.ooo.test sudo[67703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:34 np0005486759.ooo.test sudo[67714]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2clodmrn/privsep.sock
Oct 14 08:25:34 np0005486759.ooo.test sudo[67714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:35 np0005486759.ooo.test sudo[67714]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:35 np0005486759.ooo.test sudo[67725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6dyo8pec/privsep.sock
Oct 14 08:25:35 np0005486759.ooo.test sudo[67725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:35 np0005486759.ooo.test sudo[67725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:36 np0005486759.ooo.test sudo[67736]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw2ux1krq/privsep.sock
Oct 14 08:25:36 np0005486759.ooo.test sudo[67736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:36 np0005486759.ooo.test sudo[67736]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:37 np0005486759.ooo.test sudo[67753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxg2_ibqj/privsep.sock
Oct 14 08:25:37 np0005486759.ooo.test sudo[67753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:37 np0005486759.ooo.test sudo[67753]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:38 np0005486759.ooo.test sudo[67764]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4frahwbe/privsep.sock
Oct 14 08:25:38 np0005486759.ooo.test sudo[67764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:38 np0005486759.ooo.test sudo[67764]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:39 np0005486759.ooo.test sudo[67775]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa4cp6v8x/privsep.sock
Oct 14 08:25:39 np0005486759.ooo.test sudo[67775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:39 np0005486759.ooo.test sudo[67775]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:39 np0005486759.ooo.test sudo[67786]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp98zye0g2/privsep.sock
Oct 14 08:25:39 np0005486759.ooo.test sudo[67786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:40 np0005486759.ooo.test sudo[67786]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:40 np0005486759.ooo.test sudo[67797]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl7qlifu5/privsep.sock
Oct 14 08:25:40 np0005486759.ooo.test sudo[67797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:41 np0005486759.ooo.test sudo[67797]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:41 np0005486759.ooo.test sudo[67808]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8i_m93qg/privsep.sock
Oct 14 08:25:41 np0005486759.ooo.test sudo[67808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:42 np0005486759.ooo.test sudo[67808]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:42 np0005486759.ooo.test sudo[67825]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzfpmzr1d/privsep.sock
Oct 14 08:25:42 np0005486759.ooo.test sudo[67825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:43 np0005486759.ooo.test sudo[67825]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:43 np0005486759.ooo.test sudo[67836]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfsy5ert_/privsep.sock
Oct 14 08:25:43 np0005486759.ooo.test sudo[67836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:44 np0005486759.ooo.test sudo[67836]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:44 np0005486759.ooo.test sudo[67847]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8oja396s/privsep.sock
Oct 14 08:25:44 np0005486759.ooo.test sudo[67847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:45 np0005486759.ooo.test sudo[67847]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:25:45 np0005486759.ooo.test podman[67855]: 2025-10-14 08:25:45.45121951 +0000 UTC m=+0.076068874 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:25:45 np0005486759.ooo.test sudo[67898]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphkguirbl/privsep.sock
Oct 14 08:25:45 np0005486759.ooo.test sudo[67898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:45 np0005486759.ooo.test podman[67855]: 2025-10-14 08:25:45.494442142 +0000 UTC m=+0.119291516 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, tcib_managed=true, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.expose-services=)
Oct 14 08:25:45 np0005486759.ooo.test podman[67855]: unhealthy
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:25:45 np0005486759.ooo.test podman[67854]: 2025-10-14 08:25:45.514861017 +0000 UTC m=+0.141141366 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, distribution-scope=public, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:25:45 np0005486759.ooo.test podman[67854]: 2025-10-14 08:25:45.522352069 +0000 UTC m=+0.148632448 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, vcs-type=git, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c)
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: tmp-crun.trZYoN.mount: Deactivated successfully.
Oct 14 08:25:45 np0005486759.ooo.test podman[67856]: 2025-10-14 08:25:45.626097361 +0000 UTC m=+0.245567498 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:25:45 np0005486759.ooo.test podman[67856]: 2025-10-14 08:25:45.636724841 +0000 UTC m=+0.256194968 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:25:45 np0005486759.ooo.test podman[67856]: unhealthy
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:25:45 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:25:46 np0005486759.ooo.test sudo[67898]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:46 np0005486759.ooo.test sudo[67928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmm3d_qyo/privsep.sock
Oct 14 08:25:46 np0005486759.ooo.test sudo[67928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:46 np0005486759.ooo.test sudo[67928]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:47 np0005486759.ooo.test sudo[67939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj7xe5dj5/privsep.sock
Oct 14 08:25:47 np0005486759.ooo.test sudo[67939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:47 np0005486759.ooo.test sudo[67939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:48 np0005486759.ooo.test sudo[67956]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7rqzmc16/privsep.sock
Oct 14 08:25:48 np0005486759.ooo.test sudo[67956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:48 np0005486759.ooo.test sudo[67956]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:48 np0005486759.ooo.test sudo[67967]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk8cs0by8/privsep.sock
Oct 14 08:25:48 np0005486759.ooo.test sudo[67967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:49 np0005486759.ooo.test sudo[67967]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:49 np0005486759.ooo.test sudo[67978]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgzumesne/privsep.sock
Oct 14 08:25:49 np0005486759.ooo.test sudo[67978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:25:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:25:49 np0005486759.ooo.test podman[67981]: 2025-10-14 08:25:49.788535102 +0000 UTC m=+0.074649920 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, release=1, distribution-scope=public, container_name=ovn_controller, batch=17.1_20250721.1, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.9)
Oct 14 08:25:49 np0005486759.ooo.test podman[67981]: 2025-10-14 08:25:49.83063475 +0000 UTC m=+0.116749578 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Oct 14 08:25:49 np0005486759.ooo.test systemd[1]: tmp-crun.If7S42.mount: Deactivated successfully.
Oct 14 08:25:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:25:49 np0005486759.ooo.test podman[67980]: 2025-10-14 08:25:49.852803558 +0000 UTC m=+0.139423321 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=)
Oct 14 08:25:49 np0005486759.ooo.test podman[67980]: 2025-10-14 08:25:49.914696271 +0000 UTC m=+0.201316064 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible)
Oct 14 08:25:49 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:25:50 np0005486759.ooo.test sudo[67978]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:25:50 np0005486759.ooo.test podman[68029]: 2025-10-14 08:25:50.384039329 +0000 UTC m=+0.062954917 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-nova-compute-container, vcs-type=git, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:25:50 np0005486759.ooo.test sudo[68057]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2_4vc7vu/privsep.sock
Oct 14 08:25:50 np0005486759.ooo.test sudo[68057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:50 np0005486759.ooo.test systemd[1]: tmp-crun.gKcB2o.mount: Deactivated successfully.
Oct 14 08:25:50 np0005486759.ooo.test podman[68029]: 2025-10-14 08:25:50.786532789 +0000 UTC m=+0.465448377 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, container_name=nova_migration_target, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, 
vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:25:50 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:25:51 np0005486759.ooo.test sudo[68057]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:51 np0005486759.ooo.test sudo[68069]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpted4h7f3/privsep.sock
Oct 14 08:25:51 np0005486759.ooo.test sudo[68069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:52 np0005486759.ooo.test sudo[68069]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:52 np0005486759.ooo.test sudo[68080]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9ziu2mhu/privsep.sock
Oct 14 08:25:52 np0005486759.ooo.test sudo[68080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:53 np0005486759.ooo.test sudo[68080]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:53 np0005486759.ooo.test sudo[68097]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2__ywpnd/privsep.sock
Oct 14 08:25:53 np0005486759.ooo.test sudo[68097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:53 np0005486759.ooo.test sudo[68097]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:54 np0005486759.ooo.test sudo[68108]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjsmlbf4i/privsep.sock
Oct 14 08:25:54 np0005486759.ooo.test sudo[68108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:54 np0005486759.ooo.test sudo[68108]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:55 np0005486759.ooo.test sudo[68119]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp66j1waen/privsep.sock
Oct 14 08:25:55 np0005486759.ooo.test sudo[68119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:55 np0005486759.ooo.test sudo[68119]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:56 np0005486759.ooo.test sudo[68130]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp86zda4cm/privsep.sock
Oct 14 08:25:56 np0005486759.ooo.test sudo[68130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:56 np0005486759.ooo.test sudo[68130]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:56 np0005486759.ooo.test sudo[68141]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_awqs7nh/privsep.sock
Oct 14 08:25:56 np0005486759.ooo.test sudo[68141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:57 np0005486759.ooo.test sudo[68141]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:57 np0005486759.ooo.test sudo[68152]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqel2ap5e/privsep.sock
Oct 14 08:25:57 np0005486759.ooo.test sudo[68152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:58 np0005486759.ooo.test sudo[68152]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:58 np0005486759.ooo.test sudo[68169]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgw43fbjg/privsep.sock
Oct 14 08:25:58 np0005486759.ooo.test sudo[68169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:25:59 np0005486759.ooo.test sudo[68169]: pam_unix(sudo:session): session closed for user root
Oct 14 08:25:59 np0005486759.ooo.test sudo[68180]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyk4xtkhf/privsep.sock
Oct 14 08:25:59 np0005486759.ooo.test sudo[68180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:00 np0005486759.ooo.test sudo[68180]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:00 np0005486759.ooo.test sudo[68191]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaw1tb8yb/privsep.sock
Oct 14 08:26:00 np0005486759.ooo.test sudo[68191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:26:00 np0005486759.ooo.test podman[68193]: 2025-10-14 08:26:00.644519279 +0000 UTC m=+0.058931972 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, version=17.1.9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:26:00 np0005486759.ooo.test podman[68193]: 2025-10-14 08:26:00.81198334 +0000 UTC m=+0.226396133 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64)
Oct 14 08:26:00 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:26:01 np0005486759.ooo.test sudo[68191]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:01 np0005486759.ooo.test sudo[68230]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2089emjz/privsep.sock
Oct 14 08:26:01 np0005486759.ooo.test sudo[68230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:02 np0005486759.ooo.test sudo[68230]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:26:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:26:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:26:02 np0005486759.ooo.test systemd[1]: tmp-crun.IUdukE.mount: Deactivated successfully.
Oct 14 08:26:02 np0005486759.ooo.test podman[68237]: 2025-10-14 08:26:02.176171841 +0000 UTC m=+0.078888751 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, architecture=x86_64, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, container_name=nova_compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12)
Oct 14 08:26:02 np0005486759.ooo.test podman[68236]: 2025-10-14 08:26:02.242172581 +0000 UTC m=+0.144782788 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, version=17.1.9, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=iscsid, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:26:02 np0005486759.ooo.test podman[68238]: 2025-10-14 08:26:02.254016479 +0000 UTC m=+0.148505364 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.9, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-collectd-container, tcib_managed=true, release=2, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:26:02 np0005486759.ooo.test podman[68236]: 2025-10-14 08:26:02.277320342 +0000 UTC m=+0.179930519 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:26:02 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:26:02 np0005486759.ooo.test podman[68238]: 2025-10-14 08:26:02.286907961 +0000 UTC m=+0.181396886 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, config_id=tripleo_step3, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, tcib_managed=true, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct 14 08:26:02 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:26:02 np0005486759.ooo.test sudo[68300]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpujrahelz/privsep.sock
Oct 14 08:26:02 np0005486759.ooo.test sudo[68300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:02 np0005486759.ooo.test podman[68237]: 2025-10-14 08:26:02.332267089 +0000 UTC m=+0.234983949 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:26:02 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:26:02 np0005486759.ooo.test sudo[68300]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:03 np0005486759.ooo.test sudo[68311]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3psfhc0e/privsep.sock
Oct 14 08:26:03 np0005486759.ooo.test sudo[68311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:03 np0005486759.ooo.test systemd[1]: tmp-crun.0dUX21.mount: Deactivated successfully.
Oct 14 08:26:03 np0005486759.ooo.test sudo[68311]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:04 np0005486759.ooo.test sudo[68328]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf75p82vc/privsep.sock
Oct 14 08:26:04 np0005486759.ooo.test sudo[68328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:04 np0005486759.ooo.test sudo[68328]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:04 np0005486759.ooo.test sudo[68339]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxm73jkln/privsep.sock
Oct 14 08:26:04 np0005486759.ooo.test sudo[68339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:05 np0005486759.ooo.test sudo[68339]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:05 np0005486759.ooo.test sudo[68350]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf5zk0doi/privsep.sock
Oct 14 08:26:05 np0005486759.ooo.test sudo[68350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:06 np0005486759.ooo.test sudo[68350]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:06 np0005486759.ooo.test sudo[68361]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm0_djzcp/privsep.sock
Oct 14 08:26:06 np0005486759.ooo.test sudo[68361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:07 np0005486759.ooo.test sudo[68361]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:07 np0005486759.ooo.test sudo[68372]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2rqufzdj/privsep.sock
Oct 14 08:26:07 np0005486759.ooo.test sudo[68372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:08 np0005486759.ooo.test sudo[68372]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:08 np0005486759.ooo.test sudo[68383]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmx7fzv0s/privsep.sock
Oct 14 08:26:08 np0005486759.ooo.test sudo[68383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:08 np0005486759.ooo.test sudo[68383]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:09 np0005486759.ooo.test sudo[68400]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9g4xmy4a/privsep.sock
Oct 14 08:26:09 np0005486759.ooo.test sudo[68400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:09 np0005486759.ooo.test sudo[68400]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:10 np0005486759.ooo.test sudo[68411]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6fvl7n7e/privsep.sock
Oct 14 08:26:10 np0005486759.ooo.test sudo[68411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:10 np0005486759.ooo.test sudo[68411]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:10 np0005486759.ooo.test sudo[68422]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp43p0zvai/privsep.sock
Oct 14 08:26:10 np0005486759.ooo.test sudo[68422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:11 np0005486759.ooo.test sudo[68422]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:11 np0005486759.ooo.test sudo[68433]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj0sn292s/privsep.sock
Oct 14 08:26:11 np0005486759.ooo.test sudo[68433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:12 np0005486759.ooo.test sudo[68433]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:12 np0005486759.ooo.test sudo[68444]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpry03cs3w/privsep.sock
Oct 14 08:26:12 np0005486759.ooo.test sudo[68444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:13 np0005486759.ooo.test sudo[68444]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:13 np0005486759.ooo.test sudo[68455]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn4zrib9i/privsep.sock
Oct 14 08:26:13 np0005486759.ooo.test sudo[68455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:14 np0005486759.ooo.test sudo[68455]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:14 np0005486759.ooo.test sudo[68468]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiq0mrb1d/privsep.sock
Oct 14 08:26:14 np0005486759.ooo.test sudo[68468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:15 np0005486759.ooo.test sudo[68468]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:15 np0005486759.ooo.test sudo[68483]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_3in08wq/privsep.sock
Oct 14 08:26:15 np0005486759.ooo.test sudo[68483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:15 np0005486759.ooo.test sudo[68483]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:26:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:26:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:26:16 np0005486759.ooo.test systemd[1]: tmp-crun.GlYtps.mount: Deactivated successfully.
Oct 14 08:26:16 np0005486759.ooo.test podman[68490]: 2025-10-14 08:26:16.014912398 +0000 UTC m=+0.078379585 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:26:16 np0005486759.ooo.test podman[68491]: 2025-10-14 08:26:15.993022668 +0000 UTC m=+0.057888258 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 08:26:16 np0005486759.ooo.test podman[68489]: 2025-10-14 08:26:16.056273303 +0000 UTC m=+0.122790235 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1)
Oct 14 08:26:16 np0005486759.ooo.test podman[68491]: 2025-10-14 08:26:16.07837496 +0000 UTC m=+0.143240570 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vendor=Red Hat, Inc.)
Oct 14 08:26:16 np0005486759.ooo.test podman[68491]: unhealthy
Oct 14 08:26:16 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:26:16 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:26:16 np0005486759.ooo.test podman[68489]: 2025-10-14 08:26:16.092242001 +0000 UTC m=+0.158758933 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, container_name=logrotate_crond, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container)
Oct 14 08:26:16 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:26:16 np0005486759.ooo.test podman[68490]: 2025-10-14 08:26:16.134264186 +0000 UTC m=+0.197731453 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, release=1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public)
Oct 14 08:26:16 np0005486759.ooo.test podman[68490]: unhealthy
Oct 14 08:26:16 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:26:16 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:26:16 np0005486759.ooo.test sudo[68549]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyxfsom56/privsep.sock
Oct 14 08:26:16 np0005486759.ooo.test sudo[68549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:16 np0005486759.ooo.test sudo[68549]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:17 np0005486759.ooo.test sudo[68560]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm1vpefgl/privsep.sock
Oct 14 08:26:17 np0005486759.ooo.test sudo[68560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:17 np0005486759.ooo.test sudo[68560]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:18 np0005486759.ooo.test sudo[68571]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8cxglc47/privsep.sock
Oct 14 08:26:18 np0005486759.ooo.test sudo[68571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:18 np0005486759.ooo.test sudo[68571]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:18 np0005486759.ooo.test sudo[68582]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp106a87zv/privsep.sock
Oct 14 08:26:18 np0005486759.ooo.test sudo[68582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:19 np0005486759.ooo.test sudo[68582]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:19 np0005486759.ooo.test sudo[68595]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ljooal8/privsep.sock
Oct 14 08:26:19 np0005486759.ooo.test sudo[68595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:26:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:26:20 np0005486759.ooo.test sudo[68595]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:20 np0005486759.ooo.test podman[68604]: 2025-10-14 08:26:20.441985779 +0000 UTC m=+0.067122656 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64)
Oct 14 08:26:20 np0005486759.ooo.test podman[68604]: 2025-10-14 08:26:20.468379439 +0000 UTC m=+0.093516406 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:26:20 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:26:20 np0005486759.ooo.test podman[68603]: 2025-10-14 08:26:20.545892846 +0000 UTC m=+0.172273461 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, version=17.1.9, config_id=tripleo_step4, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:26:20 np0005486759.ooo.test podman[68603]: 2025-10-14 08:26:20.591505483 +0000 UTC m=+0.217886168 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:26:20 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:26:20 np0005486759.ooo.test sudo[68659]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpykjxqdck/privsep.sock
Oct 14 08:26:20 np0005486759.ooo.test sudo[68659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:21 np0005486759.ooo.test sudo[68659]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:26:21 np0005486759.ooo.test podman[68664]: 2025-10-14 08:26:21.373692887 +0000 UTC m=+0.072304587 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, version=17.1.9, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, 
distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git)
Oct 14 08:26:21 np0005486759.ooo.test sudo[68691]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp18sm_bwb/privsep.sock
Oct 14 08:26:21 np0005486759.ooo.test sudo[68691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:21 np0005486759.ooo.test podman[68664]: 2025-10-14 08:26:21.76949029 +0000 UTC m=+0.468101940 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, container_name=nova_migration_target, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Oct 14 08:26:21 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:26:22 np0005486759.ooo.test sudo[68691]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:22 np0005486759.ooo.test sudo[68703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmperb_p93m/privsep.sock
Oct 14 08:26:22 np0005486759.ooo.test sudo[68703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:23 np0005486759.ooo.test sudo[68703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:23 np0005486759.ooo.test sudo[68714]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeh4i9zjd/privsep.sock
Oct 14 08:26:23 np0005486759.ooo.test sudo[68714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:24 np0005486759.ooo.test sudo[68714]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:24 np0005486759.ooo.test sudo[68725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwhxx90ef/privsep.sock
Oct 14 08:26:24 np0005486759.ooo.test sudo[68725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:24 np0005486759.ooo.test sudo[68725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:25 np0005486759.ooo.test sudo[68736]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0pr5rs3c/privsep.sock
Oct 14 08:26:25 np0005486759.ooo.test sudo[68736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:25 np0005486759.ooo.test sudo[68736]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:25 np0005486759.ooo.test sudo[68753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppkr982wd/privsep.sock
Oct 14 08:26:25 np0005486759.ooo.test sudo[68753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:26 np0005486759.ooo.test sudo[68753]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:26 np0005486759.ooo.test sudo[68764]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdpnkilu1/privsep.sock
Oct 14 08:26:26 np0005486759.ooo.test sudo[68764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:27 np0005486759.ooo.test sudo[68764]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:27 np0005486759.ooo.test sudo[68775]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpat1qs756/privsep.sock
Oct 14 08:26:27 np0005486759.ooo.test sudo[68775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:28 np0005486759.ooo.test sudo[68775]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:28 np0005486759.ooo.test sudo[68786]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo11mzzzm/privsep.sock
Oct 14 08:26:28 np0005486759.ooo.test sudo[68786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:29 np0005486759.ooo.test sudo[68786]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:29 np0005486759.ooo.test sudo[68797]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi1gaegy4/privsep.sock
Oct 14 08:26:29 np0005486759.ooo.test sudo[68797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:30 np0005486759.ooo.test sudo[68797]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:30 np0005486759.ooo.test sudo[68808]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_7axl7qg/privsep.sock
Oct 14 08:26:30 np0005486759.ooo.test sudo[68808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:31 np0005486759.ooo.test sudo[68808]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:26:31 np0005486759.ooo.test podman[68819]: 2025-10-14 08:26:31.110082419 +0000 UTC m=+0.054497923 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, 
com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9)
Oct 14 08:26:31 np0005486759.ooo.test podman[68819]: 2025-10-14 08:26:31.275205118 +0000 UTC m=+0.219620672 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd)
Oct 14 08:26:31 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:26:31 np0005486759.ooo.test sudo[68854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5gddzxj0/privsep.sock
Oct 14 08:26:31 np0005486759.ooo.test sudo[68854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:31 np0005486759.ooo.test sudo[68854]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:32 np0005486759.ooo.test sudo[68865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6g7ercvc/privsep.sock
Oct 14 08:26:32 np0005486759.ooo.test sudo[68865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:26:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:26:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:26:32 np0005486759.ooo.test systemd[1]: tmp-crun.bgd2df.mount: Deactivated successfully.
Oct 14 08:26:32 np0005486759.ooo.test podman[68868]: 2025-10-14 08:26:32.51292924 +0000 UTC m=+0.135715375 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1, vcs-type=git, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid)
Oct 14 08:26:32 np0005486759.ooo.test podman[68868]: 2025-10-14 08:26:32.524433558 +0000 UTC m=+0.147219693 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Oct 14 08:26:32 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:26:32 np0005486759.ooo.test podman[68870]: 2025-10-14 08:26:32.490109362 +0000 UTC m=+0.106960163 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, managed_by=tripleo_ansible, release=2, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, container_name=collectd, architecture=x86_64)
Oct 14 08:26:32 np0005486759.ooo.test podman[68870]: 2025-10-14 08:26:32.574571485 +0000 UTC m=+0.191422316 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:26:32 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:26:32 np0005486759.ooo.test podman[68869]: 2025-10-14 08:26:32.629232153 +0000 UTC m=+0.248710696 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, version=17.1.9, architecture=x86_64)
Oct 14 08:26:32 np0005486759.ooo.test podman[68869]: 2025-10-14 08:26:32.654307242 +0000 UTC m=+0.273785735 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5)
Oct 14 08:26:32 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:26:32 np0005486759.ooo.test sudo[68865]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:33 np0005486759.ooo.test sudo[68936]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpubm3ljjr/privsep.sock
Oct 14 08:26:33 np0005486759.ooo.test sudo[68936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:33 np0005486759.ooo.test sudo[68936]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:33 np0005486759.ooo.test sudo[68947]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxkxwx4p5/privsep.sock
Oct 14 08:26:33 np0005486759.ooo.test sudo[68947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:34 np0005486759.ooo.test sudo[68947]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:34 np0005486759.ooo.test sudo[68958]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp06ihwb2z/privsep.sock
Oct 14 08:26:34 np0005486759.ooo.test sudo[68958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:35 np0005486759.ooo.test sudo[68958]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:35 np0005486759.ooo.test sudo[68969]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpytcbvapa/privsep.sock
Oct 14 08:26:35 np0005486759.ooo.test sudo[68969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:36 np0005486759.ooo.test sudo[68969]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:36 np0005486759.ooo.test sudo[68986]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbxv2go79/privsep.sock
Oct 14 08:26:36 np0005486759.ooo.test sudo[68986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:37 np0005486759.ooo.test sudo[68986]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:37 np0005486759.ooo.test sudo[68997]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdtyhmspb/privsep.sock
Oct 14 08:26:37 np0005486759.ooo.test sudo[68997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:37 np0005486759.ooo.test sudo[68997]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:38 np0005486759.ooo.test sudo[69008]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm9whhdvc/privsep.sock
Oct 14 08:26:38 np0005486759.ooo.test sudo[69008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:38 np0005486759.ooo.test sudo[69008]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:39 np0005486759.ooo.test sudo[69019]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiarz_714/privsep.sock
Oct 14 08:26:39 np0005486759.ooo.test sudo[69019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:39 np0005486759.ooo.test sudo[69019]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:39 np0005486759.ooo.test sudo[69030]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgr7zit47/privsep.sock
Oct 14 08:26:39 np0005486759.ooo.test sudo[69030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:40 np0005486759.ooo.test sudo[69030]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:40 np0005486759.ooo.test sudo[69041]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwqt527z6/privsep.sock
Oct 14 08:26:40 np0005486759.ooo.test sudo[69041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:41 np0005486759.ooo.test sudo[69041]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:41 np0005486759.ooo.test sudo[69058]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8wrldr_5/privsep.sock
Oct 14 08:26:41 np0005486759.ooo.test sudo[69058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:42 np0005486759.ooo.test sudo[69058]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:42 np0005486759.ooo.test sudo[69069]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg784zoya/privsep.sock
Oct 14 08:26:42 np0005486759.ooo.test sudo[69069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:43 np0005486759.ooo.test sudo[69069]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:43 np0005486759.ooo.test sudo[69080]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphetve6qz/privsep.sock
Oct 14 08:26:43 np0005486759.ooo.test sudo[69080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:44 np0005486759.ooo.test sudo[69080]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:44 np0005486759.ooo.test sudo[69091]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplq0xnml8/privsep.sock
Oct 14 08:26:44 np0005486759.ooo.test sudo[69091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:45 np0005486759.ooo.test sudo[69091]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:45 np0005486759.ooo.test sudo[69102]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuxy9f5pn/privsep.sock
Oct 14 08:26:45 np0005486759.ooo.test sudo[69102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:45 np0005486759.ooo.test sudo[69102]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:46 np0005486759.ooo.test sudo[69113]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpozniqrl5/privsep.sock
Oct 14 08:26:46 np0005486759.ooo.test sudo[69113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: tmp-crun.HDvyqg.mount: Deactivated successfully.
Oct 14 08:26:46 np0005486759.ooo.test podman[69116]: 2025-10-14 08:26:46.184873208 +0000 UTC m=+0.055177365 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, build-date=2025-07-21T15:29:47, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc.)
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:26:46 np0005486759.ooo.test podman[69116]: 2025-10-14 08:26:46.211325979 +0000 UTC m=+0.081630216 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:26:46 np0005486759.ooo.test podman[69116]: unhealthy
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: tmp-crun.a2mBvS.mount: Deactivated successfully.
Oct 14 08:26:46 np0005486759.ooo.test podman[69115]: 2025-10-14 08:26:46.265052848 +0000 UTC m=+0.136065017 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, release=1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Oct 14 08:26:46 np0005486759.ooo.test podman[69145]: 2025-10-14 08:26:46.269118345 +0000 UTC m=+0.059743438 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc.)
Oct 14 08:26:46 np0005486759.ooo.test podman[69145]: 2025-10-14 08:26:46.280600521 +0000 UTC m=+0.071225614 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, version=17.1.9, 
com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:26:46 np0005486759.ooo.test podman[69145]: unhealthy
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:26:46 np0005486759.ooo.test podman[69115]: 2025-10-14 08:26:46.297735093 +0000 UTC m=+0.168747292 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:07:52, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-cron)
Oct 14 08:26:46 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:26:46 np0005486759.ooo.test sudo[69113]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:46 np0005486759.ooo.test sudo[69187]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4sfn1uta/privsep.sock
Oct 14 08:26:46 np0005486759.ooo.test sudo[69187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:47 np0005486759.ooo.test sudo[69187]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:47 np0005486759.ooo.test sudo[69199]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi15udec3/privsep.sock
Oct 14 08:26:47 np0005486759.ooo.test sudo[69199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:48 np0005486759.ooo.test sudo[69199]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:48 np0005486759.ooo.test sudo[69210]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoj97kodo/privsep.sock
Oct 14 08:26:48 np0005486759.ooo.test sudo[69210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:49 np0005486759.ooo.test sudo[69210]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:49 np0005486759.ooo.test sudo[69221]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvpkswg46/privsep.sock
Oct 14 08:26:49 np0005486759.ooo.test sudo[69221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:50 np0005486759.ooo.test sudo[69221]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:50 np0005486759.ooo.test sudo[69232]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc2urr42s/privsep.sock
Oct 14 08:26:50 np0005486759.ooo.test sudo[69232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:51 np0005486759.ooo.test sudo[69232]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:26:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:26:51 np0005486759.ooo.test systemd[1]: tmp-crun.wGiKSS.mount: Deactivated successfully.
Oct 14 08:26:51 np0005486759.ooo.test podman[69237]: 2025-10-14 08:26:51.095227928 +0000 UTC m=+0.058884029 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, release=1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12)
Oct 14 08:26:51 np0005486759.ooo.test podman[69239]: 2025-10-14 08:26:51.182638594 +0000 UTC m=+0.140014480 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-type=git, architecture=x86_64)
Oct 14 08:26:51 np0005486759.ooo.test podman[69237]: 2025-10-14 08:26:51.202602594 +0000 UTC m=+0.166258755 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, managed_by=tripleo_ansible)
Oct 14 08:26:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:26:51 np0005486759.ooo.test podman[69239]: 2025-10-14 08:26:51.254230817 +0000 UTC m=+0.211606713 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, release=1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, version=17.1.9, build-date=2025-07-21T13:28:44, distribution-scope=public)
Oct 14 08:26:51 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:26:51 np0005486759.ooo.test sudo[69288]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1p_l8g3l/privsep.sock
Oct 14 08:26:51 np0005486759.ooo.test sudo[69288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:51 np0005486759.ooo.test sudo[69288]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:26:51 np0005486759.ooo.test podman[69293]: 2025-10-14 08:26:51.917664993 +0000 UTC m=+0.069740337 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_migration_target, batch=17.1_20250721.1, config_id=tripleo_step4)
Oct 14 08:26:52 np0005486759.ooo.test sudo[69321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd9xp1rdl/privsep.sock
Oct 14 08:26:52 np0005486759.ooo.test sudo[69321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:52 np0005486759.ooo.test podman[69293]: 2025-10-14 08:26:52.307258233 +0000 UTC m=+0.459333617 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, release=1, version=17.1.9, maintainer=OpenStack TripleO Team)
Oct 14 08:26:52 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:26:52 np0005486759.ooo.test sudo[69321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:52 np0005486759.ooo.test sudo[69339]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr_vpo9t1/privsep.sock
Oct 14 08:26:52 np0005486759.ooo.test sudo[69339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:53 np0005486759.ooo.test sudo[69339]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:53 np0005486759.ooo.test sudo[69350]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvw6qssyt/privsep.sock
Oct 14 08:26:53 np0005486759.ooo.test sudo[69350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:54 np0005486759.ooo.test sudo[69350]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:54 np0005486759.ooo.test sudo[69361]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvom8ul3n/privsep.sock
Oct 14 08:26:54 np0005486759.ooo.test sudo[69361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:55 np0005486759.ooo.test sudo[69361]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:55 np0005486759.ooo.test sudo[69372]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptpbqtn1z/privsep.sock
Oct 14 08:26:55 np0005486759.ooo.test sudo[69372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:56 np0005486759.ooo.test sudo[69372]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:56 np0005486759.ooo.test sudo[69383]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb3unktnd/privsep.sock
Oct 14 08:26:56 np0005486759.ooo.test sudo[69383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:56 np0005486759.ooo.test sudo[69383]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:57 np0005486759.ooo.test sudo[69394]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy8kvgqe7/privsep.sock
Oct 14 08:26:57 np0005486759.ooo.test sudo[69394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:57 np0005486759.ooo.test sudo[69394]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:58 np0005486759.ooo.test sudo[69411]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_0ss8nwd/privsep.sock
Oct 14 08:26:58 np0005486759.ooo.test sudo[69411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:58 np0005486759.ooo.test sudo[69411]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:59 np0005486759.ooo.test sudo[69422]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6d0i_kcx/privsep.sock
Oct 14 08:26:59 np0005486759.ooo.test sudo[69422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:26:59 np0005486759.ooo.test sudo[69422]: pam_unix(sudo:session): session closed for user root
Oct 14 08:26:59 np0005486759.ooo.test sudo[69433]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphj08_19e/privsep.sock
Oct 14 08:26:59 np0005486759.ooo.test sudo[69433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:00 np0005486759.ooo.test sudo[69433]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:00 np0005486759.ooo.test sudo[69444]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphc063w4t/privsep.sock
Oct 14 08:27:00 np0005486759.ooo.test sudo[69444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:01 np0005486759.ooo.test sudo[69444]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:27:01 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:27:01 np0005486759.ooo.test recover_tripleo_nova_virtqemud[69457]: 47951
Oct 14 08:27:01 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:27:01 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:27:01 np0005486759.ooo.test podman[69448]: 2025-10-14 08:27:01.417332013 +0000 UTC m=+0.085758515 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd)
Oct 14 08:27:01 np0005486759.ooo.test sudo[69484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp920suxyx/privsep.sock
Oct 14 08:27:01 np0005486759.ooo.test sudo[69484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:01 np0005486759.ooo.test podman[69448]: 2025-10-14 08:27:01.606321264 +0000 UTC m=+0.274747686 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, distribution-scope=public, release=1)
Oct 14 08:27:01 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:27:02 np0005486759.ooo.test sudo[69484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:02 np0005486759.ooo.test sudo[69498]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppygfhmhe/privsep.sock
Oct 14 08:27:02 np0005486759.ooo.test sudo[69498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:03 np0005486759.ooo.test sudo[69498]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:27:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:27:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:27:03 np0005486759.ooo.test podman[69510]: 2025-10-14 08:27:03.335405027 +0000 UTC m=+0.082642758 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, vendor=Red Hat, Inc.)
Oct 14 08:27:03 np0005486759.ooo.test podman[69510]: 2025-10-14 08:27:03.35580904 +0000 UTC m=+0.103046791 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, version=17.1.9, batch=17.1_20250721.1, architecture=x86_64)
Oct 14 08:27:03 np0005486759.ooo.test podman[69509]: 2025-10-14 08:27:03.365892464 +0000 UTC m=+0.116671005 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, 
vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Oct 14 08:27:03 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:27:03 np0005486759.ooo.test podman[69509]: 2025-10-14 08:27:03.397794454 +0000 UTC m=+0.148572945 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, 
container_name=iscsid, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:27:03 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:27:03 np0005486759.ooo.test podman[69511]: 2025-10-14 08:27:03.451878394 +0000 UTC m=+0.196082981 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, config_id=tripleo_step3, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:27:03 np0005486759.ooo.test sudo[69572]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxbhawxgh/privsep.sock
Oct 14 08:27:03 np0005486759.ooo.test sudo[69572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:03 np0005486759.ooo.test podman[69511]: 2025-10-14 08:27:03.484566489 +0000 UTC m=+0.228771096 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, build-date=2025-07-21T13:04:03, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, release=2, config_id=tripleo_step3, version=17.1.9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible)
Oct 14 08:27:03 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:27:04 np0005486759.ooo.test sudo[69572]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:04 np0005486759.ooo.test sudo[69583]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi___k_7w/privsep.sock
Oct 14 08:27:04 np0005486759.ooo.test sudo[69583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:04 np0005486759.ooo.test sudo[69583]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:05 np0005486759.ooo.test sudo[69594]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp85ualvxn/privsep.sock
Oct 14 08:27:05 np0005486759.ooo.test sudo[69594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:05 np0005486759.ooo.test sudo[69594]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:06 np0005486759.ooo.test sudo[69605]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp2buo7rz/privsep.sock
Oct 14 08:27:06 np0005486759.ooo.test sudo[69605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:06 np0005486759.ooo.test sudo[69605]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:06 np0005486759.ooo.test sudo[69616]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpazg3620p/privsep.sock
Oct 14 08:27:06 np0005486759.ooo.test sudo[69616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:07 np0005486759.ooo.test sudo[69616]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:07 np0005486759.ooo.test sudo[69627]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1ak6vu_h/privsep.sock
Oct 14 08:27:07 np0005486759.ooo.test sudo[69627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:08 np0005486759.ooo.test sudo[69627]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:08 np0005486759.ooo.test sudo[69644]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpad3lm1lc/privsep.sock
Oct 14 08:27:08 np0005486759.ooo.test sudo[69644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:09 np0005486759.ooo.test sudo[69644]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:09 np0005486759.ooo.test sudo[69655]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvi3cagml/privsep.sock
Oct 14 08:27:09 np0005486759.ooo.test sudo[69655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:10 np0005486759.ooo.test sudo[69655]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:10 np0005486759.ooo.test sudo[69666]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbwka3ni7/privsep.sock
Oct 14 08:27:10 np0005486759.ooo.test sudo[69666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:11 np0005486759.ooo.test sudo[69666]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:11 np0005486759.ooo.test sudo[69677]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzsulktd7/privsep.sock
Oct 14 08:27:11 np0005486759.ooo.test sudo[69677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:12 np0005486759.ooo.test sudo[69677]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:12 np0005486759.ooo.test sudo[69688]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpovp98u77/privsep.sock
Oct 14 08:27:12 np0005486759.ooo.test sudo[69688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:13 np0005486759.ooo.test sudo[69688]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:13 np0005486759.ooo.test sudo[69699]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa8cmsjni/privsep.sock
Oct 14 08:27:13 np0005486759.ooo.test sudo[69699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:14 np0005486759.ooo.test sudo[69699]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:14 np0005486759.ooo.test sudo[69716]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzcft_ej3/privsep.sock
Oct 14 08:27:14 np0005486759.ooo.test sudo[69716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:14 np0005486759.ooo.test sudo[69716]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:15 np0005486759.ooo.test sudo[69727]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcfk8wuq9/privsep.sock
Oct 14 08:27:15 np0005486759.ooo.test sudo[69727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:15 np0005486759.ooo.test sudo[69727]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:16 np0005486759.ooo.test sudo[69738]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpftr70c3z/privsep.sock
Oct 14 08:27:16 np0005486759.ooo.test sudo[69738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: tmp-crun.exw6X9.mount: Deactivated successfully.
Oct 14 08:27:16 np0005486759.ooo.test podman[69741]: 2025-10-14 08:27:16.442926637 +0000 UTC m=+0.069882731 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12)
Oct 14 08:27:16 np0005486759.ooo.test podman[69742]: 2025-10-14 08:27:16.478532911 +0000 UTC m=+0.097345782 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9)
Oct 14 08:27:16 np0005486759.ooo.test podman[69743]: 2025-10-14 08:27:16.484746335 +0000 UTC m=+0.103674129 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:27:16 np0005486759.ooo.test podman[69742]: 2025-10-14 08:27:16.498105519 +0000 UTC m=+0.116918440 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_compute, release=1, version=17.1.9, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33)
Oct 14 08:27:16 np0005486759.ooo.test podman[69742]: unhealthy
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:27:16 np0005486759.ooo.test podman[69743]: 2025-10-14 08:27:16.519689299 +0000 UTC m=+0.138617123 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:27:16 np0005486759.ooo.test podman[69743]: unhealthy
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:27:16 np0005486759.ooo.test podman[69741]: 2025-10-14 08:27:16.5722174 +0000 UTC m=+0.199173564 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2025-07-21T13:07:52)
Oct 14 08:27:16 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:27:16 np0005486759.ooo.test sudo[69738]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:16 np0005486759.ooo.test sudo[69802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz92jnnbg/privsep.sock
Oct 14 08:27:16 np0005486759.ooo.test sudo[69802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:17 np0005486759.ooo.test sudo[69802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:17 np0005486759.ooo.test sudo[69813]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1u0aieh4/privsep.sock
Oct 14 08:27:17 np0005486759.ooo.test sudo[69813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:18 np0005486759.ooo.test sudo[69813]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:18 np0005486759.ooo.test sudo[69824]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgw2ee9e9/privsep.sock
Oct 14 08:27:18 np0005486759.ooo.test sudo[69824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:19 np0005486759.ooo.test sudo[69824]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:19 np0005486759.ooo.test sudo[69841]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpypnzk4q3/privsep.sock
Oct 14 08:27:19 np0005486759.ooo.test sudo[69841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:20 np0005486759.ooo.test sudo[69841]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:20 np0005486759.ooo.test sudo[69852]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpamxlo1pr/privsep.sock
Oct 14 08:27:20 np0005486759.ooo.test sudo[69852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:21 np0005486759.ooo.test sudo[69852]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:27:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:27:21 np0005486759.ooo.test systemd[1]: tmp-crun.akIiCC.mount: Deactivated successfully.
Oct 14 08:27:21 np0005486759.ooo.test podman[69857]: 2025-10-14 08:27:21.353896261 +0000 UTC m=+0.091576675 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T16:28:53, 
release=1, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, container_name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 08:27:21 np0005486759.ooo.test podman[69857]: 2025-10-14 08:27:21.388229286 +0000 UTC m=+0.125909680 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T16:28:53, 
vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, architecture=x86_64)
Oct 14 08:27:21 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:27:21 np0005486759.ooo.test podman[69874]: 2025-10-14 08:27:21.430662923 +0000 UTC m=+0.076787634 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20250721.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, version=17.1.9, release=1, container_name=ovn_controller, io.buildah.version=1.33.12, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:27:21 np0005486759.ooo.test podman[69874]: 2025-10-14 08:27:21.479182749 +0000 UTC m=+0.125307430 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller)
Oct 14 08:27:21 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:27:21 np0005486759.ooo.test sudo[69911]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzao9edur/privsep.sock
Oct 14 08:27:21 np0005486759.ooo.test sudo[69911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:22 np0005486759.ooo.test sudo[69911]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:27:22 np0005486759.ooo.test sudo[69936]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsnettjmz/privsep.sock
Oct 14 08:27:22 np0005486759.ooo.test podman[69918]: 2025-10-14 08:27:22.457294632 +0000 UTC m=+0.080484740 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T14:48:37, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, config_id=tripleo_step4, 
name=rhosp17/openstack-nova-compute, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:27:22 np0005486759.ooo.test sudo[69936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:22 np0005486759.ooo.test podman[69918]: 2025-10-14 08:27:22.828324409 +0000 UTC m=+0.451514557 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.9, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T14:48:37)
Oct 14 08:27:22 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:27:23 np0005486759.ooo.test sudo[69936]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:23 np0005486759.ooo.test sudo[69954]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_yags89n/privsep.sock
Oct 14 08:27:23 np0005486759.ooo.test sudo[69954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:23 np0005486759.ooo.test sudo[69954]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:24 np0005486759.ooo.test sudo[69965]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf9y_hepy/privsep.sock
Oct 14 08:27:24 np0005486759.ooo.test sudo[69965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:24 np0005486759.ooo.test sudo[69965]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:25 np0005486759.ooo.test sudo[69982]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvaje1vvy/privsep.sock
Oct 14 08:27:25 np0005486759.ooo.test sudo[69982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:25 np0005486759.ooo.test sudo[69982]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:25 np0005486759.ooo.test sudo[69993]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx9h4fr1y/privsep.sock
Oct 14 08:27:25 np0005486759.ooo.test sudo[69993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:26 np0005486759.ooo.test sudo[69993]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:26 np0005486759.ooo.test sudo[70004]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp91_7znsm/privsep.sock
Oct 14 08:27:26 np0005486759.ooo.test sudo[70004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:27 np0005486759.ooo.test sudo[70004]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:27 np0005486759.ooo.test sudo[70015]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphtsceg2w/privsep.sock
Oct 14 08:27:27 np0005486759.ooo.test sudo[70015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:28 np0005486759.ooo.test sudo[70015]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:28 np0005486759.ooo.test sudo[70026]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsywaaftk/privsep.sock
Oct 14 08:27:28 np0005486759.ooo.test sudo[70026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:29 np0005486759.ooo.test sudo[70026]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:29 np0005486759.ooo.test sudo[70037]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp232gzao2/privsep.sock
Oct 14 08:27:29 np0005486759.ooo.test sudo[70037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:30 np0005486759.ooo.test sudo[70037]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:30 np0005486759.ooo.test sudo[70053]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcnhm1uj3/privsep.sock
Oct 14 08:27:30 np0005486759.ooo.test sudo[70053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:30 np0005486759.ooo.test sudo[70053]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:31 np0005486759.ooo.test sudo[70065]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph0rtwt8c/privsep.sock
Oct 14 08:27:31 np0005486759.ooo.test sudo[70065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:31 np0005486759.ooo.test sudo[70065]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:27:31 np0005486759.ooo.test podman[70071]: 2025-10-14 08:27:31.83822633 +0000 UTC m=+0.056842836 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team)
Oct 14 08:27:31 np0005486759.ooo.test sudo[70103]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9oucq0a9/privsep.sock
Oct 14 08:27:31 np0005486759.ooo.test sudo[70103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:32 np0005486759.ooo.test podman[70071]: 2025-10-14 08:27:32.021997625 +0000 UTC m=+0.240614141 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, distribution-scope=public, vcs-type=git, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., release=1)
Oct 14 08:27:32 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:27:32 np0005486759.ooo.test sudo[70103]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:32 np0005486759.ooo.test sudo[70114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbc5ofp9v/privsep.sock
Oct 14 08:27:32 np0005486759.ooo.test sudo[70114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:33 np0005486759.ooo.test sudo[70114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:27:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:27:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:27:33 np0005486759.ooo.test podman[70122]: 2025-10-14 08:27:33.626359537 +0000 UTC m=+0.092750431 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, 
com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:27:33 np0005486759.ooo.test podman[70122]: 2025-10-14 08:27:33.636529072 +0000 UTC m=+0.102919926 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, 
name=rhosp17/openstack-collectd, config_id=tripleo_step3, release=2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, tcib_managed=true, container_name=collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:27:33 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:27:33 np0005486759.ooo.test podman[70119]: 2025-10-14 08:27:33.729061104 +0000 UTC m=+0.197957415 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, version=17.1.9, vendor=Red Hat, Inc.)
Oct 14 08:27:33 np0005486759.ooo.test podman[70119]: 2025-10-14 08:27:33.741517272 +0000 UTC m=+0.210413593 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, release=1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:27:33 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:27:33 np0005486759.ooo.test sudo[70175]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo8bvg5s5/privsep.sock
Oct 14 08:27:33 np0005486759.ooo.test sudo[70175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:33 np0005486759.ooo.test podman[70121]: 2025-10-14 08:27:33.832534976 +0000 UTC m=+0.301154848 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:27:33 np0005486759.ooo.test podman[70121]: 2025-10-14 08:27:33.883067615 +0000 UTC m=+0.351687557 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.33.12, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:27:33 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:27:34 np0005486759.ooo.test sudo[70175]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:34 np0005486759.ooo.test sudo[70201]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuzaaas16/privsep.sock
Oct 14 08:27:34 np0005486759.ooo.test sudo[70201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:35 np0005486759.ooo.test sudo[70201]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:35 np0005486759.ooo.test sudo[70212]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnbmbwyu9/privsep.sock
Oct 14 08:27:35 np0005486759.ooo.test sudo[70212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:36 np0005486759.ooo.test sudo[70212]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:36 np0005486759.ooo.test sudo[70229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfft9hfl1/privsep.sock
Oct 14 08:27:36 np0005486759.ooo.test sudo[70229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:37 np0005486759.ooo.test sudo[70229]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:37 np0005486759.ooo.test sudo[70240]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqe90hpkr/privsep.sock
Oct 14 08:27:37 np0005486759.ooo.test sudo[70240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:37 np0005486759.ooo.test sudo[70240]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:38 np0005486759.ooo.test sudo[70251]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuiu567y_/privsep.sock
Oct 14 08:27:38 np0005486759.ooo.test sudo[70251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:38 np0005486759.ooo.test sudo[70251]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:39 np0005486759.ooo.test sudo[70262]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpua4hz7tz/privsep.sock
Oct 14 08:27:39 np0005486759.ooo.test sudo[70262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:39 np0005486759.ooo.test sudo[70262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:39 np0005486759.ooo.test sudo[70273]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprprvgpg2/privsep.sock
Oct 14 08:27:39 np0005486759.ooo.test sudo[70273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:40 np0005486759.ooo.test sudo[70273]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:40 np0005486759.ooo.test sudo[70284]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfumfggdg/privsep.sock
Oct 14 08:27:40 np0005486759.ooo.test sudo[70284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:41 np0005486759.ooo.test sudo[70284]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:41 np0005486759.ooo.test sudo[70301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpczliw9p5/privsep.sock
Oct 14 08:27:41 np0005486759.ooo.test sudo[70301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:42 np0005486759.ooo.test sudo[70301]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:42 np0005486759.ooo.test sudo[70312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmx18vyva/privsep.sock
Oct 14 08:27:42 np0005486759.ooo.test sudo[70312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:43 np0005486759.ooo.test sudo[70312]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:43 np0005486759.ooo.test sudo[70323]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpusrpcnea/privsep.sock
Oct 14 08:27:43 np0005486759.ooo.test sudo[70323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:44 np0005486759.ooo.test sudo[70323]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:44 np0005486759.ooo.test sudo[70334]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpimhpuj4e/privsep.sock
Oct 14 08:27:44 np0005486759.ooo.test sudo[70334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:45 np0005486759.ooo.test sudo[70334]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:45 np0005486759.ooo.test sudo[70345]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpre07kktl/privsep.sock
Oct 14 08:27:45 np0005486759.ooo.test sudo[70345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:45 np0005486759.ooo.test sudo[70345]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:46 np0005486759.ooo.test sudo[70356]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgeznjuo4/privsep.sock
Oct 14 08:27:46 np0005486759.ooo.test sudo[70356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:46 np0005486759.ooo.test sudo[70356]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:27:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:27:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:27:47 np0005486759.ooo.test sudo[70403]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0wuanct7/privsep.sock
Oct 14 08:27:47 np0005486759.ooo.test sudo[70403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:47 np0005486759.ooo.test podman[70369]: 2025-10-14 08:27:47.101939369 +0000 UTC m=+0.222899400 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:27:47 np0005486759.ooo.test podman[70369]: 2025-10-14 08:27:47.115327255 +0000 UTC m=+0.236287256 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33)
Oct 14 08:27:47 np0005486759.ooo.test podman[70369]: unhealthy
Oct 14 08:27:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:27:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:27:47 np0005486759.ooo.test podman[70373]: 2025-10-14 08:27:47.154188001 +0000 UTC m=+0.270004093 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, 
name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, tcib_managed=true, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1)
Oct 14 08:27:47 np0005486759.ooo.test podman[70368]: 2025-10-14 08:27:47.205589227 +0000 UTC m=+0.330749448 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20250721.1, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 08:27:47 np0005486759.ooo.test podman[70368]: 2025-10-14 08:27:47.217271139 +0000 UTC m=+0.342431310 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
container_name=logrotate_crond, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc.)
Oct 14 08:27:47 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:27:47 np0005486759.ooo.test podman[70373]: 2025-10-14 08:27:47.27369117 +0000 UTC m=+0.389507302 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 08:27:47 np0005486759.ooo.test podman[70373]: unhealthy
Oct 14 08:27:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:27:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:27:47 np0005486759.ooo.test sudo[70403]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:48 np0005486759.ooo.test sudo[70438]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6wpo6jy5/privsep.sock
Oct 14 08:27:48 np0005486759.ooo.test sudo[70438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:48 np0005486759.ooo.test sudo[70438]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:48 np0005486759.ooo.test sudo[70449]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuocelyew/privsep.sock
Oct 14 08:27:48 np0005486759.ooo.test sudo[70449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:49 np0005486759.ooo.test sudo[70449]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:49 np0005486759.ooo.test sudo[70460]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvfroajdd/privsep.sock
Oct 14 08:27:49 np0005486759.ooo.test sudo[70460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:50 np0005486759.ooo.test sudo[70460]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:50 np0005486759.ooo.test sudo[70471]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp13e0unu2/privsep.sock
Oct 14 08:27:50 np0005486759.ooo.test sudo[70471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:51 np0005486759.ooo.test sudo[70471]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:51 np0005486759.ooo.test sudo[70482]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplda4ye1a/privsep.sock
Oct 14 08:27:51 np0005486759.ooo.test sudo[70482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:27:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:27:51 np0005486759.ooo.test systemd[1]: tmp-crun.3yF2rb.mount: Deactivated successfully.
Oct 14 08:27:51 np0005486759.ooo.test podman[70485]: 2025-10-14 08:27:51.711703343 +0000 UTC m=+0.080064566 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public)
Oct 14 08:27:51 np0005486759.ooo.test systemd[1]: tmp-crun.FpmXrR.mount: Deactivated successfully.
Oct 14 08:27:51 np0005486759.ooo.test podman[70484]: 2025-10-14 08:27:51.764698158 +0000 UTC m=+0.133724591 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 08:27:51 np0005486759.ooo.test podman[70485]: 2025-10-14 08:27:51.782585924 +0000 UTC m=+0.150947157 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, vcs-type=git, container_name=ovn_controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:27:51 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:27:51 np0005486759.ooo.test podman[70484]: 2025-10-14 08:27:51.819676665 +0000 UTC m=+0.188703098 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git)
Oct 14 08:27:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:27:52 np0005486759.ooo.test sudo[70482]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:52 np0005486759.ooo.test sudo[70545]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe6f384yp/privsep.sock
Oct 14 08:27:52 np0005486759.ooo.test sudo[70545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:53 np0005486759.ooo.test sudo[70545]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:27:53 np0005486759.ooo.test podman[70550]: 2025-10-14 08:27:53.20218848 +0000 UTC m=+0.078603701 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.9, release=1, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Oct 14 08:27:53 np0005486759.ooo.test sudo[70577]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk26r5g8f/privsep.sock
Oct 14 08:27:53 np0005486759.ooo.test sudo[70577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:53 np0005486759.ooo.test podman[70550]: 2025-10-14 08:27:53.637282446 +0000 UTC m=+0.513697647 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, version=17.1.9, release=1, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12)
Oct 14 08:27:53 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:27:54 np0005486759.ooo.test sudo[70577]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:54 np0005486759.ooo.test sudo[70592]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwdawkp1g/privsep.sock
Oct 14 08:27:54 np0005486759.ooo.test sudo[70592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:54 np0005486759.ooo.test sudo[70592]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:55 np0005486759.ooo.test sudo[70603]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk672s2ms/privsep.sock
Oct 14 08:27:55 np0005486759.ooo.test sudo[70603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:55 np0005486759.ooo.test sudo[70603]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:56 np0005486759.ooo.test sudo[70614]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0n5g138l/privsep.sock
Oct 14 08:27:56 np0005486759.ooo.test sudo[70614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:56 np0005486759.ooo.test sudo[70614]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:56 np0005486759.ooo.test sudo[70625]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp_kdq1u9/privsep.sock
Oct 14 08:27:56 np0005486759.ooo.test sudo[70625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:57 np0005486759.ooo.test sudo[70625]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:57 np0005486759.ooo.test sudo[70642]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp377x0w7m/privsep.sock
Oct 14 08:27:57 np0005486759.ooo.test sudo[70642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:58 np0005486759.ooo.test sudo[70642]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:58 np0005486759.ooo.test sudo[70653]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp88yydz7a/privsep.sock
Oct 14 08:27:58 np0005486759.ooo.test sudo[70653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:27:59 np0005486759.ooo.test sudo[70653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:27:59 np0005486759.ooo.test sudo[70664]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcb6xfhyw/privsep.sock
Oct 14 08:27:59 np0005486759.ooo.test sudo[70664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:00 np0005486759.ooo.test sudo[70664]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:00 np0005486759.ooo.test sudo[70675]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkl1jcc97/privsep.sock
Oct 14 08:28:00 np0005486759.ooo.test sudo[70675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:01 np0005486759.ooo.test sudo[70675]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:01 np0005486759.ooo.test sudo[70686]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsl7uy7if/privsep.sock
Oct 14 08:28:01 np0005486759.ooo.test sudo[70686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:02 np0005486759.ooo.test sudo[70686]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:28:02 np0005486759.ooo.test podman[70692]: 2025-10-14 08:28:02.305420738 +0000 UTC m=+0.082781491 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:28:02 np0005486759.ooo.test sudo[70725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprjeynxvk/privsep.sock
Oct 14 08:28:02 np0005486759.ooo.test sudo[70725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:02 np0005486759.ooo.test podman[70692]: 2025-10-14 08:28:02.547793522 +0000 UTC m=+0.325154225 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.9, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, release=1, build-date=2025-07-21T13:07:59, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:28:02 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:28:03 np0005486759.ooo.test sudo[70725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:03 np0005486759.ooo.test sudo[70743]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ncaavxh/privsep.sock
Oct 14 08:28:03 np0005486759.ooo.test sudo[70743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:03 np0005486759.ooo.test sudo[70743]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:28:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:28:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:28:04 np0005486759.ooo.test systemd[1]: tmp-crun.ax0Or6.mount: Deactivated successfully.
Oct 14 08:28:04 np0005486759.ooo.test systemd[1]: tmp-crun.70pqvZ.mount: Deactivated successfully.
Oct 14 08:28:04 np0005486759.ooo.test podman[70750]: 2025-10-14 08:28:04.06729389 +0000 UTC m=+0.096457296 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, distribution-scope=public, release=1, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible)
Oct 14 08:28:04 np0005486759.ooo.test podman[70749]: 2025-10-14 08:28:04.0299105 +0000 UTC m=+0.064707191 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:28:04 np0005486759.ooo.test podman[70751]: 2025-10-14 08:28:04.120135 +0000 UTC m=+0.145240470 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-collectd, architecture=x86_64, tcib_managed=true, vcs-type=git, container_name=collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:28:04 np0005486759.ooo.test podman[70751]: 2025-10-14 08:28:04.132514134 +0000 UTC m=+0.157619594 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, container_name=collectd, release=2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public)
Oct 14 08:28:04 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:28:04 np0005486759.ooo.test podman[70750]: 2025-10-14 08:28:04.149575503 +0000 UTC m=+0.178738879 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=nova_compute, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:28:04 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:28:04 np0005486759.ooo.test sudo[70818]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgx8mpwx2/privsep.sock
Oct 14 08:28:04 np0005486759.ooo.test sudo[70818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:04 np0005486759.ooo.test podman[70749]: 2025-10-14 08:28:04.217582055 +0000 UTC m=+0.252378806 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Oct 14 08:28:04 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:28:04 np0005486759.ooo.test sudo[70818]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:05 np0005486759.ooo.test sudo[70829]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb_bn589d/privsep.sock
Oct 14 08:28:05 np0005486759.ooo.test sudo[70829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:05 np0005486759.ooo.test sudo[70829]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:05 np0005486759.ooo.test sudo[70840]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsunozyea/privsep.sock
Oct 14 08:28:05 np0005486759.ooo.test sudo[70840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:06 np0005486759.ooo.test sudo[70840]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:06 np0005486759.ooo.test sudo[70851]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppowzqio_/privsep.sock
Oct 14 08:28:06 np0005486759.ooo.test sudo[70851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:07 np0005486759.ooo.test sudo[70851]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:07 np0005486759.ooo.test sudo[70862]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplvro6v__/privsep.sock
Oct 14 08:28:07 np0005486759.ooo.test sudo[70862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:08 np0005486759.ooo.test sudo[70862]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:08 np0005486759.ooo.test sudo[70879]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1s85bjbz/privsep.sock
Oct 14 08:28:08 np0005486759.ooo.test sudo[70879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:09 np0005486759.ooo.test sudo[70879]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:09 np0005486759.ooo.test sudo[70890]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv3sjhnrn/privsep.sock
Oct 14 08:28:09 np0005486759.ooo.test sudo[70890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:10 np0005486759.ooo.test sudo[70890]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:10 np0005486759.ooo.test sudo[70901]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1p_c2ipa/privsep.sock
Oct 14 08:28:10 np0005486759.ooo.test sudo[70901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:10 np0005486759.ooo.test sudo[70901]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:11 np0005486759.ooo.test sudo[70912]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4d10kr5v/privsep.sock
Oct 14 08:28:11 np0005486759.ooo.test sudo[70912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:11 np0005486759.ooo.test sudo[70912]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:12 np0005486759.ooo.test sudo[70923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2jdke7cp/privsep.sock
Oct 14 08:28:12 np0005486759.ooo.test sudo[70923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:12 np0005486759.ooo.test sudo[70923]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:12 np0005486759.ooo.test sudo[70934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6siw848u/privsep.sock
Oct 14 08:28:12 np0005486759.ooo.test sudo[70934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:13 np0005486759.ooo.test sudo[70934]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:13 np0005486759.ooo.test sudo[70950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4o5xe7q_/privsep.sock
Oct 14 08:28:13 np0005486759.ooo.test sudo[70950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:14 np0005486759.ooo.test sudo[70950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:14 np0005486759.ooo.test sudo[70962]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj92zpnt7/privsep.sock
Oct 14 08:28:14 np0005486759.ooo.test sudo[70962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:15 np0005486759.ooo.test sudo[70962]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:15 np0005486759.ooo.test sudo[70973]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7ki25ihk/privsep.sock
Oct 14 08:28:15 np0005486759.ooo.test sudo[70973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:16 np0005486759.ooo.test sudo[70973]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:16 np0005486759.ooo.test sudo[70984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3t3r9sy8/privsep.sock
Oct 14 08:28:16 np0005486759.ooo.test sudo[70984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:17 np0005486759.ooo.test sudo[70984]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:17 np0005486759.ooo.test sudo[70995]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwockw686/privsep.sock
Oct 14 08:28:17 np0005486759.ooo.test sudo[70995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:28:17 np0005486759.ooo.test podman[70999]: 2025-10-14 08:28:17.380875294 +0000 UTC m=+0.064709030 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: tmp-crun.yooTKz.mount: Deactivated successfully.
Oct 14 08:28:17 np0005486759.ooo.test podman[70997]: 2025-10-14 08:28:17.464040066 +0000 UTC m=+0.148830151 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:28:17 np0005486759.ooo.test podman[70997]: 2025-10-14 08:28:17.498291479 +0000 UTC m=+0.183081624 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, name=rhosp17/openstack-cron, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, 
build-date=2025-07-21T13:07:52, vcs-type=git, version=17.1.9, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:28:17 np0005486759.ooo.test podman[70999]: 2025-10-14 08:28:17.524446481 +0000 UTC m=+0.208280287 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:28:17 np0005486759.ooo.test podman[70999]: unhealthy
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:28:17 np0005486759.ooo.test podman[70998]: 2025-10-14 08:28:17.498885298 +0000 UTC m=+0.179819164 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, build-date=2025-07-21T14:45:33, vcs-type=git, vendor=Red Hat, Inc., release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute)
Oct 14 08:28:17 np0005486759.ooo.test podman[70998]: 2025-10-14 08:28:17.582457791 +0000 UTC m=+0.263391727 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1)
Oct 14 08:28:17 np0005486759.ooo.test podman[70998]: unhealthy
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:28:17 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:28:17 np0005486759.ooo.test sudo[70995]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:18 np0005486759.ooo.test sudo[71063]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpem8ongv0/privsep.sock
Oct 14 08:28:18 np0005486759.ooo.test sudo[71063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:18 np0005486759.ooo.test sudo[71063]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:19 np0005486759.ooo.test sudo[71077]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw6ku9lwg/privsep.sock
Oct 14 08:28:19 np0005486759.ooo.test sudo[71077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:19 np0005486759.ooo.test sudo[71077]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:19 np0005486759.ooo.test sudo[71091]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj5691cw8/privsep.sock
Oct 14 08:28:19 np0005486759.ooo.test sudo[71091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:20 np0005486759.ooo.test sudo[71091]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:20 np0005486759.ooo.test sudo[71102]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvvekvvy8/privsep.sock
Oct 14 08:28:20 np0005486759.ooo.test sudo[71102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:21 np0005486759.ooo.test sudo[71102]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:21 np0005486759.ooo.test sudo[71113]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg9brreq9/privsep.sock
Oct 14 08:28:21 np0005486759.ooo.test sudo[71113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:22 np0005486759.ooo.test sudo[71113]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:28:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:28:22 np0005486759.ooo.test systemd[1]: tmp-crun.gV2Ycs.mount: Deactivated successfully.
Oct 14 08:28:22 np0005486759.ooo.test podman[71117]: 2025-10-14 08:28:22.403814543 +0000 UTC m=+0.086639790 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
io.buildah.version=1.33.12, distribution-scope=public, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:28:22 np0005486759.ooo.test podman[71120]: 2025-10-14 08:28:22.378816797 +0000 UTC m=+0.062938564 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:28:22 np0005486759.ooo.test podman[71117]: 2025-10-14 08:28:22.44042787 +0000 UTC m=+0.123253127 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, release=1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:28:22 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:28:22 np0005486759.ooo.test podman[71120]: 2025-10-14 08:28:22.461387251 +0000 UTC m=+0.145508968 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, release=1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:28:22 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:28:22 np0005486759.ooo.test sudo[71170]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpel6yo63x/privsep.sock
Oct 14 08:28:22 np0005486759.ooo.test sudo[71170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:23 np0005486759.ooo.test sudo[71170]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:23 np0005486759.ooo.test sudo[71181]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6umyc0re/privsep.sock
Oct 14 08:28:23 np0005486759.ooo.test sudo[71181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:24 np0005486759.ooo.test sudo[71181]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:28:24 np0005486759.ooo.test systemd[1]: tmp-crun.KeC95J.mount: Deactivated successfully.
Oct 14 08:28:24 np0005486759.ooo.test podman[71186]: 2025-10-14 08:28:24.193821418 +0000 UTC m=+0.068677532 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
tcib_managed=true, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1, vcs-type=git)
Oct 14 08:28:24 np0005486759.ooo.test sudo[71217]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprz56b4ra/privsep.sock
Oct 14 08:28:24 np0005486759.ooo.test sudo[71217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:24 np0005486759.ooo.test podman[71186]: 2025-10-14 08:28:24.550284014 +0000 UTC m=+0.425140098 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9, tcib_managed=true, io.buildah.version=1.33.12, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4)
Oct 14 08:28:24 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:28:24 np0005486759.ooo.test sudo[71217]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:25 np0005486759.ooo.test sudo[71233]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpay_atscp/privsep.sock
Oct 14 08:28:25 np0005486759.ooo.test sudo[71233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:25 np0005486759.ooo.test sudo[71233]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:26 np0005486759.ooo.test sudo[71244]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwycjzu96/privsep.sock
Oct 14 08:28:26 np0005486759.ooo.test sudo[71244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:26 np0005486759.ooo.test sudo[71244]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:27 np0005486759.ooo.test sudo[71255]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3q9gczsz/privsep.sock
Oct 14 08:28:27 np0005486759.ooo.test sudo[71255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:27 np0005486759.ooo.test sudo[71255]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:27 np0005486759.ooo.test sudo[71266]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy22g3kxk/privsep.sock
Oct 14 08:28:27 np0005486759.ooo.test sudo[71266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:28 np0005486759.ooo.test sudo[71266]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:28 np0005486759.ooo.test sudo[71277]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2n3f0es9/privsep.sock
Oct 14 08:28:28 np0005486759.ooo.test sudo[71277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:29 np0005486759.ooo.test sudo[71277]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:29 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:28:29 np0005486759.ooo.test recover_tripleo_nova_virtqemud[71284]: 47951
Oct 14 08:28:29 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:28:29 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:28:29 np0005486759.ooo.test sudo[71290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0lexufq6/privsep.sock
Oct 14 08:28:29 np0005486759.ooo.test sudo[71290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:30 np0005486759.ooo.test sudo[71290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:30 np0005486759.ooo.test sudo[71307]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8_w10q_m/privsep.sock
Oct 14 08:28:30 np0005486759.ooo.test sudo[71307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:31 np0005486759.ooo.test sudo[71307]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:31 np0005486759.ooo.test sudo[71318]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxdu2825l/privsep.sock
Oct 14 08:28:31 np0005486759.ooo.test sudo[71318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:31 np0005486759.ooo.test sudo[71318]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:32 np0005486759.ooo.test sudo[71329]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0evi0ns7/privsep.sock
Oct 14 08:28:32 np0005486759.ooo.test sudo[71329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:32 np0005486759.ooo.test sudo[71329]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:28:32 np0005486759.ooo.test systemd[1]: tmp-crun.dxfaQ5.mount: Deactivated successfully.
Oct 14 08:28:32 np0005486759.ooo.test podman[71333]: 2025-10-14 08:28:32.950559671 +0000 UTC m=+0.094155863 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:28:33 np0005486759.ooo.test sudo[71369]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph390h_xt/privsep.sock
Oct 14 08:28:33 np0005486759.ooo.test sudo[71369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:33 np0005486759.ooo.test podman[71333]: 2025-10-14 08:28:33.174145551 +0000 UTC m=+0.317741723 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.33.12, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:28:33 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:28:33 np0005486759.ooo.test sudo[71369]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:33 np0005486759.ooo.test sudo[71380]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0am_fdet/privsep.sock
Oct 14 08:28:33 np0005486759.ooo.test sudo[71380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:28:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:28:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:28:34 np0005486759.ooo.test systemd[1]: tmp-crun.TBMwrk.mount: Deactivated successfully.
Oct 14 08:28:34 np0005486759.ooo.test podman[71385]: 2025-10-14 08:28:34.451292996 +0000 UTC m=+0.071904093 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, 
description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=)
Oct 14 08:28:34 np0005486759.ooo.test podman[71385]: 2025-10-14 08:28:34.486421146 +0000 UTC m=+0.107032263 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, version=17.1.9)
Oct 14 08:28:34 np0005486759.ooo.test sudo[71380]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:34 np0005486759.ooo.test podman[71383]: 2025-10-14 08:28:34.504067215 +0000 UTC m=+0.128670966 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:28:34 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:28:34 np0005486759.ooo.test podman[71384]: 2025-10-14 08:28:34.563654094 +0000 UTC m=+0.184423896 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, vcs-type=git, container_name=nova_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5)
Oct 14 08:28:34 np0005486759.ooo.test podman[71383]: 2025-10-14 08:28:34.591314943 +0000 UTC m=+0.215918724 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, 
managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:28:34 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:28:34 np0005486759.ooo.test podman[71384]: 2025-10-14 08:28:34.609418774 +0000 UTC m=+0.230188546 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute)
Oct 14 08:28:34 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:28:34 np0005486759.ooo.test sudo[71457]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbxh2kjnd/privsep.sock
Oct 14 08:28:34 np0005486759.ooo.test sudo[71457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:35 np0005486759.ooo.test sudo[71457]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:35 np0005486759.ooo.test sudo[71474]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkj4yajuy/privsep.sock
Oct 14 08:28:35 np0005486759.ooo.test sudo[71474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:36 np0005486759.ooo.test sudo[71474]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:36 np0005486759.ooo.test sudo[71485]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfaexee_t/privsep.sock
Oct 14 08:28:36 np0005486759.ooo.test sudo[71485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:37 np0005486759.ooo.test sudo[71485]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:37 np0005486759.ooo.test sudo[71496]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppkjdwyoi/privsep.sock
Oct 14 08:28:37 np0005486759.ooo.test sudo[71496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:37 np0005486759.ooo.test sudo[71496]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:38 np0005486759.ooo.test sudo[71507]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt4r71_b5/privsep.sock
Oct 14 08:28:38 np0005486759.ooo.test sudo[71507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:38 np0005486759.ooo.test sudo[71507]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:39 np0005486759.ooo.test sudo[71518]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk6ovtfxn/privsep.sock
Oct 14 08:28:39 np0005486759.ooo.test sudo[71518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:39 np0005486759.ooo.test sudo[71518]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:39 np0005486759.ooo.test sudo[71529]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg4vlaf9i/privsep.sock
Oct 14 08:28:39 np0005486759.ooo.test sudo[71529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:40 np0005486759.ooo.test sudo[71529]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:40 np0005486759.ooo.test sudo[71545]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbqo634yq/privsep.sock
Oct 14 08:28:40 np0005486759.ooo.test sudo[71545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:41 np0005486759.ooo.test sudo[71545]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:41 np0005486759.ooo.test sudo[71557]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdp6h50w8/privsep.sock
Oct 14 08:28:41 np0005486759.ooo.test sudo[71557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:42 np0005486759.ooo.test sudo[71557]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:42 np0005486759.ooo.test sudo[71568]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp59qh01d5/privsep.sock
Oct 14 08:28:42 np0005486759.ooo.test sudo[71568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:43 np0005486759.ooo.test sudo[71568]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:43 np0005486759.ooo.test sudo[71579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzeczpe6_/privsep.sock
Oct 14 08:28:43 np0005486759.ooo.test sudo[71579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:43 np0005486759.ooo.test sudo[71579]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:44 np0005486759.ooo.test sudo[71590]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl7z7qstr/privsep.sock
Oct 14 08:28:44 np0005486759.ooo.test sudo[71590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:44 np0005486759.ooo.test sudo[71590]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:45 np0005486759.ooo.test sudo[71601]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphg6gr0f5/privsep.sock
Oct 14 08:28:45 np0005486759.ooo.test sudo[71601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:45 np0005486759.ooo.test sudo[71601]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:46 np0005486759.ooo.test sudo[71614]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm6uun_pc/privsep.sock
Oct 14 08:28:46 np0005486759.ooo.test sudo[71614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:46 np0005486759.ooo.test sudo[71614]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:46 np0005486759.ooo.test sudo[71629]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphmlqlaj0/privsep.sock
Oct 14 08:28:46 np0005486759.ooo.test sudo[71629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:47 np0005486759.ooo.test sudo[71629]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:28:47 np0005486759.ooo.test podman[71635]: 2025-10-14 08:28:47.638682793 +0000 UTC m=+0.090263363 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:28:47 np0005486759.ooo.test podman[71635]: 2025-10-14 08:28:47.673489294 +0000 UTC m=+0.125069874 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, version=17.1.9, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, tcib_managed=true, release=1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:28:47 np0005486759.ooo.test podman[71636]: 2025-10-14 08:28:47.676880209 +0000 UTC m=+0.123647959 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Oct 14 08:28:47 np0005486759.ooo.test podman[71663]: 2025-10-14 08:28:47.735980163 +0000 UTC m=+0.085478114 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, version=17.1.9, vcs-type=git, config_id=tripleo_step4, batch=17.1_20250721.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true)
Oct 14 08:28:47 np0005486759.ooo.test podman[71663]: 2025-10-14 08:28:47.745701136 +0000 UTC m=+0.095199067 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:28:47 np0005486759.ooo.test podman[71663]: unhealthy
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:28:47 np0005486759.ooo.test podman[71636]: 2025-10-14 08:28:47.765402117 +0000 UTC m=+0.212169897 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, release=1, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, batch=17.1_20250721.1)
Oct 14 08:28:47 np0005486759.ooo.test podman[71636]: unhealthy
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:28:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:28:47 np0005486759.ooo.test sudo[71694]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjiaj5zeb/privsep.sock
Oct 14 08:28:47 np0005486759.ooo.test sudo[71694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:48 np0005486759.ooo.test sudo[71694]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:48 np0005486759.ooo.test sudo[71706]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp92djm1zy/privsep.sock
Oct 14 08:28:48 np0005486759.ooo.test sudo[71706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:49 np0005486759.ooo.test sudo[71706]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:49 np0005486759.ooo.test sudo[71717]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp760hgh0q/privsep.sock
Oct 14 08:28:49 np0005486759.ooo.test sudo[71717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:50 np0005486759.ooo.test sudo[71717]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:50 np0005486759.ooo.test sudo[71728]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjjor8g54/privsep.sock
Oct 14 08:28:50 np0005486759.ooo.test sudo[71728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:50 np0005486759.ooo.test sudo[71728]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:51 np0005486759.ooo.test sudo[71739]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp594b29m5/privsep.sock
Oct 14 08:28:51 np0005486759.ooo.test sudo[71739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:51 np0005486759.ooo.test sudo[71739]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:52 np0005486759.ooo.test sudo[71756]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxivtlohh/privsep.sock
Oct 14 08:28:52 np0005486759.ooo.test sudo[71756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:52 np0005486759.ooo.test sudo[71756]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:28:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:28:52 np0005486759.ooo.test podman[71760]: 2025-10-14 08:28:52.760946096 +0000 UTC m=+0.068499918 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:28:52 np0005486759.ooo.test systemd[1]: tmp-crun.byI0OB.mount: Deactivated successfully.
Oct 14 08:28:52 np0005486759.ooo.test podman[71762]: 2025-10-14 08:28:52.839655539 +0000 UTC m=+0.141592156 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, container_name=ovn_controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container)
Oct 14 08:28:52 np0005486759.ooo.test podman[71760]: 2025-10-14 08:28:52.852198288 +0000 UTC m=+0.159752050 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Oct 14 08:28:52 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:28:52 np0005486759.ooo.test podman[71762]: 2025-10-14 08:28:52.896511544 +0000 UTC m=+0.198448171 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, version=17.1.9, build-date=2025-07-21T13:28:44, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Oct 14 08:28:52 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:28:52 np0005486759.ooo.test sudo[71816]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnfmfp8cu/privsep.sock
Oct 14 08:28:52 np0005486759.ooo.test sudo[71816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:53 np0005486759.ooo.test sudo[71816]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:53 np0005486759.ooo.test sudo[71827]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcrkxeubc/privsep.sock
Oct 14 08:28:53 np0005486759.ooo.test sudo[71827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:54 np0005486759.ooo.test sudo[71827]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:54 np0005486759.ooo.test sudo[71838]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz0bsvqag/privsep.sock
Oct 14 08:28:54 np0005486759.ooo.test sudo[71838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:28:54 np0005486759.ooo.test podman[71840]: 2025-10-14 08:28:54.849673383 +0000 UTC m=+0.079011213 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4)
Oct 14 08:28:55 np0005486759.ooo.test podman[71840]: 2025-10-14 08:28:55.232485656 +0000 UTC m=+0.461823506 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, batch=17.1_20250721.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:28:55 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:28:55 np0005486759.ooo.test sudo[71838]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:55 np0005486759.ooo.test sudo[71873]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxa4f06ri/privsep.sock
Oct 14 08:28:55 np0005486759.ooo.test sudo[71873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:56 np0005486759.ooo.test sudo[71873]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:56 np0005486759.ooo.test sudo[71884]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfaqxbvas/privsep.sock
Oct 14 08:28:56 np0005486759.ooo.test sudo[71884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:57 np0005486759.ooo.test sudo[71884]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:57 np0005486759.ooo.test sudo[71901]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiz3uuccr/privsep.sock
Oct 14 08:28:57 np0005486759.ooo.test sudo[71901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:57 np0005486759.ooo.test sudo[71901]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:58 np0005486759.ooo.test sudo[71912]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcv50hf26/privsep.sock
Oct 14 08:28:58 np0005486759.ooo.test sudo[71912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:58 np0005486759.ooo.test sudo[71912]: pam_unix(sudo:session): session closed for user root
Oct 14 08:28:59 np0005486759.ooo.test sudo[71923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwgek42cd/privsep.sock
Oct 14 08:28:59 np0005486759.ooo.test sudo[71923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:28:59 np0005486759.ooo.test sudo[71923]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:00 np0005486759.ooo.test sudo[71934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmj1kha8i/privsep.sock
Oct 14 08:29:00 np0005486759.ooo.test sudo[71934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:00 np0005486759.ooo.test sudo[71934]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:00 np0005486759.ooo.test sudo[71945]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp00qo6zrw/privsep.sock
Oct 14 08:29:00 np0005486759.ooo.test sudo[71945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:01 np0005486759.ooo.test sudo[71945]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:01 np0005486759.ooo.test sudo[71956]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps7q4l2j2/privsep.sock
Oct 14 08:29:01 np0005486759.ooo.test sudo[71956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:02 np0005486759.ooo.test sudo[71956]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:02 np0005486759.ooo.test sudo[71973]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw1sj0qwh/privsep.sock
Oct 14 08:29:02 np0005486759.ooo.test sudo[71973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:29:03 np0005486759.ooo.test sudo[71973]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:03 np0005486759.ooo.test podman[71977]: 2025-10-14 08:29:03.442941272 +0000 UTC m=+0.071118789 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, batch=17.1_20250721.1, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 08:29:03 np0005486759.ooo.test podman[71977]: 2025-10-14 08:29:03.61328751 +0000 UTC m=+0.241465037 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible)
Oct 14 08:29:03 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:29:03 np0005486759.ooo.test sudo[72012]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw30bgb4u/privsep.sock
Oct 14 08:29:03 np0005486759.ooo.test sudo[72012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:04 np0005486759.ooo.test sudo[72012]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:04 np0005486759.ooo.test sudo[72023]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2o68wipe/privsep.sock
Oct 14 08:29:04 np0005486759.ooo.test sudo[72023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:29:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:29:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:29:04 np0005486759.ooo.test systemd[1]: tmp-crun.eyhX54.mount: Deactivated successfully.
Oct 14 08:29:04 np0005486759.ooo.test podman[72025]: 2025-10-14 08:29:04.679019181 +0000 UTC m=+0.091897524 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, version=17.1.9, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Oct 14 08:29:04 np0005486759.ooo.test podman[72025]: 2025-10-14 08:29:04.685140631 +0000 UTC m=+0.098018944 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, tcib_managed=true, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:29:04 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:29:04 np0005486759.ooo.test podman[72041]: 2025-10-14 08:29:04.75212686 +0000 UTC m=+0.078798376 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, tcib_managed=true, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., distribution-scope=public)
Oct 14 08:29:04 np0005486759.ooo.test podman[72041]: 2025-10-14 08:29:04.761254525 +0000 UTC m=+0.087926071 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.9, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, 
io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, build-date=2025-07-21T13:27:15)
Oct 14 08:29:04 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:29:04 np0005486759.ooo.test podman[72043]: 2025-10-14 08:29:04.733873024 +0000 UTC m=+0.054024268 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:29:04 np0005486759.ooo.test podman[72043]: 2025-10-14 08:29:04.814264329 +0000 UTC m=+0.134415553 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, vcs-type=git)
Oct 14 08:29:04 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:29:05 np0005486759.ooo.test sudo[72023]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:05 np0005486759.ooo.test sudo[72096]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfg9dcadb/privsep.sock
Oct 14 08:29:05 np0005486759.ooo.test sudo[72096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:05 np0005486759.ooo.test sudo[72096]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:06 np0005486759.ooo.test sudo[72107]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp488m9xqq/privsep.sock
Oct 14 08:29:06 np0005486759.ooo.test sudo[72107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:06 np0005486759.ooo.test sudo[72107]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:07 np0005486759.ooo.test sudo[72118]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphpa4nxda/privsep.sock
Oct 14 08:29:07 np0005486759.ooo.test sudo[72118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:07 np0005486759.ooo.test sudo[72118]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:07 np0005486759.ooo.test sudo[72134]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnvkmxuc3/privsep.sock
Oct 14 08:29:07 np0005486759.ooo.test sudo[72134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:08 np0005486759.ooo.test sudo[72134]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:08 np0005486759.ooo.test sudo[72146]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcot7v7kf/privsep.sock
Oct 14 08:29:08 np0005486759.ooo.test sudo[72146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:09 np0005486759.ooo.test sudo[72146]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:09 np0005486759.ooo.test sudo[72157]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3mbsvaqn/privsep.sock
Oct 14 08:29:09 np0005486759.ooo.test sudo[72157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:10 np0005486759.ooo.test sudo[72157]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:10 np0005486759.ooo.test sudo[72168]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ywlwn45/privsep.sock
Oct 14 08:29:10 np0005486759.ooo.test sudo[72168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:11 np0005486759.ooo.test sudo[72168]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:11 np0005486759.ooo.test sudo[72179]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp40guq8fu/privsep.sock
Oct 14 08:29:11 np0005486759.ooo.test sudo[72179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:12 np0005486759.ooo.test sudo[72179]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:12 np0005486759.ooo.test sudo[72190]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjp6om3bt/privsep.sock
Oct 14 08:29:12 np0005486759.ooo.test sudo[72190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:12 np0005486759.ooo.test sudo[72190]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:13 np0005486759.ooo.test sudo[72203]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_frutobc/privsep.sock
Oct 14 08:29:13 np0005486759.ooo.test sudo[72203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:13 np0005486759.ooo.test sudo[72203]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:14 np0005486759.ooo.test sudo[72218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1m5sdtw9/privsep.sock
Oct 14 08:29:14 np0005486759.ooo.test sudo[72218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:14 np0005486759.ooo.test sudo[72218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:14 np0005486759.ooo.test sudo[72229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpud946dz_/privsep.sock
Oct 14 08:29:14 np0005486759.ooo.test sudo[72229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:15 np0005486759.ooo.test sudo[72229]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:15 np0005486759.ooo.test sudo[72240]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6f__sisq/privsep.sock
Oct 14 08:29:15 np0005486759.ooo.test sudo[72240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:16 np0005486759.ooo.test sudo[72240]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:16 np0005486759.ooo.test sudo[72251]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvsz5wfl7/privsep.sock
Oct 14 08:29:16 np0005486759.ooo.test sudo[72251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:17 np0005486759.ooo.test sudo[72251]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:17 np0005486759.ooo.test sudo[72262]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_lfik2qt/privsep.sock
Oct 14 08:29:17 np0005486759.ooo.test sudo[72262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:18 np0005486759.ooo.test sudo[72262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:29:18 np0005486759.ooo.test podman[72268]: 2025-10-14 08:29:18.181289049 +0000 UTC m=+0.084201456 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, name=rhosp17/openstack-cron, release=1, vcs-type=git, com.redhat.component=openstack-cron-container)
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: tmp-crun.xehbQs.mount: Deactivated successfully.
Oct 14 08:29:18 np0005486759.ooo.test podman[72270]: 2025-10-14 08:29:18.238558704 +0000 UTC m=+0.134732541 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, architecture=x86_64, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:29:18 np0005486759.ooo.test podman[72268]: 2025-10-14 08:29:18.269403796 +0000 UTC m=+0.172316213 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:29:18 np0005486759.ooo.test podman[72269]: 2025-10-14 08:29:18.286610913 +0000 UTC m=+0.185383251 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, version=17.1.9, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64)
Oct 14 08:29:18 np0005486759.ooo.test podman[72269]: 2025-10-14 08:29:18.302401005 +0000 UTC m=+0.201173383 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, release=1, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 08:29:18 np0005486759.ooo.test podman[72269]: unhealthy
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:29:18 np0005486759.ooo.test podman[72270]: 2025-10-14 08:29:18.328234 +0000 UTC m=+0.224407817 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:29:18 np0005486759.ooo.test podman[72270]: unhealthy
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:29:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:29:18 np0005486759.ooo.test sudo[72329]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp86qj236c/privsep.sock
Oct 14 08:29:18 np0005486759.ooo.test sudo[72329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:18 np0005486759.ooo.test sudo[72329]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:19 np0005486759.ooo.test sudo[72346]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxq1mogzw/privsep.sock
Oct 14 08:29:19 np0005486759.ooo.test sudo[72346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:19 np0005486759.ooo.test sudo[72346]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:20 np0005486759.ooo.test sudo[72357]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyh0l3511/privsep.sock
Oct 14 08:29:20 np0005486759.ooo.test sudo[72357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:20 np0005486759.ooo.test sudo[72357]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:20 np0005486759.ooo.test sudo[72368]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5f160n0w/privsep.sock
Oct 14 08:29:20 np0005486759.ooo.test sudo[72368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:21 np0005486759.ooo.test sudo[72368]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:21 np0005486759.ooo.test sudo[72379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp74323tmd/privsep.sock
Oct 14 08:29:21 np0005486759.ooo.test sudo[72379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:22 np0005486759.ooo.test sudo[72379]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:22 np0005486759.ooo.test sudo[72390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8e4tvcvw/privsep.sock
Oct 14 08:29:22 np0005486759.ooo.test sudo[72390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:23 np0005486759.ooo.test sudo[72390]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:29:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:29:23 np0005486759.ooo.test podman[72396]: 2025-10-14 08:29:23.407520597 +0000 UTC m=+0.098632306 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:29:23 np0005486759.ooo.test podman[72396]: 2025-10-14 08:29:23.460190999 +0000 UTC m=+0.151302708 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9)
Oct 14 08:29:23 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:29:23 np0005486759.ooo.test podman[72397]: 2025-10-14 08:29:23.467068064 +0000 UTC m=+0.153952601 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, version=17.1.9)
Oct 14 08:29:23 np0005486759.ooo.test sudo[72449]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt7lxgjna/privsep.sock
Oct 14 08:29:23 np0005486759.ooo.test sudo[72449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:23 np0005486759.ooo.test podman[72397]: 2025-10-14 08:29:23.546732977 +0000 UTC m=+0.233617534 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=ovn_controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, release=1, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.9)
Oct 14 08:29:23 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:29:24 np0005486759.ooo.test sudo[72449]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:24 np0005486759.ooo.test sudo[72466]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmoqae768/privsep.sock
Oct 14 08:29:24 np0005486759.ooo.test sudo[72466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:25 np0005486759.ooo.test sudo[72466]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:25 np0005486759.ooo.test sudo[72477]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpryu3smhj/privsep.sock
Oct 14 08:29:25 np0005486759.ooo.test sudo[72477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:29:25 np0005486759.ooo.test podman[72479]: 2025-10-14 08:29:25.37219936 +0000 UTC m=+0.078484378 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37)
Oct 14 08:29:25 np0005486759.ooo.test podman[72479]: 2025-10-14 08:29:25.735594019 +0000 UTC m=+0.441879097 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, batch=17.1_20250721.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:29:25 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:29:25 np0005486759.ooo.test sudo[72477]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:26 np0005486759.ooo.test sudo[72510]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz46stj06/privsep.sock
Oct 14 08:29:26 np0005486759.ooo.test sudo[72510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:26 np0005486759.ooo.test sudo[72510]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:27 np0005486759.ooo.test sudo[72521]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8o7pq7hx/privsep.sock
Oct 14 08:29:27 np0005486759.ooo.test sudo[72521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:27 np0005486759.ooo.test sudo[72521]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:27 np0005486759.ooo.test sudo[72532]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpswh96vzb/privsep.sock
Oct 14 08:29:27 np0005486759.ooo.test sudo[72532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:28 np0005486759.ooo.test sudo[72532]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:28 np0005486759.ooo.test sudo[72543]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe19e92oi/privsep.sock
Oct 14 08:29:28 np0005486759.ooo.test sudo[72543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:29 np0005486759.ooo.test sudo[72543]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:29 np0005486759.ooo.test sudo[72560]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmperyo6wlh/privsep.sock
Oct 14 08:29:29 np0005486759.ooo.test sudo[72560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:30 np0005486759.ooo.test sudo[72560]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:30 np0005486759.ooo.test sudo[72571]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpczr7sdvk/privsep.sock
Oct 14 08:29:30 np0005486759.ooo.test sudo[72571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:31 np0005486759.ooo.test sudo[72571]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:31 np0005486759.ooo.test sudo[72582]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl8b8csal/privsep.sock
Oct 14 08:29:31 np0005486759.ooo.test sudo[72582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:32 np0005486759.ooo.test sudo[72582]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:32 np0005486759.ooo.test sudo[72593]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmccwus4c/privsep.sock
Oct 14 08:29:32 np0005486759.ooo.test sudo[72593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:32 np0005486759.ooo.test sudo[72593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:33 np0005486759.ooo.test sudo[72604]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp8lk0hhq/privsep.sock
Oct 14 08:29:33 np0005486759.ooo.test sudo[72604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:33 np0005486759.ooo.test sudo[72604]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:29:33 np0005486759.ooo.test systemd[1]: tmp-crun.oS01MB.mount: Deactivated successfully.
Oct 14 08:29:33 np0005486759.ooo.test podman[72608]: 2025-10-14 08:29:33.913169001 +0000 UTC m=+0.079295304 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, release=1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd)
Oct 14 08:29:34 np0005486759.ooo.test sudo[72643]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpif4hrif_/privsep.sock
Oct 14 08:29:34 np0005486759.ooo.test sudo[72643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:34 np0005486759.ooo.test podman[72608]: 2025-10-14 08:29:34.13642255 +0000 UTC m=+0.302548833 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Oct 14 08:29:34 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:29:34 np0005486759.ooo.test sudo[72643]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:29:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:29:34 np0005486759.ooo.test podman[72649]: 2025-10-14 08:29:34.809341691 +0000 UTC m=+0.067876818 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, tcib_managed=true, build-date=2025-07-21T13:04:03, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=collectd)
Oct 14 08:29:34 np0005486759.ooo.test podman[72649]: 2025-10-14 08:29:34.84429848 +0000 UTC m=+0.102833587 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, release=2, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Oct 14 08:29:34 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:29:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:29:34 np0005486759.ooo.test systemd[1]: tmp-crun.SbUmqw.mount: Deactivated successfully.
Oct 14 08:29:34 np0005486759.ooo.test podman[72680]: 2025-10-14 08:29:34.927553076 +0000 UTC m=+0.056920375 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:29:34 np0005486759.ooo.test podman[72680]: 2025-10-14 08:29:34.9562338 +0000 UTC m=+0.085601109 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1, architecture=x86_64, version=17.1.9, name=rhosp17/openstack-nova-compute, container_name=nova_compute)
Oct 14 08:29:34 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:29:34 np0005486759.ooo.test sudo[72719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt5t7k29y/privsep.sock
Oct 14 08:29:34 np0005486759.ooo.test sudo[72719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:35 np0005486759.ooo.test podman[72669]: 2025-10-14 08:29:34.908683257 +0000 UTC m=+0.088897932 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-iscsid-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, container_name=iscsid, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:29:35 np0005486759.ooo.test podman[72669]: 2025-10-14 08:29:35.041386314 +0000 UTC m=+0.221601019 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, 
managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, container_name=iscsid, io.openshift.expose-services=, release=1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:29:35 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:29:35 np0005486759.ooo.test sudo[72719]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:35 np0005486759.ooo.test sudo[72736]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1ssewcdr/privsep.sock
Oct 14 08:29:35 np0005486759.ooo.test sudo[72736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:36 np0005486759.ooo.test sudo[72736]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:36 np0005486759.ooo.test sudo[72747]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps_xl1j7v/privsep.sock
Oct 14 08:29:36 np0005486759.ooo.test sudo[72747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:37 np0005486759.ooo.test sudo[72747]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:37 np0005486759.ooo.test sudo[72758]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ldok9y0/privsep.sock
Oct 14 08:29:37 np0005486759.ooo.test sudo[72758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:38 np0005486759.ooo.test sudo[72758]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:38 np0005486759.ooo.test sudo[72769]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2kq2_pn6/privsep.sock
Oct 14 08:29:38 np0005486759.ooo.test sudo[72769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:39 np0005486759.ooo.test sudo[72769]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:39 np0005486759.ooo.test sudo[72780]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpki9xktxp/privsep.sock
Oct 14 08:29:39 np0005486759.ooo.test sudo[72780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:40 np0005486759.ooo.test sudo[72780]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:40 np0005486759.ooo.test sudo[72796]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnpm_6p5k/privsep.sock
Oct 14 08:29:40 np0005486759.ooo.test sudo[72796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:40 np0005486759.ooo.test sudo[72796]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:41 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:29:41 np0005486759.ooo.test recover_tripleo_nova_virtqemud[72804]: 47951
Oct 14 08:29:41 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:29:41 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:29:41 np0005486759.ooo.test sudo[72810]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5sxxj6uo/privsep.sock
Oct 14 08:29:41 np0005486759.ooo.test sudo[72810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:41 np0005486759.ooo.test sudo[72810]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:42 np0005486759.ooo.test sudo[72821]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbj9wy5hp/privsep.sock
Oct 14 08:29:42 np0005486759.ooo.test sudo[72821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:42 np0005486759.ooo.test sudo[72821]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:43 np0005486759.ooo.test sudo[72832]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm1keqwa5/privsep.sock
Oct 14 08:29:43 np0005486759.ooo.test sudo[72832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:43 np0005486759.ooo.test sudo[72832]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:44 np0005486759.ooo.test sudo[72843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg7z1khur/privsep.sock
Oct 14 08:29:44 np0005486759.ooo.test sudo[72843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:44 np0005486759.ooo.test sudo[72843]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:44 np0005486759.ooo.test sudo[72854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnano9hkc/privsep.sock
Oct 14 08:29:44 np0005486759.ooo.test sudo[72854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:45 np0005486759.ooo.test sudo[72854]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:45 np0005486759.ooo.test sudo[72867]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzguxdmvo/privsep.sock
Oct 14 08:29:45 np0005486759.ooo.test sudo[72867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:46 np0005486759.ooo.test sudo[72867]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:46 np0005486759.ooo.test sudo[72882]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9i_cluel/privsep.sock
Oct 14 08:29:46 np0005486759.ooo.test sudo[72882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:47 np0005486759.ooo.test sudo[72882]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:47 np0005486759.ooo.test sudo[72893]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6ds4g6zq/privsep.sock
Oct 14 08:29:47 np0005486759.ooo.test sudo[72893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:47 np0005486759.ooo.test sudo[72893]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:48 np0005486759.ooo.test sudo[72904]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc5z6hswa/privsep.sock
Oct 14 08:29:48 np0005486759.ooo.test sudo[72904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:29:48 np0005486759.ooo.test podman[72908]: 2025-10-14 08:29:48.446823275 +0000 UTC m=+0.071054587 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, 
distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=)
Oct 14 08:29:48 np0005486759.ooo.test podman[72906]: 2025-10-14 08:29:48.510845261 +0000 UTC m=+0.137132166 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack 
osp-17.1, vcs-type=git, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team)
Oct 14 08:29:48 np0005486759.ooo.test podman[72907]: 2025-10-14 08:29:48.568685654 +0000 UTC m=+0.193328198 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.expose-services=, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, vcs-type=git, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:29:48 np0005486759.ooo.test podman[72907]: 2025-10-14 08:29:48.58010944 +0000 UTC m=+0.204751944 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1, architecture=x86_64, 
build-date=2025-07-21T14:45:33, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:29:48 np0005486759.ooo.test podman[72907]: unhealthy
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:29:48 np0005486759.ooo.test podman[72906]: 2025-10-14 08:29:48.594652454 +0000 UTC m=+0.220939369 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, 
release=1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., container_name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.9)
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:29:48 np0005486759.ooo.test podman[72908]: 2025-10-14 08:29:48.635791976 +0000 UTC m=+0.260023328 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.9, distribution-scope=public, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:29:48 np0005486759.ooo.test podman[72908]: unhealthy
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:29:48 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:29:48 np0005486759.ooo.test sudo[72904]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:49 np0005486759.ooo.test sudo[72977]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp910ht6f4/privsep.sock
Oct 14 08:29:49 np0005486759.ooo.test sudo[72977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:49 np0005486759.ooo.test sudo[72977]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:49 np0005486759.ooo.test sudo[72988]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp58fjtyei/privsep.sock
Oct 14 08:29:49 np0005486759.ooo.test sudo[72988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:50 np0005486759.ooo.test sudo[72988]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:50 np0005486759.ooo.test sudo[72999]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpthlfb6vo/privsep.sock
Oct 14 08:29:50 np0005486759.ooo.test sudo[72999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:51 np0005486759.ooo.test sudo[72999]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:51 np0005486759.ooo.test sudo[73016]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxcwy9e32/privsep.sock
Oct 14 08:29:51 np0005486759.ooo.test sudo[73016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:52 np0005486759.ooo.test sudo[73016]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:52 np0005486759.ooo.test sudo[73027]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfbhx97yu/privsep.sock
Oct 14 08:29:52 np0005486759.ooo.test sudo[73027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:53 np0005486759.ooo.test sudo[73027]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:53 np0005486759.ooo.test sudo[73038]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwiue1_g_/privsep.sock
Oct 14 08:29:53 np0005486759.ooo.test sudo[73038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:54 np0005486759.ooo.test sudo[73038]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:29:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:29:54 np0005486759.ooo.test systemd[1]: tmp-crun.aQm2pN.mount: Deactivated successfully.
Oct 14 08:29:54 np0005486759.ooo.test podman[73045]: 2025-10-14 08:29:54.220642485 +0000 UTC m=+0.144247019 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git)
Oct 14 08:29:54 np0005486759.ooo.test podman[73045]: 2025-10-14 08:29:54.24036367 +0000 UTC m=+0.163968194 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, config_id=tripleo_step4)
Oct 14 08:29:54 np0005486759.ooo.test podman[73044]: 2025-10-14 08:29:54.198111953 +0000 UTC m=+0.125358280 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Oct 14 08:29:54 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:29:54 np0005486759.ooo.test podman[73044]: 2025-10-14 08:29:54.281284215 +0000 UTC m=+0.208530512 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:29:54 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:29:54 np0005486759.ooo.test sudo[73097]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpus21jifk/privsep.sock
Oct 14 08:29:54 np0005486759.ooo.test sudo[73097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:54 np0005486759.ooo.test sudo[73097]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:55 np0005486759.ooo.test sudo[73108]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpitcroers/privsep.sock
Oct 14 08:29:55 np0005486759.ooo.test sudo[73108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:55 np0005486759.ooo.test sudo[73108]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:29:55 np0005486759.ooo.test systemd[1]: tmp-crun.o6gaCw.mount: Deactivated successfully.
Oct 14 08:29:55 np0005486759.ooo.test podman[73112]: 2025-10-14 08:29:55.952821879 +0000 UTC m=+0.068514437 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, 
build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, release=1, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12)
Oct 14 08:29:56 np0005486759.ooo.test sudo[73140]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd84zrqkt/privsep.sock
Oct 14 08:29:56 np0005486759.ooo.test sudo[73140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:56 np0005486759.ooo.test podman[73112]: 2025-10-14 08:29:56.386321764 +0000 UTC m=+0.502014352 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step4, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-07-21T14:48:37, version=17.1.9, com.redhat.component=openstack-nova-compute-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public, 
io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Oct 14 08:29:56 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:29:56 np0005486759.ooo.test sudo[73140]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:56 np0005486759.ooo.test sudo[73158]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxmlxafsr/privsep.sock
Oct 14 08:29:56 np0005486759.ooo.test sudo[73158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:57 np0005486759.ooo.test sudo[73158]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:57 np0005486759.ooo.test sudo[73169]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkae96duu/privsep.sock
Oct 14 08:29:57 np0005486759.ooo.test sudo[73169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:58 np0005486759.ooo.test sudo[73169]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:58 np0005486759.ooo.test sudo[73180]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeetau7jr/privsep.sock
Oct 14 08:29:58 np0005486759.ooo.test sudo[73180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:29:59 np0005486759.ooo.test sudo[73180]: pam_unix(sudo:session): session closed for user root
Oct 14 08:29:59 np0005486759.ooo.test sudo[73192]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeqp21fk4/privsep.sock
Oct 14 08:29:59 np0005486759.ooo.test sudo[73192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:00 np0005486759.ooo.test sudo[73192]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:00 np0005486759.ooo.test sudo[73203]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsiwc2kwo/privsep.sock
Oct 14 08:30:00 np0005486759.ooo.test sudo[73203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:01 np0005486759.ooo.test sudo[73203]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:01 np0005486759.ooo.test sudo[73214]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2xwp2ojx/privsep.sock
Oct 14 08:30:01 np0005486759.ooo.test sudo[73214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:02 np0005486759.ooo.test sudo[73214]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:02 np0005486759.ooo.test sudo[73231]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxt_0v182/privsep.sock
Oct 14 08:30:02 np0005486759.ooo.test sudo[73231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:02 np0005486759.ooo.test sudo[73231]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:03 np0005486759.ooo.test sudo[73242]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp61hjv2km/privsep.sock
Oct 14 08:30:03 np0005486759.ooo.test sudo[73242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:03 np0005486759.ooo.test sudo[73242]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:03 np0005486759.ooo.test sudo[73253]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7c2jqs8l/privsep.sock
Oct 14 08:30:03 np0005486759.ooo.test sudo[73253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:30:04 np0005486759.ooo.test podman[73256]: 2025-10-14 08:30:04.434097149 +0000 UTC m=+0.065248616 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO 
Team, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, release=1)
Oct 14 08:30:04 np0005486759.ooo.test sudo[73253]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:04 np0005486759.ooo.test podman[73256]: 2025-10-14 08:30:04.597023888 +0000 UTC m=+0.228175375 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, version=17.1.9, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Oct 14 08:30:04 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:30:04 np0005486759.ooo.test sudo[73292]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz5_fl04t/privsep.sock
Oct 14 08:30:04 np0005486759.ooo.test sudo[73292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:30:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:30:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:30:05 np0005486759.ooo.test sudo[73292]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:05 np0005486759.ooo.test systemd[1]: tmp-crun.Db08F9.mount: Deactivated successfully.
Oct 14 08:30:05 np0005486759.ooo.test podman[73298]: 2025-10-14 08:30:05.465407382 +0000 UTC m=+0.089840242 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, build-date=2025-07-21T13:04:03)
Oct 14 08:30:05 np0005486759.ooo.test podman[73298]: 2025-10-14 08:30:05.500472425 +0000 UTC m=+0.124905365 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, 
distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, release=2, tcib_managed=true, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b)
Oct 14 08:30:05 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:30:05 np0005486759.ooo.test podman[73296]: 2025-10-14 08:30:05.501999203 +0000 UTC m=+0.127907889 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-type=git, build-date=2025-07-21T13:27:15, version=17.1.9, io.openshift.expose-services=, release=1)
Oct 14 08:30:05 np0005486759.ooo.test podman[73297]: 2025-10-14 08:30:05.568917649 +0000 UTC m=+0.194495055 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, version=17.1.9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:30:05 np0005486759.ooo.test podman[73296]: 2025-10-14 08:30:05.585426984 +0000 UTC m=+0.211335810 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, container_name=iscsid, distribution-scope=public)
Oct 14 08:30:05 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:30:05 np0005486759.ooo.test podman[73297]: 2025-10-14 08:30:05.625329528 +0000 UTC m=+0.250906934 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, io.openshift.expose-services=, container_name=nova_compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 08:30:05 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:30:05 np0005486759.ooo.test sudo[73367]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplgtmzu8l/privsep.sock
Oct 14 08:30:05 np0005486759.ooo.test sudo[73367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:06 np0005486759.ooo.test sudo[73367]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:06 np0005486759.ooo.test sudo[73378]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpewaldkr1/privsep.sock
Oct 14 08:30:06 np0005486759.ooo.test sudo[73378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:07 np0005486759.ooo.test sudo[73378]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:07 np0005486759.ooo.test sudo[73392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbhz_1y2a/privsep.sock
Oct 14 08:30:07 np0005486759.ooo.test sudo[73392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:07 np0005486759.ooo.test sudo[73392]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:08 np0005486759.ooo.test sudo[73406]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9sby4g2e/privsep.sock
Oct 14 08:30:08 np0005486759.ooo.test sudo[73406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:08 np0005486759.ooo.test sudo[73406]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:09 np0005486759.ooo.test sudo[73417]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprtfv776t/privsep.sock
Oct 14 08:30:09 np0005486759.ooo.test sudo[73417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:09 np0005486759.ooo.test sudo[73417]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:10 np0005486759.ooo.test sudo[73428]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkohze6n8/privsep.sock
Oct 14 08:30:10 np0005486759.ooo.test sudo[73428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:10 np0005486759.ooo.test sudo[73428]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:10 np0005486759.ooo.test sudo[73439]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuhkovk5y/privsep.sock
Oct 14 08:30:10 np0005486759.ooo.test sudo[73439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:11 np0005486759.ooo.test sudo[73439]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:11 np0005486759.ooo.test sudo[73450]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3otonnwr/privsep.sock
Oct 14 08:30:11 np0005486759.ooo.test sudo[73450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:12 np0005486759.ooo.test sudo[73450]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:12 np0005486759.ooo.test sudo[73463]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptu86nkqf/privsep.sock
Oct 14 08:30:12 np0005486759.ooo.test sudo[73463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:13 np0005486759.ooo.test sudo[73463]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:13 np0005486759.ooo.test sudo[73478]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9s_s9e8e/privsep.sock
Oct 14 08:30:13 np0005486759.ooo.test sudo[73478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:14 np0005486759.ooo.test sudo[73478]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:14 np0005486759.ooo.test sudo[73489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt_fco8sx/privsep.sock
Oct 14 08:30:14 np0005486759.ooo.test sudo[73489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:15 np0005486759.ooo.test sudo[73489]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:15 np0005486759.ooo.test sudo[73500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpebr931kl/privsep.sock
Oct 14 08:30:15 np0005486759.ooo.test sudo[73500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:16 np0005486759.ooo.test sudo[73500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:16 np0005486759.ooo.test sudo[73511]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm6ff0co1/privsep.sock
Oct 14 08:30:16 np0005486759.ooo.test sudo[73511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:16 np0005486759.ooo.test sudo[73511]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:17 np0005486759.ooo.test sudo[73522]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnouvqwab/privsep.sock
Oct 14 08:30:17 np0005486759.ooo.test sudo[73522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:17 np0005486759.ooo.test sudo[73522]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:18 np0005486759.ooo.test sudo[73535]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2jqmvv1v/privsep.sock
Oct 14 08:30:18 np0005486759.ooo.test sudo[73535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:18 np0005486759.ooo.test sudo[73535]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: tmp-crun.9WgTp5.mount: Deactivated successfully.
Oct 14 08:30:18 np0005486759.ooo.test podman[73550]: 2025-10-14 08:30:18.850655522 +0000 UTC m=+0.078884330 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, release=1, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team)
Oct 14 08:30:18 np0005486759.ooo.test podman[73546]: 2025-10-14 08:30:18.82300316 +0000 UTC m=+0.056919416 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, vcs-type=git, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, version=17.1.9, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:30:18 np0005486759.ooo.test podman[73545]: 2025-10-14 08:30:18.885706144 +0000 UTC m=+0.122995966 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, 
summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20250721.1, version=17.1.9, config_id=tripleo_step4)
Oct 14 08:30:18 np0005486759.ooo.test podman[73546]: 2025-10-14 08:30:18.909360962 +0000 UTC m=+0.143277188 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9)
Oct 14 08:30:18 np0005486759.ooo.test podman[73546]: unhealthy
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:30:18 np0005486759.ooo.test podman[73550]: 2025-10-14 08:30:18.938925503 +0000 UTC m=+0.167154321 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, release=1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 08:30:18 np0005486759.ooo.test podman[73550]: unhealthy
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:30:18 np0005486759.ooo.test podman[73545]: 2025-10-14 08:30:18.974203904 +0000 UTC m=+0.211493766 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, release=1, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, maintainer=OpenStack TripleO Team)
Oct 14 08:30:18 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:30:19 np0005486759.ooo.test sudo[73609]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpif4k52mm/privsep.sock
Oct 14 08:30:19 np0005486759.ooo.test sudo[73609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:19 np0005486759.ooo.test sudo[73609]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:19 np0005486759.ooo.test sudo[73620]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpktc7wqut/privsep.sock
Oct 14 08:30:19 np0005486759.ooo.test sudo[73620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:20 np0005486759.ooo.test sudo[73620]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:20 np0005486759.ooo.test sudo[73631]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxy8xwppf/privsep.sock
Oct 14 08:30:20 np0005486759.ooo.test sudo[73631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:21 np0005486759.ooo.test sudo[73631]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:21 np0005486759.ooo.test sudo[73642]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpetauladb/privsep.sock
Oct 14 08:30:21 np0005486759.ooo.test sudo[73642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:22 np0005486759.ooo.test sudo[73642]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:22 np0005486759.ooo.test sudo[73653]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2zhezji9/privsep.sock
Oct 14 08:30:22 np0005486759.ooo.test sudo[73653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:23 np0005486759.ooo.test sudo[73653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:23 np0005486759.ooo.test sudo[73669]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4tzmk2ay/privsep.sock
Oct 14 08:30:23 np0005486759.ooo.test sudo[73669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:24 np0005486759.ooo.test sudo[73669]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:30:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:30:24 np0005486759.ooo.test podman[73675]: 2025-10-14 08:30:24.333253742 +0000 UTC m=+0.060604850 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, release=1)
Oct 14 08:30:24 np0005486759.ooo.test podman[73694]: 2025-10-14 08:30:24.411194502 +0000 UTC m=+0.070355854 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-07-21T16:28:53, distribution-scope=public, container_name=ovn_metadata_agent, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:30:24 np0005486759.ooo.test podman[73675]: 2025-10-14 08:30:24.431800355 +0000 UTC m=+0.159151423 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.9, release=1, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:30:24 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:30:24 np0005486759.ooo.test podman[73694]: 2025-10-14 08:30:24.469612304 +0000 UTC m=+0.128773586 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20250721.1, version=17.1.9, release=1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Oct 14 08:30:24 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:30:24 np0005486759.ooo.test sudo[73729]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptlus848f/privsep.sock
Oct 14 08:30:24 np0005486759.ooo.test sudo[73729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:25 np0005486759.ooo.test sudo[73729]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:25 np0005486759.ooo.test sudo[73740]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjdnzd68u/privsep.sock
Oct 14 08:30:25 np0005486759.ooo.test sudo[73740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:25 np0005486759.ooo.test sudo[73740]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:26 np0005486759.ooo.test sudo[73751]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjvlhdbkn/privsep.sock
Oct 14 08:30:26 np0005486759.ooo.test sudo[73751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:26 np0005486759.ooo.test sudo[73751]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:30:26 np0005486759.ooo.test podman[73756]: 2025-10-14 08:30:26.975870421 +0000 UTC m=+0.070328864 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, container_name=nova_migration_target, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:30:27 np0005486759.ooo.test sudo[73782]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm_g3iooz/privsep.sock
Oct 14 08:30:27 np0005486759.ooo.test sudo[73782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:27 np0005486759.ooo.test podman[73756]: 2025-10-14 08:30:27.406527767 +0000 UTC m=+0.500986210 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Oct 14 08:30:27 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:30:27 np0005486759.ooo.test sudo[73782]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:28 np0005486759.ooo.test sudo[73794]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0ba2h57_/privsep.sock
Oct 14 08:30:28 np0005486759.ooo.test sudo[73794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:28 np0005486759.ooo.test sudo[73794]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:29 np0005486759.ooo.test sudo[73810]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6v_qkpfa/privsep.sock
Oct 14 08:30:29 np0005486759.ooo.test sudo[73810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:29 np0005486759.ooo.test sudo[73810]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:30 np0005486759.ooo.test sudo[73822]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmapeh6ej/privsep.sock
Oct 14 08:30:30 np0005486759.ooo.test sudo[73822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:30 np0005486759.ooo.test sudo[73822]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:30 np0005486759.ooo.test sudo[73833]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaxq4s843/privsep.sock
Oct 14 08:30:30 np0005486759.ooo.test sudo[73833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:31 np0005486759.ooo.test sudo[73833]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:31 np0005486759.ooo.test sudo[73844]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9iz96emd/privsep.sock
Oct 14 08:30:31 np0005486759.ooo.test sudo[73844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:32 np0005486759.ooo.test sudo[73844]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:32 np0005486759.ooo.test sudo[73855]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp5cl4iii/privsep.sock
Oct 14 08:30:32 np0005486759.ooo.test sudo[73855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:33 np0005486759.ooo.test sudo[73855]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:33 np0005486759.ooo.test sudo[73866]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcawt_syp/privsep.sock
Oct 14 08:30:33 np0005486759.ooo.test sudo[73866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:34 np0005486759.ooo.test sudo[73866]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:34 np0005486759.ooo.test sudo[73883]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaov4ktpb/privsep.sock
Oct 14 08:30:34 np0005486759.ooo.test sudo[73883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:35 np0005486759.ooo.test sudo[73883]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:30:35 np0005486759.ooo.test systemd[1]: tmp-crun.UcQaAr.mount: Deactivated successfully.
Oct 14 08:30:35 np0005486759.ooo.test podman[73887]: 2025-10-14 08:30:35.088078804 +0000 UTC m=+0.050980570 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, release=1, tcib_managed=true, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1)
Oct 14 08:30:35 np0005486759.ooo.test podman[73887]: 2025-10-14 08:30:35.297859145 +0000 UTC m=+0.260760951 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, architecture=x86_64, container_name=metrics_qdr, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team)
Oct 14 08:30:35 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:30:35 np0005486759.ooo.test sudo[73923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpszcjszgu/privsep.sock
Oct 14 08:30:35 np0005486759.ooo.test sudo[73923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:35 np0005486759.ooo.test sudo[73923]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:30:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:30:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:30:35 np0005486759.ooo.test systemd[1]: tmp-crun.cVYskw.mount: Deactivated successfully.
Oct 14 08:30:35 np0005486759.ooo.test podman[73929]: 2025-10-14 08:30:35.953222647 +0000 UTC m=+0.061206659 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1)
Oct 14 08:30:36 np0005486759.ooo.test podman[73936]: 2025-10-14 08:30:36.014399115 +0000 UTC m=+0.114734359 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_id=tripleo_step3, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.component=openstack-collectd-container, vcs-type=git, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:30:36 np0005486759.ooo.test podman[73936]: 2025-10-14 08:30:36.022121805 +0000 UTC m=+0.122456999 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct 14 08:30:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:30:36 np0005486759.ooo.test podman[73930]: 2025-10-14 08:30:35.986186105 +0000 UTC m=+0.086499288 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Oct 14 08:30:36 np0005486759.ooo.test podman[73930]: 2025-10-14 08:30:36.069273465 +0000 UTC m=+0.169586638 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=nova_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20250721.1, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, version=17.1.9, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible)
Oct 14 08:30:36 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:30:36 np0005486759.ooo.test podman[73929]: 2025-10-14 08:30:36.089231367 +0000 UTC m=+0.197215369 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, tcib_managed=true, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:30:36 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:30:36 np0005486759.ooo.test sudo[73996]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi9z53kf9/privsep.sock
Oct 14 08:30:36 np0005486759.ooo.test sudo[73996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:36 np0005486759.ooo.test sudo[73996]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:36 np0005486759.ooo.test sudo[74007]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbyn169ww/privsep.sock
Oct 14 08:30:36 np0005486759.ooo.test sudo[74007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:37 np0005486759.ooo.test sudo[74007]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:37 np0005486759.ooo.test sudo[74018]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps4wegd1m/privsep.sock
Oct 14 08:30:37 np0005486759.ooo.test sudo[74018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:38 np0005486759.ooo.test sudo[74018]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:38 np0005486759.ooo.test sudo[74029]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt6qtr_jh/privsep.sock
Oct 14 08:30:38 np0005486759.ooo.test sudo[74029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:39 np0005486759.ooo.test sudo[74029]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:39 np0005486759.ooo.test sudo[74042]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnfvvqrqn/privsep.sock
Oct 14 08:30:39 np0005486759.ooo.test sudo[74042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:40 np0005486759.ooo.test sudo[74042]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:40 np0005486759.ooo.test sudo[74057]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi6xb21fo/privsep.sock
Oct 14 08:30:40 np0005486759.ooo.test sudo[74057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:41 np0005486759.ooo.test sudo[74057]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:41 np0005486759.ooo.test sudo[74068]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw87ckzoa/privsep.sock
Oct 14 08:30:41 np0005486759.ooo.test sudo[74068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:41 np0005486759.ooo.test sudo[74068]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:42 np0005486759.ooo.test sudo[74079]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2100e5mi/privsep.sock
Oct 14 08:30:42 np0005486759.ooo.test sudo[74079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:42 np0005486759.ooo.test sudo[74079]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:43 np0005486759.ooo.test sudo[74090]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2cbkf5q9/privsep.sock
Oct 14 08:30:43 np0005486759.ooo.test sudo[74090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:43 np0005486759.ooo.test sudo[74090]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:43 np0005486759.ooo.test sudo[74101]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkangbeu2/privsep.sock
Oct 14 08:30:43 np0005486759.ooo.test sudo[74101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:44 np0005486759.ooo.test sudo[74101]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:44 np0005486759.ooo.test sudo[74112]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2byscs3t/privsep.sock
Oct 14 08:30:44 np0005486759.ooo.test sudo[74112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:45 np0005486759.ooo.test sudo[74112]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:45 np0005486759.ooo.test sudo[74129]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp79mpxta8/privsep.sock
Oct 14 08:30:45 np0005486759.ooo.test sudo[74129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:46 np0005486759.ooo.test sudo[74129]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:46 np0005486759.ooo.test sudo[74140]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaq0vkv45/privsep.sock
Oct 14 08:30:46 np0005486759.ooo.test sudo[74140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:47 np0005486759.ooo.test sudo[74140]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:47 np0005486759.ooo.test sudo[74151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpje4insa9/privsep.sock
Oct 14 08:30:47 np0005486759.ooo.test sudo[74151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:48 np0005486759.ooo.test sudo[74151]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:48 np0005486759.ooo.test sudo[74162]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkwe78a6b/privsep.sock
Oct 14 08:30:48 np0005486759.ooo.test sudo[74162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:48 np0005486759.ooo.test sudo[74162]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: tmp-crun.vwdv10.mount: Deactivated successfully.
Oct 14 08:30:49 np0005486759.ooo.test podman[74169]: 2025-10-14 08:30:49.118381225 +0000 UTC m=+0.079930852 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Oct 14 08:30:49 np0005486759.ooo.test podman[74169]: 2025-10-14 08:30:49.130249886 +0000 UTC m=+0.091799493 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, architecture=x86_64, release=1, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33)
Oct 14 08:30:49 np0005486759.ooo.test podman[74169]: unhealthy
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:30:49 np0005486759.ooo.test podman[74170]: 2025-10-14 08:30:49.100490818 +0000 UTC m=+0.066397332 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Oct 14 08:30:49 np0005486759.ooo.test podman[74168]: 2025-10-14 08:30:49.154995726 +0000 UTC m=+0.125062559 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:30:49 np0005486759.ooo.test podman[74168]: 2025-10-14 08:30:49.163516852 +0000 UTC m=+0.133583685 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, distribution-scope=public, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron)
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:30:49 np0005486759.ooo.test podman[74170]: 2025-10-14 08:30:49.182320348 +0000 UTC m=+0.148226852 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., distribution-scope=public, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
config_id=tripleo_step4, architecture=x86_64, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20250721.1, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:30:49 np0005486759.ooo.test podman[74170]: unhealthy
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:30:49 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:30:49 np0005486759.ooo.test sudo[74229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpus1wqlh4/privsep.sock
Oct 14 08:30:49 np0005486759.ooo.test sudo[74229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:49 np0005486759.ooo.test sudo[74229]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:50 np0005486759.ooo.test systemd[1]: tmp-crun.S1XFpQ.mount: Deactivated successfully.
Oct 14 08:30:50 np0005486759.ooo.test sudo[74240]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp037t14op/privsep.sock
Oct 14 08:30:50 np0005486759.ooo.test sudo[74240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:50 np0005486759.ooo.test sudo[74240]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:51 np0005486759.ooo.test sudo[74257]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo2smmj9l/privsep.sock
Oct 14 08:30:51 np0005486759.ooo.test sudo[74257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:51 np0005486759.ooo.test sudo[74257]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:51 np0005486759.ooo.test sudo[74268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2h3i8nub/privsep.sock
Oct 14 08:30:51 np0005486759.ooo.test sudo[74268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:52 np0005486759.ooo.test sudo[74268]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:52 np0005486759.ooo.test sudo[74279]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2ssusjv5/privsep.sock
Oct 14 08:30:52 np0005486759.ooo.test sudo[74279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:53 np0005486759.ooo.test sudo[74279]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:53 np0005486759.ooo.test sudo[74290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4b2fv_14/privsep.sock
Oct 14 08:30:53 np0005486759.ooo.test sudo[74290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:54 np0005486759.ooo.test sudo[74290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:54 np0005486759.ooo.test sudo[74301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7uo9704r/privsep.sock
Oct 14 08:30:54 np0005486759.ooo.test sudo[74301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:30:54 np0005486759.ooo.test podman[74303]: 2025-10-14 08:30:54.531952824 +0000 UTC m=+0.053827239 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:30:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:30:54 np0005486759.ooo.test podman[74321]: 2025-10-14 08:30:54.610870564 +0000 UTC m=+0.056769771 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, distribution-scope=public, release=1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:30:54 np0005486759.ooo.test podman[74321]: 2025-10-14 08:30:54.635349237 +0000 UTC m=+0.081248444 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 08:30:54 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:30:54 np0005486759.ooo.test podman[74303]: 2025-10-14 08:30:54.689181385 +0000 UTC m=+0.211055810 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, vendor=Red Hat, Inc.)
Oct 14 08:30:54 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:30:55 np0005486759.ooo.test sudo[74301]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:55 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:30:55 np0005486759.ooo.test recover_tripleo_nova_virtqemud[74356]: 47951
Oct 14 08:30:55 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:30:55 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:30:55 np0005486759.ooo.test sudo[74362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppoy5ozi7/privsep.sock
Oct 14 08:30:55 np0005486759.ooo.test sudo[74362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:55 np0005486759.ooo.test sudo[74362]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:56 np0005486759.ooo.test sudo[74379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpld5zhq4g/privsep.sock
Oct 14 08:30:56 np0005486759.ooo.test sudo[74379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:56 np0005486759.ooo.test sudo[74379]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:57 np0005486759.ooo.test sudo[74390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6fwckuyp/privsep.sock
Oct 14 08:30:57 np0005486759.ooo.test sudo[74390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:57 np0005486759.ooo.test sudo[74390]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:30:57 np0005486759.ooo.test systemd[1]: tmp-crun.a0E8Ev.mount: Deactivated successfully.
Oct 14 08:30:57 np0005486759.ooo.test podman[74395]: 2025-10-14 08:30:57.754717219 +0000 UTC m=+0.093459804 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:30:57 np0005486759.ooo.test sudo[74422]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyfs2e14b/privsep.sock
Oct 14 08:30:57 np0005486759.ooo.test sudo[74422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:58 np0005486759.ooo.test podman[74395]: 2025-10-14 08:30:58.145322258 +0000 UTC m=+0.484064793 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:30:58 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:30:58 np0005486759.ooo.test sudo[74422]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:58 np0005486759.ooo.test sudo[74434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdv6lma_q/privsep.sock
Oct 14 08:30:58 np0005486759.ooo.test sudo[74434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:30:59 np0005486759.ooo.test sudo[74434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:30:59 np0005486759.ooo.test sudo[74445]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7mwopiou/privsep.sock
Oct 14 08:30:59 np0005486759.ooo.test sudo[74445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:00 np0005486759.ooo.test sudo[74445]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:00 np0005486759.ooo.test sudo[74456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7gp5g4rw/privsep.sock
Oct 14 08:31:00 np0005486759.ooo.test sudo[74456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:01 np0005486759.ooo.test sudo[74456]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:01 np0005486759.ooo.test sudo[74470]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsf0xf5ff/privsep.sock
Oct 14 08:31:01 np0005486759.ooo.test sudo[74470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:02 np0005486759.ooo.test sudo[74470]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:02 np0005486759.ooo.test sudo[74484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe_5e7lom/privsep.sock
Oct 14 08:31:02 np0005486759.ooo.test sudo[74484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:02 np0005486759.ooo.test sudo[74484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:03 np0005486759.ooo.test sudo[74495]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc8b1_qfj/privsep.sock
Oct 14 08:31:03 np0005486759.ooo.test sudo[74495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:03 np0005486759.ooo.test sudo[74495]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:04 np0005486759.ooo.test sudo[74506]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgtrpd9m5/privsep.sock
Oct 14 08:31:04 np0005486759.ooo.test sudo[74506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:04 np0005486759.ooo.test sudo[74506]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:04 np0005486759.ooo.test sudo[74517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_710770l/privsep.sock
Oct 14 08:31:04 np0005486759.ooo.test sudo[74517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:31:05 np0005486759.ooo.test podman[74520]: 2025-10-14 08:31:05.457046974 +0000 UTC m=+0.082242275 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:31:05 np0005486759.ooo.test sudo[74517]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:05 np0005486759.ooo.test podman[74520]: 2025-10-14 08:31:05.652326672 +0000 UTC m=+0.277522053 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-type=git, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, 
batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public)
Oct 14 08:31:05 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:31:05 np0005486759.ooo.test sudo[74557]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqqaaxa13/privsep.sock
Oct 14 08:31:05 np0005486759.ooo.test sudo[74557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:31:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:31:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:31:06 np0005486759.ooo.test sudo[74557]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:06 np0005486759.ooo.test podman[74562]: 2025-10-14 08:31:06.43583152 +0000 UTC m=+0.067505026 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, container_name=nova_compute, tcib_managed=true, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:31:06 np0005486759.ooo.test podman[74561]: 2025-10-14 08:31:06.444623404 +0000 UTC m=+0.076627940 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 
17.1 iscsid, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:31:06 np0005486759.ooo.test podman[74561]: 2025-10-14 08:31:06.44965227 +0000 UTC m=+0.081656816 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1, tcib_managed=true, io.buildah.version=1.33.12, vcs-type=git)
Oct 14 08:31:06 np0005486759.ooo.test podman[74562]: 2025-10-14 08:31:06.460190529 +0000 UTC m=+0.091864045 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute)
Oct 14 08:31:06 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:31:06 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:31:06 np0005486759.ooo.test podman[74563]: 2025-10-14 08:31:06.534614079 +0000 UTC m=+0.163419976 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, distribution-scope=public)
Oct 14 08:31:06 np0005486759.ooo.test podman[74563]: 2025-10-14 08:31:06.568455784 +0000 UTC m=+0.197261691 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, version=17.1.9, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, 
com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, vcs-type=git, container_name=collectd, release=2, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, build-date=2025-07-21T13:04:03)
Oct 14 08:31:06 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:31:06 np0005486759.ooo.test sudo[74630]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsyrg36ov/privsep.sock
Oct 14 08:31:06 np0005486759.ooo.test sudo[74630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:07 np0005486759.ooo.test sudo[74630]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:07 np0005486759.ooo.test sudo[74647]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvsoxch6h/privsep.sock
Oct 14 08:31:07 np0005486759.ooo.test sudo[74647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:08 np0005486759.ooo.test sudo[74647]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:08 np0005486759.ooo.test sudo[74658]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp96n9di7n/privsep.sock
Oct 14 08:31:08 np0005486759.ooo.test sudo[74658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:08 np0005486759.ooo.test sudo[74658]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:09 np0005486759.ooo.test sudo[74669]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_shx179w/privsep.sock
Oct 14 08:31:09 np0005486759.ooo.test sudo[74669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:09 np0005486759.ooo.test sudo[74669]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:09 np0005486759.ooo.test sudo[74680]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0ghk7nub/privsep.sock
Oct 14 08:31:09 np0005486759.ooo.test sudo[74680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:10 np0005486759.ooo.test sudo[74680]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:10 np0005486759.ooo.test sudo[74691]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnid6bs7h/privsep.sock
Oct 14 08:31:10 np0005486759.ooo.test sudo[74691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:11 np0005486759.ooo.test sudo[74691]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:11 np0005486759.ooo.test sudo[74702]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpke1qa3vh/privsep.sock
Oct 14 08:31:11 np0005486759.ooo.test sudo[74702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:12 np0005486759.ooo.test sudo[74702]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:12 np0005486759.ooo.test sudo[74719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfb0rbbes/privsep.sock
Oct 14 08:31:12 np0005486759.ooo.test sudo[74719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:13 np0005486759.ooo.test sudo[74719]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:13 np0005486759.ooo.test sudo[74730]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzedcf69k/privsep.sock
Oct 14 08:31:13 np0005486759.ooo.test sudo[74730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:14 np0005486759.ooo.test sudo[74730]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:14 np0005486759.ooo.test sudo[74741]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl1hy51en/privsep.sock
Oct 14 08:31:14 np0005486759.ooo.test sudo[74741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:15 np0005486759.ooo.test sudo[74741]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:15 np0005486759.ooo.test sudo[74752]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqimuks7_/privsep.sock
Oct 14 08:31:15 np0005486759.ooo.test sudo[74752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:15 np0005486759.ooo.test sudo[74752]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:16 np0005486759.ooo.test sudo[74763]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9_83z6k7/privsep.sock
Oct 14 08:31:16 np0005486759.ooo.test sudo[74763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:16 np0005486759.ooo.test sudo[74763]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:17 np0005486759.ooo.test sudo[74774]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgwul0gve/privsep.sock
Oct 14 08:31:17 np0005486759.ooo.test sudo[74774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:17 np0005486759.ooo.test sudo[74774]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:18 np0005486759.ooo.test sudo[74791]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7azbkl_9/privsep.sock
Oct 14 08:31:18 np0005486759.ooo.test sudo[74791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:18 np0005486759.ooo.test sudo[74791]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:18 np0005486759.ooo.test sudo[74802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5pfeb1s7/privsep.sock
Oct 14 08:31:18 np0005486759.ooo.test sudo[74802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:31:19 np0005486759.ooo.test podman[74805]: 2025-10-14 08:31:19.453378287 +0000 UTC m=+0.079155059 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, release=1, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, container_name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Oct 14 08:31:19 np0005486759.ooo.test podman[74805]: 2025-10-14 08:31:19.466311499 +0000 UTC m=+0.092088201 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, tcib_managed=true, io.buildah.version=1.33.12, vendor=Red Hat, Inc.)
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: tmp-crun.rizzSw.mount: Deactivated successfully.
Oct 14 08:31:19 np0005486759.ooo.test podman[74806]: 2025-10-14 08:31:19.508483385 +0000 UTC m=+0.129799868 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12)
Oct 14 08:31:19 np0005486759.ooo.test podman[74806]: 2025-10-14 08:31:19.520322994 +0000 UTC m=+0.141639487 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4)
Oct 14 08:31:19 np0005486759.ooo.test podman[74806]: unhealthy
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: tmp-crun.xxrkeP.mount: Deactivated successfully.
Oct 14 08:31:19 np0005486759.ooo.test podman[74807]: 2025-10-14 08:31:19.562971173 +0000 UTC m=+0.182991306 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47)
Oct 14 08:31:19 np0005486759.ooo.test sudo[74802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:19 np0005486759.ooo.test podman[74807]: 2025-10-14 08:31:19.60008333 +0000 UTC m=+0.220103473 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, version=17.1.9, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Oct 14 08:31:19 np0005486759.ooo.test podman[74807]: unhealthy
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:31:19 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:31:19 np0005486759.ooo.test sudo[74867]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp20swe50w/privsep.sock
Oct 14 08:31:19 np0005486759.ooo.test sudo[74867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:20 np0005486759.ooo.test sudo[74867]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:20 np0005486759.ooo.test sudo[74878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpanmriqgo/privsep.sock
Oct 14 08:31:20 np0005486759.ooo.test sudo[74878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:21 np0005486759.ooo.test sudo[74878]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:21 np0005486759.ooo.test sudo[74889]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1bg1j05l/privsep.sock
Oct 14 08:31:21 np0005486759.ooo.test sudo[74889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:22 np0005486759.ooo.test sudo[74889]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:22 np0005486759.ooo.test sudo[74900]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1x8cqnxn/privsep.sock
Oct 14 08:31:22 np0005486759.ooo.test sudo[74900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:23 np0005486759.ooo.test sudo[74900]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:23 np0005486759.ooo.test sudo[74917]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkx9jyd2f/privsep.sock
Oct 14 08:31:23 np0005486759.ooo.test sudo[74917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:23 np0005486759.ooo.test sudo[74917]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:24 np0005486759.ooo.test sudo[74928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxzeym3we/privsep.sock
Oct 14 08:31:24 np0005486759.ooo.test sudo[74928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:24 np0005486759.ooo.test sudo[74928]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:31:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:31:25 np0005486759.ooo.test podman[74935]: 2025-10-14 08:31:25.005325319 +0000 UTC m=+0.082123042 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:31:25 np0005486759.ooo.test systemd[1]: tmp-crun.AtdSbX.mount: Deactivated successfully.
Oct 14 08:31:25 np0005486759.ooo.test podman[74933]: 2025-10-14 08:31:25.071726128 +0000 UTC m=+0.149009166 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20250721.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, tcib_managed=true, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Oct 14 08:31:25 np0005486759.ooo.test podman[74935]: 2025-10-14 08:31:25.086212741 +0000 UTC m=+0.163010504 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible)
Oct 14 08:31:25 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:31:25 np0005486759.ooo.test podman[74933]: 2025-10-14 08:31:25.108289388 +0000 UTC m=+0.185572416 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, 
container_name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, tcib_managed=true)
Oct 14 08:31:25 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:31:25 np0005486759.ooo.test sudo[74984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa84plqw6/privsep.sock
Oct 14 08:31:25 np0005486759.ooo.test sudo[74984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:25 np0005486759.ooo.test sudo[74984]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:25 np0005486759.ooo.test sudo[74995]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp91orx8xk/privsep.sock
Oct 14 08:31:25 np0005486759.ooo.test sudo[74995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:26 np0005486759.ooo.test sudo[74995]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:26 np0005486759.ooo.test sudo[75006]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1kl2tc89/privsep.sock
Oct 14 08:31:26 np0005486759.ooo.test sudo[75006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:27 np0005486759.ooo.test sudo[75006]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:27 np0005486759.ooo.test sudo[75017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8iefhsgr/privsep.sock
Oct 14 08:31:27 np0005486759.ooo.test sudo[75017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:31:28 np0005486759.ooo.test sudo[75017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:28 np0005486759.ooo.test systemd[1]: tmp-crun.0SZPhI.mount: Deactivated successfully.
Oct 14 08:31:28 np0005486759.ooo.test podman[75023]: 2025-10-14 08:31:28.456206212 +0000 UTC m=+0.088538001 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=nova_migration_target, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37)
Oct 14 08:31:28 np0005486759.ooo.test sudo[75058]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpizy85i44/privsep.sock
Oct 14 08:31:28 np0005486759.ooo.test sudo[75058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:28 np0005486759.ooo.test podman[75023]: 2025-10-14 08:31:28.831221466 +0000 UTC m=+0.463553195 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1)
Oct 14 08:31:28 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:31:29 np0005486759.ooo.test sudo[75058]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:29 np0005486759.ooo.test sudo[75069]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2i2xujdv/privsep.sock
Oct 14 08:31:29 np0005486759.ooo.test sudo[75069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:30 np0005486759.ooo.test sudo[75069]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:30 np0005486759.ooo.test sudo[75080]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8c8jjjex/privsep.sock
Oct 14 08:31:30 np0005486759.ooo.test sudo[75080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:30 np0005486759.ooo.test sudo[75080]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:31 np0005486759.ooo.test sudo[75091]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgg_o923t/privsep.sock
Oct 14 08:31:31 np0005486759.ooo.test sudo[75091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:31 np0005486759.ooo.test sudo[75091]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:32 np0005486759.ooo.test sudo[75102]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ec_eml2/privsep.sock
Oct 14 08:31:32 np0005486759.ooo.test sudo[75102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:32 np0005486759.ooo.test sudo[75102]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:32 np0005486759.ooo.test sudo[75113]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplch8hmze/privsep.sock
Oct 14 08:31:32 np0005486759.ooo.test sudo[75113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:33 np0005486759.ooo.test sudo[75113]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:33 np0005486759.ooo.test sudo[75127]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpepd8fk3e/privsep.sock
Oct 14 08:31:33 np0005486759.ooo.test sudo[75127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:34 np0005486759.ooo.test sudo[75127]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:34 np0005486759.ooo.test sudo[75141]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk92_jzcl/privsep.sock
Oct 14 08:31:34 np0005486759.ooo.test sudo[75141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:35 np0005486759.ooo.test sudo[75141]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:35 np0005486759.ooo.test sudo[75152]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa1iq3yfx/privsep.sock
Oct 14 08:31:35 np0005486759.ooo.test sudo[75152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:36 np0005486759.ooo.test sudo[75152]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:31:36 np0005486759.ooo.test systemd[1]: tmp-crun.K2Ntl8.mount: Deactivated successfully.
Oct 14 08:31:36 np0005486759.ooo.test podman[75157]: 2025-10-14 08:31:36.234002398 +0000 UTC m=+0.078619034 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1)
Oct 14 08:31:36 np0005486759.ooo.test podman[75157]: 2025-10-14 08:31:36.395432064 +0000 UTC m=+0.240048750 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, release=1, vendor=Red Hat, Inc.)
Oct 14 08:31:36 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:31:36 np0005486759.ooo.test sudo[75194]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxnv2zy0j/privsep.sock
Oct 14 08:31:36 np0005486759.ooo.test sudo[75194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:36 np0005486759.ooo.test sudo[75194]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:31:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:31:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:31:37 np0005486759.ooo.test podman[75201]: 2025-10-14 08:31:37.068152501 +0000 UTC m=+0.045471068 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute)
Oct 14 08:31:37 np0005486759.ooo.test podman[75199]: 2025-10-14 08:31:37.130189311 +0000 UTC m=+0.107981103 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, version=17.1.9, name=rhosp17/openstack-iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:31:37 np0005486759.ooo.test podman[75199]: 2025-10-14 08:31:37.139175699 +0000 UTC m=+0.116967161 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public)
Oct 14 08:31:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:31:37 np0005486759.ooo.test podman[75202]: 2025-10-14 08:31:37.178852836 +0000 UTC m=+0.153719698 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, distribution-scope=public, build-date=2025-07-21T13:04:03, container_name=collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b)
Oct 14 08:31:37 np0005486759.ooo.test podman[75202]: 2025-10-14 08:31:37.188359561 +0000 UTC m=+0.163226423 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, distribution-scope=public, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.9, name=rhosp17/openstack-collectd)
Oct 14 08:31:37 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:31:37 np0005486759.ooo.test podman[75201]: 2025-10-14 08:31:37.208452523 +0000 UTC m=+0.185771160 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, tcib_managed=true)
Oct 14 08:31:37 np0005486759.ooo.test systemd[1]: tmp-crun.raEO6Q.mount: Deactivated successfully.
Oct 14 08:31:37 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:31:37 np0005486759.ooo.test sudo[75271]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm4ku8637/privsep.sock
Oct 14 08:31:37 np0005486759.ooo.test sudo[75271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:37 np0005486759.ooo.test sudo[75271]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:38 np0005486759.ooo.test sudo[75282]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4045qi4n/privsep.sock
Oct 14 08:31:38 np0005486759.ooo.test sudo[75282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:38 np0005486759.ooo.test sudo[75282]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:38 np0005486759.ooo.test sudo[75293]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwwhbhow4/privsep.sock
Oct 14 08:31:38 np0005486759.ooo.test sudo[75293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:39 np0005486759.ooo.test sudo[75293]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:39 np0005486759.ooo.test sudo[75310]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxpb6chm4/privsep.sock
Oct 14 08:31:39 np0005486759.ooo.test sudo[75310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:40 np0005486759.ooo.test sudo[75310]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:40 np0005486759.ooo.test sudo[75321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpifyolo4q/privsep.sock
Oct 14 08:31:40 np0005486759.ooo.test sudo[75321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:41 np0005486759.ooo.test sudo[75321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:41 np0005486759.ooo.test sudo[75332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpumxpwlg2/privsep.sock
Oct 14 08:31:41 np0005486759.ooo.test sudo[75332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:42 np0005486759.ooo.test sudo[75332]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:42 np0005486759.ooo.test sudo[75343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1yn4zqj6/privsep.sock
Oct 14 08:31:42 np0005486759.ooo.test sudo[75343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:42 np0005486759.ooo.test sudo[75343]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:43 np0005486759.ooo.test sudo[75354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkgf3qrut/privsep.sock
Oct 14 08:31:43 np0005486759.ooo.test sudo[75354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:43 np0005486759.ooo.test sudo[75354]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:44 np0005486759.ooo.test sudo[75365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpby5jy2mp/privsep.sock
Oct 14 08:31:44 np0005486759.ooo.test sudo[75365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:44 np0005486759.ooo.test sudo[75365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:44 np0005486759.ooo.test sudo[75381]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplcv36g7p/privsep.sock
Oct 14 08:31:44 np0005486759.ooo.test sudo[75381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:45 np0005486759.ooo.test sudo[75381]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:45 np0005486759.ooo.test sudo[75393]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8637dz4d/privsep.sock
Oct 14 08:31:45 np0005486759.ooo.test sudo[75393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:46 np0005486759.ooo.test sudo[75393]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:46 np0005486759.ooo.test sudo[75404]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1jzcw7ji/privsep.sock
Oct 14 08:31:46 np0005486759.ooo.test sudo[75404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:47 np0005486759.ooo.test sudo[75404]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:47 np0005486759.ooo.test sudo[75415]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_ata13k5/privsep.sock
Oct 14 08:31:47 np0005486759.ooo.test sudo[75415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:48 np0005486759.ooo.test sudo[75415]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:48 np0005486759.ooo.test sudo[75426]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx8y0na5m/privsep.sock
Oct 14 08:31:48 np0005486759.ooo.test sudo[75426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:48 np0005486759.ooo.test sudo[75426]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:49 np0005486759.ooo.test sudo[75437]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6zm2_8x9/privsep.sock
Oct 14 08:31:49 np0005486759.ooo.test sudo[75437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:49 np0005486759.ooo.test sudo[75437]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:31:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:31:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:31:49 np0005486759.ooo.test podman[75443]: 2025-10-14 08:31:49.899256654 +0000 UTC m=+0.141210051 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, batch=17.1_20250721.1, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c)
Oct 14 08:31:49 np0005486759.ooo.test systemd[1]: tmp-crun.eON0FI.mount: Deactivated successfully.
Oct 14 08:31:49 np0005486759.ooo.test podman[75444]: 2025-10-14 08:31:49.91723273 +0000 UTC m=+0.155078530 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true)
Oct 14 08:31:49 np0005486759.ooo.test podman[75443]: 2025-10-14 08:31:49.930891282 +0000 UTC m=+0.172844689 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, container_name=logrotate_crond, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 08:31:49 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:31:49 np0005486759.ooo.test podman[75445]: 2025-10-14 08:31:49.946069213 +0000 UTC m=+0.178811355 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, architecture=x86_64, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1)
Oct 14 08:31:49 np0005486759.ooo.test podman[75445]: 2025-10-14 08:31:49.978110524 +0000 UTC m=+0.210852686 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc.)
Oct 14 08:31:49 np0005486759.ooo.test sudo[75504]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7_bfa3b9/privsep.sock
Oct 14 08:31:49 np0005486759.ooo.test sudo[75504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:49 np0005486759.ooo.test podman[75445]: unhealthy
Oct 14 08:31:49 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:31:49 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:31:49 np0005486759.ooo.test podman[75444]: 2025-10-14 08:31:49.996330708 +0000 UTC m=+0.234176498 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, release=1, config_id=tripleo_step4, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team)
Oct 14 08:31:50 np0005486759.ooo.test podman[75444]: unhealthy
Oct 14 08:31:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:31:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:31:50 np0005486759.ooo.test sudo[75504]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:50 np0005486759.ooo.test sudo[75519]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph95r1l5k/privsep.sock
Oct 14 08:31:50 np0005486759.ooo.test sudo[75519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:50 np0005486759.ooo.test systemd[1]: tmp-crun.9d4QO4.mount: Deactivated successfully.
Oct 14 08:31:51 np0005486759.ooo.test sudo[75519]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:51 np0005486759.ooo.test sudo[75530]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo4uje02b/privsep.sock
Oct 14 08:31:51 np0005486759.ooo.test sudo[75530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:52 np0005486759.ooo.test sudo[75530]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:52 np0005486759.ooo.test sudo[75541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzkzb3xui/privsep.sock
Oct 14 08:31:52 np0005486759.ooo.test sudo[75541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:53 np0005486759.ooo.test sudo[75541]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:53 np0005486759.ooo.test sudo[75552]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppphveps0/privsep.sock
Oct 14 08:31:53 np0005486759.ooo.test sudo[75552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:54 np0005486759.ooo.test sudo[75552]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:54 np0005486759.ooo.test sudo[75563]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy7ahc3te/privsep.sock
Oct 14 08:31:54 np0005486759.ooo.test sudo[75563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:54 np0005486759.ooo.test sudo[75563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:55 np0005486759.ooo.test sudo[75574]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprjg4o6ez/privsep.sock
Oct 14 08:31:55 np0005486759.ooo.test sudo[75574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:31:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:31:55 np0005486759.ooo.test podman[75577]: 2025-10-14 08:31:55.25469304 +0000 UTC m=+0.065070376 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9)
Oct 14 08:31:55 np0005486759.ooo.test podman[75577]: 2025-10-14 08:31:55.276248716 +0000 UTC m=+0.086626062 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:31:55 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:31:55 np0005486759.ooo.test systemd[1]: tmp-crun.3dF5qd.mount: Deactivated successfully.
Oct 14 08:31:55 np0005486759.ooo.test podman[75576]: 2025-10-14 08:31:55.340745172 +0000 UTC m=+0.152214601 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:31:55 np0005486759.ooo.test podman[75576]: 2025-10-14 08:31:55.403124423 +0000 UTC m=+0.214593872 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, tcib_managed=true, build-date=2025-07-21T16:28:53, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1)
Oct 14 08:31:55 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:31:55 np0005486759.ooo.test sudo[75574]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:56 np0005486759.ooo.test sudo[75637]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmy59zsn8/privsep.sock
Oct 14 08:31:56 np0005486759.ooo.test sudo[75637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:56 np0005486759.ooo.test sudo[75637]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:56 np0005486759.ooo.test sudo[75648]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6_ya896c/privsep.sock
Oct 14 08:31:56 np0005486759.ooo.test sudo[75648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:57 np0005486759.ooo.test sudo[75648]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:57 np0005486759.ooo.test sudo[75659]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsg8cz8lo/privsep.sock
Oct 14 08:31:57 np0005486759.ooo.test sudo[75659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:58 np0005486759.ooo.test sudo[75659]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:58 np0005486759.ooo.test sudo[75670]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqqi984fk/privsep.sock
Oct 14 08:31:58 np0005486759.ooo.test sudo[75670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:59 np0005486759.ooo.test sudo[75670]: pam_unix(sudo:session): session closed for user root
Oct 14 08:31:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:31:59 np0005486759.ooo.test podman[75675]: 2025-10-14 08:31:59.31120529 +0000 UTC m=+0.071776493 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, architecture=x86_64, version=17.1.9, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:31:59 np0005486759.ooo.test sudo[75703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkiu5b5tv/privsep.sock
Oct 14 08:31:59 np0005486759.ooo.test sudo[75703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:31:59 np0005486759.ooo.test podman[75675]: 2025-10-14 08:31:59.710048542 +0000 UTC m=+0.470619805 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:31:59 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:32:00 np0005486759.ooo.test sudo[75703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:00 np0005486759.ooo.test sudo[75715]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc371f99e/privsep.sock
Oct 14 08:32:00 np0005486759.ooo.test sudo[75715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:00 np0005486759.ooo.test sudo[75715]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:01 np0005486759.ooo.test sudo[75732]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu42j1hlc/privsep.sock
Oct 14 08:32:01 np0005486759.ooo.test sudo[75732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:01 np0005486759.ooo.test sudo[75732]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:01 np0005486759.ooo.test sudo[75743]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxpsn1p48/privsep.sock
Oct 14 08:32:01 np0005486759.ooo.test sudo[75743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:02 np0005486759.ooo.test sudo[75743]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:02 np0005486759.ooo.test sudo[75754]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyz9r8_xk/privsep.sock
Oct 14 08:32:02 np0005486759.ooo.test sudo[75754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:03 np0005486759.ooo.test sudo[75754]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:03 np0005486759.ooo.test sudo[75765]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd0pydvqc/privsep.sock
Oct 14 08:32:03 np0005486759.ooo.test sudo[75765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:04 np0005486759.ooo.test sudo[75765]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:04 np0005486759.ooo.test sudo[75776]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ibiv0w_/privsep.sock
Oct 14 08:32:04 np0005486759.ooo.test sudo[75776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:05 np0005486759.ooo.test sudo[75776]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:05 np0005486759.ooo.test sudo[75787]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeb2a1rvy/privsep.sock
Oct 14 08:32:05 np0005486759.ooo.test sudo[75787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:05 np0005486759.ooo.test sudo[75787]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:06 np0005486759.ooo.test sudo[75800]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmy02a6c2/privsep.sock
Oct 14 08:32:06 np0005486759.ooo.test sudo[75800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:06 np0005486759.ooo.test sudo[75800]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:32:06 np0005486759.ooo.test podman[75808]: 2025-10-14 08:32:06.821106035 +0000 UTC m=+0.053814696 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:32:06 np0005486759.ooo.test sudo[75844]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_uzroc1n/privsep.sock
Oct 14 08:32:06 np0005486759.ooo.test sudo[75844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:07 np0005486759.ooo.test podman[75808]: 2025-10-14 08:32:07.021537238 +0000 UTC m=+0.254245919 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, release=1)
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: tmp-crun.cvgKbA.mount: Deactivated successfully.
Oct 14 08:32:07 np0005486759.ooo.test podman[75848]: 2025-10-14 08:32:07.43746446 +0000 UTC m=+0.063140586 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, vcs-type=git, container_name=nova_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: tmp-crun.OLK2az.mount: Deactivated successfully.
Oct 14 08:32:07 np0005486759.ooo.test podman[75848]: 2025-10-14 08:32:07.517008901 +0000 UTC m=+0.142685067 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, distribution-scope=public, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:32:07 np0005486759.ooo.test podman[75853]: 2025-10-14 08:32:07.464473565 +0000 UTC m=+0.083335450 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, config_id=tripleo_step3, release=2, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git)
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:32:07 np0005486759.ooo.test podman[75847]: 2025-10-14 08:32:07.490053776 +0000 UTC m=+0.117215858 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, container_name=iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9)
Oct 14 08:32:07 np0005486759.ooo.test sudo[75844]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:07 np0005486759.ooo.test podman[75853]: 2025-10-14 08:32:07.595941073 +0000 UTC m=+0.214803008 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64)
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:32:07 np0005486759.ooo.test podman[75847]: 2025-10-14 08:32:07.620040909 +0000 UTC m=+0.247202921 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., container_name=iscsid, tcib_managed=true, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:32:07 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:32:07 np0005486759.ooo.test sudo[75916]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv6tfof2e/privsep.sock
Oct 14 08:32:07 np0005486759.ooo.test sudo[75916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:08 np0005486759.ooo.test sudo[75916]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:08 np0005486759.ooo.test sudo[75927]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1g6rxxvw/privsep.sock
Oct 14 08:32:08 np0005486759.ooo.test sudo[75927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:09 np0005486759.ooo.test sudo[75927]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:09 np0005486759.ooo.test sudo[75938]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwfw_3wz1/privsep.sock
Oct 14 08:32:09 np0005486759.ooo.test sudo[75938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:10 np0005486759.ooo.test sudo[75938]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:10 np0005486759.ooo.test sudo[75949]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi49ltnj7/privsep.sock
Oct 14 08:32:10 np0005486759.ooo.test sudo[75949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:10 np0005486759.ooo.test sudo[75949]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:11 np0005486759.ooo.test sudo[75960]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfolxvkoe/privsep.sock
Oct 14 08:32:11 np0005486759.ooo.test sudo[75960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:11 np0005486759.ooo.test sudo[75960]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:12 np0005486759.ooo.test sudo[75977]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgnfczg8e/privsep.sock
Oct 14 08:32:12 np0005486759.ooo.test sudo[75977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:12 np0005486759.ooo.test sudo[75977]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:12 np0005486759.ooo.test sudo[75988]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ffpvfdf/privsep.sock
Oct 14 08:32:12 np0005486759.ooo.test sudo[75988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:13 np0005486759.ooo.test sudo[75988]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:13 np0005486759.ooo.test sudo[75999]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprigce7mh/privsep.sock
Oct 14 08:32:13 np0005486759.ooo.test sudo[75999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:14 np0005486759.ooo.test sudo[75999]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:14 np0005486759.ooo.test sudo[76010]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw4g6cgan/privsep.sock
Oct 14 08:32:14 np0005486759.ooo.test sudo[76010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:15 np0005486759.ooo.test sudo[76010]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:15 np0005486759.ooo.test sudo[76021]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwipjvaqd/privsep.sock
Oct 14 08:32:15 np0005486759.ooo.test sudo[76021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:16 np0005486759.ooo.test sudo[76021]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:16 np0005486759.ooo.test sudo[76032]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxqwzdzl3/privsep.sock
Oct 14 08:32:16 np0005486759.ooo.test sudo[76032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:16 np0005486759.ooo.test sudo[76032]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:17 np0005486759.ooo.test sudo[76048]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6ss__gka/privsep.sock
Oct 14 08:32:17 np0005486759.ooo.test sudo[76048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:17 np0005486759.ooo.test sudo[76048]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:18 np0005486759.ooo.test sudo[76060]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaiq_hguk/privsep.sock
Oct 14 08:32:18 np0005486759.ooo.test sudo[76060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:18 np0005486759.ooo.test sudo[76060]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:18 np0005486759.ooo.test sudo[76071]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvrfpbt62/privsep.sock
Oct 14 08:32:18 np0005486759.ooo.test sudo[76071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:19 np0005486759.ooo.test sudo[76071]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:19 np0005486759.ooo.test sudo[76082]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpso5c7pdk/privsep.sock
Oct 14 08:32:19 np0005486759.ooo.test sudo[76082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:32:20 np0005486759.ooo.test sudo[76082]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:20 np0005486759.ooo.test podman[76087]: 2025-10-14 08:32:20.454830065 +0000 UTC m=+0.079261883 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 08:32:20 np0005486759.ooo.test podman[76087]: 2025-10-14 08:32:20.466357522 +0000 UTC m=+0.090789310 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33)
Oct 14 08:32:20 np0005486759.ooo.test podman[76087]: unhealthy
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:32:20 np0005486759.ooo.test podman[76086]: 2025-10-14 08:32:20.502698347 +0000 UTC m=+0.131553232 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.33.12, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:32:20 np0005486759.ooo.test podman[76088]: 2025-10-14 08:32:20.534644346 +0000 UTC m=+0.159313591 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, vcs-type=git, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47)
Oct 14 08:32:20 np0005486759.ooo.test podman[76086]: 2025-10-14 08:32:20.540305811 +0000 UTC m=+0.169160696 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, tcib_managed=true, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 08:32:20 np0005486759.ooo.test podman[76088]: 2025-10-14 08:32:20.548178945 +0000 UTC m=+0.172848190 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., architecture=x86_64, release=1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:32:20 np0005486759.ooo.test podman[76088]: unhealthy
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:32:20 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:32:20 np0005486759.ooo.test sudo[76150]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpay4c8y2v/privsep.sock
Oct 14 08:32:20 np0005486759.ooo.test sudo[76150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:21 np0005486759.ooo.test sudo[76150]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:21 np0005486759.ooo.test sudo[76161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpudwrkhro/privsep.sock
Oct 14 08:32:21 np0005486759.ooo.test sudo[76161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:22 np0005486759.ooo.test sudo[76161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:22 np0005486759.ooo.test sudo[76174]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8n4x6zsn/privsep.sock
Oct 14 08:32:22 np0005486759.ooo.test sudo[76174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:22 np0005486759.ooo.test sudo[76174]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:23 np0005486759.ooo.test sudo[76189]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwzmuyb3r/privsep.sock
Oct 14 08:32:23 np0005486759.ooo.test sudo[76189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:23 np0005486759.ooo.test sudo[76189]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:24 np0005486759.ooo.test sudo[76200]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpenlfrdgt/privsep.sock
Oct 14 08:32:24 np0005486759.ooo.test sudo[76200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:24 np0005486759.ooo.test sudo[76200]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:24 np0005486759.ooo.test sudo[76211]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk1hqfmbo/privsep.sock
Oct 14 08:32:24 np0005486759.ooo.test sudo[76211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:32:25 np0005486759.ooo.test podman[76214]: 2025-10-14 08:32:25.426896298 +0000 UTC m=+0.060864835 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vcs-type=git, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, release=1, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-ovn-controller-container)
Oct 14 08:32:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:32:25 np0005486759.ooo.test sudo[76211]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:25 np0005486759.ooo.test podman[76214]: 2025-10-14 08:32:25.460764876 +0000 UTC m=+0.094733353 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:32:25 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:32:25 np0005486759.ooo.test systemd[1]: tmp-crun.veoi1p.mount: Deactivated successfully.
Oct 14 08:32:25 np0005486759.ooo.test podman[76233]: 2025-10-14 08:32:25.533221468 +0000 UTC m=+0.077854930 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, release=1, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:32:25 np0005486759.ooo.test podman[76233]: 2025-10-14 08:32:25.56948313 +0000 UTC m=+0.114116622 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, release=1, architecture=x86_64, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true)
Oct 14 08:32:25 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:32:25 np0005486759.ooo.test sudo[76268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmltl9fp4/privsep.sock
Oct 14 08:32:25 np0005486759.ooo.test sudo[76268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:26 np0005486759.ooo.test sudo[76268]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:26 np0005486759.ooo.test sudo[76279]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprrl7ny7c/privsep.sock
Oct 14 08:32:26 np0005486759.ooo.test sudo[76279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:27 np0005486759.ooo.test sudo[76279]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:27 np0005486759.ooo.test sudo[76290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp852kmlgn/privsep.sock
Oct 14 08:32:27 np0005486759.ooo.test sudo[76290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:27 np0005486759.ooo.test sudo[76290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:28 np0005486759.ooo.test sudo[76307]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1xop0f4e/privsep.sock
Oct 14 08:32:28 np0005486759.ooo.test sudo[76307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:28 np0005486759.ooo.test sudo[76307]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:29 np0005486759.ooo.test sudo[76318]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpceisnest/privsep.sock
Oct 14 08:32:29 np0005486759.ooo.test sudo[76318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:29 np0005486759.ooo.test sudo[76318]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:29 np0005486759.ooo.test sudo[76329]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpad8ykbqj/privsep.sock
Oct 14 08:32:29 np0005486759.ooo.test sudo[76329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:32:29 np0005486759.ooo.test systemd[1]: tmp-crun.ogu3l0.mount: Deactivated successfully.
Oct 14 08:32:29 np0005486759.ooo.test podman[76331]: 2025-10-14 08:32:29.972909246 +0000 UTC m=+0.076473638 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., release=1, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Oct 14 08:32:30 np0005486759.ooo.test podman[76331]: 2025-10-14 08:32:30.415357277 +0000 UTC m=+0.518921729 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:32:30 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:32:30 np0005486759.ooo.test sudo[76329]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:30 np0005486759.ooo.test sudo[76365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnrero2ns/privsep.sock
Oct 14 08:32:30 np0005486759.ooo.test sudo[76365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:31 np0005486759.ooo.test sudo[76365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:31 np0005486759.ooo.test sudo[76376]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpareqob2_/privsep.sock
Oct 14 08:32:31 np0005486759.ooo.test sudo[76376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:32 np0005486759.ooo.test sudo[76376]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:32 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:32:32 np0005486759.ooo.test recover_tripleo_nova_virtqemud[76383]: 47951
Oct 14 08:32:32 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:32:32 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:32:32 np0005486759.ooo.test sudo[76389]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprphu6kqv/privsep.sock
Oct 14 08:32:32 np0005486759.ooo.test sudo[76389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:32 np0005486759.ooo.test sudo[76389]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:33 np0005486759.ooo.test sudo[76405]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxyhfnczu/privsep.sock
Oct 14 08:32:33 np0005486759.ooo.test sudo[76405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:33 np0005486759.ooo.test sudo[76405]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:34 np0005486759.ooo.test sudo[76417]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_rifo91f/privsep.sock
Oct 14 08:32:34 np0005486759.ooo.test sudo[76417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:34 np0005486759.ooo.test sudo[76417]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:34 np0005486759.ooo.test sudo[76428]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptpo_en9l/privsep.sock
Oct 14 08:32:34 np0005486759.ooo.test sudo[76428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:35 np0005486759.ooo.test sudo[76428]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:35 np0005486759.ooo.test sudo[76439]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx0yw3k0s/privsep.sock
Oct 14 08:32:35 np0005486759.ooo.test sudo[76439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:36 np0005486759.ooo.test sudo[76439]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:36 np0005486759.ooo.test sudo[76450]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnnxziy0b/privsep.sock
Oct 14 08:32:36 np0005486759.ooo.test sudo[76450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:37 np0005486759.ooo.test sudo[76450]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:32:37 np0005486759.ooo.test podman[76454]: 2025-10-14 08:32:37.326887777 +0000 UTC m=+0.048127961 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:32:37 np0005486759.ooo.test podman[76454]: 2025-10-14 08:32:37.482703319 +0000 UTC m=+0.203943533 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, version=17.1.9)
Oct 14 08:32:37 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:32:37 np0005486759.ooo.test sudo[76489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6urcofke/privsep.sock
Oct 14 08:32:37 np0005486759.ooo.test sudo[76489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:38 np0005486759.ooo.test sudo[76489]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:32:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:32:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:32:38 np0005486759.ooo.test podman[76497]: 2025-10-14 08:32:38.213043169 +0000 UTC m=+0.065139527 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:32:38 np0005486759.ooo.test podman[76497]: 2025-10-14 08:32:38.22633972 +0000 UTC m=+0.078436118 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Oct 14 08:32:38 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:32:38 np0005486759.ooo.test podman[76496]: 2025-10-14 08:32:38.288486213 +0000 UTC m=+0.136758423 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:32:38 np0005486759.ooo.test podman[76496]: 2025-10-14 08:32:38.306199072 +0000 UTC m=+0.154471302 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1)
Oct 14 08:32:38 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:32:38 np0005486759.ooo.test sudo[76555]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpof2fqb7c/privsep.sock
Oct 14 08:32:38 np0005486759.ooo.test sudo[76555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:38 np0005486759.ooo.test systemd[1]: tmp-crun.MHOkYj.mount: Deactivated successfully.
Oct 14 08:32:38 np0005486759.ooo.test podman[76495]: 2025-10-14 08:32:38.39759866 +0000 UTC m=+0.250685488 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, container_name=iscsid, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, distribution-scope=public, release=1, config_id=tripleo_step3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Oct 14 08:32:38 np0005486759.ooo.test podman[76495]: 2025-10-14 08:32:38.406367111 +0000 UTC m=+0.259453919 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, release=1, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 08:32:38 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:32:38 np0005486759.ooo.test sudo[76555]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:39 np0005486759.ooo.test sudo[76578]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptv4vk8_3/privsep.sock
Oct 14 08:32:39 np0005486759.ooo.test sudo[76578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:39 np0005486759.ooo.test sudo[76578]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:40 np0005486759.ooo.test sudo[76589]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0qf3f_4z/privsep.sock
Oct 14 08:32:40 np0005486759.ooo.test sudo[76589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:40 np0005486759.ooo.test sudo[76589]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:40 np0005486759.ooo.test sudo[76600]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprs3qg1xt/privsep.sock
Oct 14 08:32:40 np0005486759.ooo.test sudo[76600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:41 np0005486759.ooo.test sudo[76600]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:41 np0005486759.ooo.test sudo[76611]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbwuh1bjl/privsep.sock
Oct 14 08:32:41 np0005486759.ooo.test sudo[76611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:42 np0005486759.ooo.test sudo[76611]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:42 np0005486759.ooo.test sudo[76622]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp35_35gu2/privsep.sock
Oct 14 08:32:42 np0005486759.ooo.test sudo[76622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:43 np0005486759.ooo.test sudo[76622]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:43 np0005486759.ooo.test sudo[76633]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcc2ux_1a/privsep.sock
Oct 14 08:32:43 np0005486759.ooo.test sudo[76633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:44 np0005486759.ooo.test sudo[76633]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:44 np0005486759.ooo.test sudo[76650]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeq8vs8a6/privsep.sock
Oct 14 08:32:44 np0005486759.ooo.test sudo[76650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:44 np0005486759.ooo.test sudo[76650]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:45 np0005486759.ooo.test sudo[76661]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwut7_dha/privsep.sock
Oct 14 08:32:45 np0005486759.ooo.test sudo[76661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:45 np0005486759.ooo.test sudo[76661]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:46 np0005486759.ooo.test sudo[76672]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbqqjc8s5/privsep.sock
Oct 14 08:32:46 np0005486759.ooo.test sudo[76672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:46 np0005486759.ooo.test sudo[76672]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:46 np0005486759.ooo.test sudo[76683]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi_3ddodk/privsep.sock
Oct 14 08:32:46 np0005486759.ooo.test sudo[76683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:47 np0005486759.ooo.test sudo[76683]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:47 np0005486759.ooo.test sudo[76694]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0y_6w24_/privsep.sock
Oct 14 08:32:47 np0005486759.ooo.test sudo[76694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:48 np0005486759.ooo.test sudo[76694]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:48 np0005486759.ooo.test sudo[76705]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxt7gaht2/privsep.sock
Oct 14 08:32:48 np0005486759.ooo.test sudo[76705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:49 np0005486759.ooo.test sudo[76705]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:49 np0005486759.ooo.test sudo[76719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp76r177fi/privsep.sock
Oct 14 08:32:49 np0005486759.ooo.test sudo[76719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:49 np0005486759.ooo.test sudo[76719]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:50 np0005486759.ooo.test sudo[76733]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpznm8x2pk/privsep.sock
Oct 14 08:32:50 np0005486759.ooo.test sudo[76733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:50 np0005486759.ooo.test sudo[76733]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:32:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:32:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:32:50 np0005486759.ooo.test podman[76740]: 2025-10-14 08:32:50.858584549 +0000 UTC m=+0.072020799 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, version=17.1.9, config_id=tripleo_step4, release=1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, distribution-scope=public)
Oct 14 08:32:50 np0005486759.ooo.test podman[76740]: 2025-10-14 08:32:50.895623946 +0000 UTC m=+0.109060196 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:32:50 np0005486759.ooo.test podman[76740]: unhealthy
Oct 14 08:32:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:32:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:32:50 np0005486759.ooo.test systemd[1]: tmp-crun.lH0vEo.mount: Deactivated successfully.
Oct 14 08:32:50 np0005486759.ooo.test podman[76741]: 2025-10-14 08:32:50.920040421 +0000 UTC m=+0.129413526 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 08:32:50 np0005486759.ooo.test podman[76739]: 2025-10-14 08:32:50.958222423 +0000 UTC m=+0.172611993 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, release=1, tcib_managed=true, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron)
Oct 14 08:32:50 np0005486759.ooo.test podman[76741]: 2025-10-14 08:32:50.987485188 +0000 UTC m=+0.196858233 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:32:50 np0005486759.ooo.test podman[76741]: unhealthy
Oct 14 08:32:51 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:32:51 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:32:51 np0005486759.ooo.test sudo[76799]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcqbr6dvu/privsep.sock
Oct 14 08:32:51 np0005486759.ooo.test sudo[76799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:51 np0005486759.ooo.test podman[76739]: 2025-10-14 08:32:51.04020177 +0000 UTC m=+0.254591320 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, name=rhosp17/openstack-cron, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:07:52, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Oct 14 08:32:51 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:32:51 np0005486759.ooo.test sudo[76799]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:51 np0005486759.ooo.test sudo[76810]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplgtcx7jw/privsep.sock
Oct 14 08:32:51 np0005486759.ooo.test sudo[76810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:52 np0005486759.ooo.test sudo[76810]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:52 np0005486759.ooo.test sudo[76821]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa47qx9on/privsep.sock
Oct 14 08:32:52 np0005486759.ooo.test sudo[76821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:53 np0005486759.ooo.test sudo[76821]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:53 np0005486759.ooo.test sudo[76832]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeniik5rx/privsep.sock
Oct 14 08:32:53 np0005486759.ooo.test sudo[76832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:54 np0005486759.ooo.test sudo[76832]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:54 np0005486759.ooo.test sudo[76843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp46qel8bl/privsep.sock
Oct 14 08:32:54 np0005486759.ooo.test sudo[76843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:55 np0005486759.ooo.test sudo[76843]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:55 np0005486759.ooo.test sudo[76860]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphlihfq2h/privsep.sock
Oct 14 08:32:55 np0005486759.ooo.test sudo[76860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:55 np0005486759.ooo.test sudo[76860]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:32:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:32:55 np0005486759.ooo.test podman[76867]: 2025-10-14 08:32:55.85774692 +0000 UTC m=+0.071120122 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=ovn_controller, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1)
Oct 14 08:32:55 np0005486759.ooo.test podman[76867]: 2025-10-14 08:32:55.878301376 +0000 UTC m=+0.091674528 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, distribution-scope=public, release=1)
Oct 14 08:32:55 np0005486759.ooo.test podman[76866]: 2025-10-14 08:32:55.836501153 +0000 UTC m=+0.055121157 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1)
Oct 14 08:32:55 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:32:55 np0005486759.ooo.test podman[76866]: 2025-10-14 08:32:55.915345052 +0000 UTC m=+0.133965086 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:32:55 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:32:56 np0005486759.ooo.test sudo[76918]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj9vbdmhn/privsep.sock
Oct 14 08:32:56 np0005486759.ooo.test sudo[76918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:56 np0005486759.ooo.test sudo[76918]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:56 np0005486759.ooo.test sudo[76929]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsk6e4bw5/privsep.sock
Oct 14 08:32:56 np0005486759.ooo.test sudo[76929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:57 np0005486759.ooo.test sudo[76929]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:57 np0005486759.ooo.test sudo[76940]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc5_z3gyc/privsep.sock
Oct 14 08:32:57 np0005486759.ooo.test sudo[76940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:58 np0005486759.ooo.test sudo[76940]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:58 np0005486759.ooo.test sudo[76951]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_pt4tt5w/privsep.sock
Oct 14 08:32:58 np0005486759.ooo.test sudo[76951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:59 np0005486759.ooo.test sudo[76951]: pam_unix(sudo:session): session closed for user root
Oct 14 08:32:59 np0005486759.ooo.test sudo[76962]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx2yslnuj/privsep.sock
Oct 14 08:32:59 np0005486759.ooo.test sudo[76962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:32:59 np0005486759.ooo.test sudo[76962]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:00 np0005486759.ooo.test sudo[76978]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8k4pqyh0/privsep.sock
Oct 14 08:33:00 np0005486759.ooo.test sudo[76978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:00 np0005486759.ooo.test sudo[76978]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:33:00 np0005486759.ooo.test systemd[1]: tmp-crun.pMSeyW.mount: Deactivated successfully.
Oct 14 08:33:00 np0005486759.ooo.test podman[76984]: 2025-10-14 08:33:00.844097003 +0000 UTC m=+0.079740889 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:33:01 np0005486759.ooo.test sudo[77012]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwasr_5oo/privsep.sock
Oct 14 08:33:01 np0005486759.ooo.test sudo[77012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:01 np0005486759.ooo.test podman[76984]: 2025-10-14 08:33:01.21110868 +0000 UTC m=+0.446752526 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37)
Oct 14 08:33:01 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:33:01 np0005486759.ooo.test sudo[77012]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:01 np0005486759.ooo.test sudo[77024]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpughq9a1b/privsep.sock
Oct 14 08:33:01 np0005486759.ooo.test sudo[77024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:02 np0005486759.ooo.test sudo[77024]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:02 np0005486759.ooo.test sudo[77035]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp77boozyc/privsep.sock
Oct 14 08:33:02 np0005486759.ooo.test sudo[77035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:03 np0005486759.ooo.test sudo[77035]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:03 np0005486759.ooo.test sudo[77046]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps1wuh7nr/privsep.sock
Oct 14 08:33:03 np0005486759.ooo.test sudo[77046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:04 np0005486759.ooo.test sudo[77046]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:04 np0005486759.ooo.test sudo[77057]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxnvyoalq/privsep.sock
Oct 14 08:33:04 np0005486759.ooo.test sudo[77057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:05 np0005486759.ooo.test sudo[77057]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:05 np0005486759.ooo.test sudo[77068]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvypfjcmm/privsep.sock
Oct 14 08:33:05 np0005486759.ooo.test sudo[77068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:05 np0005486759.ooo.test sudo[77068]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:06 np0005486759.ooo.test sudo[77085]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnqkrs5yz/privsep.sock
Oct 14 08:33:06 np0005486759.ooo.test sudo[77085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:06 np0005486759.ooo.test sudo[77085]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:07 np0005486759.ooo.test sudo[77096]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp87g00ksk/privsep.sock
Oct 14 08:33:07 np0005486759.ooo.test sudo[77096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:07 np0005486759.ooo.test sudo[77096]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:33:07 np0005486759.ooo.test podman[77100]: 2025-10-14 08:33:07.782628458 +0000 UTC m=+0.058596424 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, version=17.1.9, vcs-type=git, config_id=tripleo_step1, release=1, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59)
Oct 14 08:33:07 np0005486759.ooo.test sudo[77136]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzkclex3h/privsep.sock
Oct 14 08:33:07 np0005486759.ooo.test sudo[77136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:07 np0005486759.ooo.test podman[77100]: 2025-10-14 08:33:07.997559608 +0000 UTC m=+0.273527594 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9)
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: tmp-crun.5o2WiA.mount: Deactivated successfully.
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:33:08 np0005486759.ooo.test podman[77141]: 2025-10-14 08:33:08.496055295 +0000 UTC m=+0.082046670 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute)
Oct 14 08:33:08 np0005486759.ooo.test sudo[77136]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:08 np0005486759.ooo.test podman[77142]: 2025-10-14 08:33:08.506019093 +0000 UTC m=+0.087166188 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container)
Oct 14 08:33:08 np0005486759.ooo.test podman[77141]: 2025-10-14 08:33:08.518940923 +0000 UTC m=+0.104932338 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.expose-services=, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, vendor=Red Hat, Inc.)
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:33:08 np0005486759.ooo.test podman[77171]: 2025-10-14 08:33:08.562991756 +0000 UTC m=+0.062991860 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vcs-type=git)
Oct 14 08:33:08 np0005486759.ooo.test podman[77171]: 2025-10-14 08:33:08.568877628 +0000 UTC m=+0.068877732 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, release=1)
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:33:08 np0005486759.ooo.test podman[77142]: 2025-10-14 08:33:08.590607541 +0000 UTC m=+0.171754696 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, container_name=collectd, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:33:08 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:33:08 np0005486759.ooo.test sudo[77214]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcd64nwrs/privsep.sock
Oct 14 08:33:08 np0005486759.ooo.test sudo[77214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:09 np0005486759.ooo.test sudo[77214]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:09 np0005486759.ooo.test sudo[77225]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmtohew1f/privsep.sock
Oct 14 08:33:09 np0005486759.ooo.test sudo[77225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:10 np0005486759.ooo.test sudo[77225]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:10 np0005486759.ooo.test sudo[77236]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf2yaiuua/privsep.sock
Oct 14 08:33:10 np0005486759.ooo.test sudo[77236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:10 np0005486759.ooo.test sudo[77236]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:11 np0005486759.ooo.test sudo[77253]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz05s8_cw/privsep.sock
Oct 14 08:33:11 np0005486759.ooo.test sudo[77253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:11 np0005486759.ooo.test sudo[77253]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:11 np0005486759.ooo.test sudo[77264]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp75j9rpcx/privsep.sock
Oct 14 08:33:11 np0005486759.ooo.test sudo[77264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:12 np0005486759.ooo.test sudo[77264]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:12 np0005486759.ooo.test sudo[77275]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn3hgccol/privsep.sock
Oct 14 08:33:12 np0005486759.ooo.test sudo[77275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:13 np0005486759.ooo.test sudo[77275]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:13 np0005486759.ooo.test sudo[77286]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpweo37j_d/privsep.sock
Oct 14 08:33:13 np0005486759.ooo.test sudo[77286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:14 np0005486759.ooo.test sudo[77286]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:14 np0005486759.ooo.test sudo[77297]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj100e_bu/privsep.sock
Oct 14 08:33:14 np0005486759.ooo.test sudo[77297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:15 np0005486759.ooo.test sudo[77297]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:15 np0005486759.ooo.test sudo[77308]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6vgrlikq/privsep.sock
Oct 14 08:33:15 np0005486759.ooo.test sudo[77308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:15 np0005486759.ooo.test sudo[77308]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:16 np0005486759.ooo.test sudo[77321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpijwlg63b/privsep.sock
Oct 14 08:33:16 np0005486759.ooo.test sudo[77321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:16 np0005486759.ooo.test sudo[77321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:16 np0005486759.ooo.test sudo[77336]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpva7fdwks/privsep.sock
Oct 14 08:33:16 np0005486759.ooo.test sudo[77336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:17 np0005486759.ooo.test sudo[77336]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:17 np0005486759.ooo.test sudo[77347]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc3lw62ms/privsep.sock
Oct 14 08:33:17 np0005486759.ooo.test sudo[77347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:18 np0005486759.ooo.test sudo[77347]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:18 np0005486759.ooo.test sudo[77358]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpja9g1ym4/privsep.sock
Oct 14 08:33:18 np0005486759.ooo.test sudo[77358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:19 np0005486759.ooo.test sudo[77358]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:19 np0005486759.ooo.test sudo[77369]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpimmqysvq/privsep.sock
Oct 14 08:33:19 np0005486759.ooo.test sudo[77369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:20 np0005486759.ooo.test sudo[77369]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:20 np0005486759.ooo.test sudo[77380]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9mrokftd/privsep.sock
Oct 14 08:33:20 np0005486759.ooo.test sudo[77380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:20 np0005486759.ooo.test sudo[77380]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: tmp-crun.9HuRvM.mount: Deactivated successfully.
Oct 14 08:33:21 np0005486759.ooo.test podman[77384]: 2025-10-14 08:33:21.111259776 +0000 UTC m=+0.089232071 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, release=1, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc.)
Oct 14 08:33:21 np0005486759.ooo.test podman[77384]: 2025-10-14 08:33:21.118726157 +0000 UTC m=+0.096698442 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:33:21 np0005486759.ooo.test podman[77384]: unhealthy
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:33:21 np0005486759.ooo.test podman[77411]: 2025-10-14 08:33:21.161843482 +0000 UTC m=+0.068098959 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, container_name=logrotate_crond, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red 
Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible)
Oct 14 08:33:21 np0005486759.ooo.test podman[77411]: 2025-10-14 08:33:21.16792238 +0000 UTC m=+0.074177917 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T13:07:52, release=1, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, com.redhat.component=openstack-cron-container)
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:33:21 np0005486759.ooo.test podman[77387]: 2025-10-14 08:33:21.090489374 +0000 UTC m=+0.071440601 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.9, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, release=1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47)
Oct 14 08:33:21 np0005486759.ooo.test sudo[77446]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsg_b73z5/privsep.sock
Oct 14 08:33:21 np0005486759.ooo.test sudo[77446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:21 np0005486759.ooo.test podman[77387]: 2025-10-14 08:33:21.227299538 +0000 UTC m=+0.208250775 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1, 
build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Oct 14 08:33:21 np0005486759.ooo.test podman[77387]: unhealthy
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:33:21 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:33:21 np0005486759.ooo.test sudo[77446]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:22 np0005486759.ooo.test sudo[77465]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoy70o1df/privsep.sock
Oct 14 08:33:22 np0005486759.ooo.test sudo[77465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:22 np0005486759.ooo.test sudo[77465]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:22 np0005486759.ooo.test sudo[77476]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp88ma7aae/privsep.sock
Oct 14 08:33:22 np0005486759.ooo.test sudo[77476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:23 np0005486759.ooo.test sudo[77476]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:23 np0005486759.ooo.test sudo[77487]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwf04kdpw/privsep.sock
Oct 14 08:33:23 np0005486759.ooo.test sudo[77487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:24 np0005486759.ooo.test sudo[77487]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:24 np0005486759.ooo.test sudo[77498]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo5a66h_9/privsep.sock
Oct 14 08:33:24 np0005486759.ooo.test sudo[77498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:25 np0005486759.ooo.test sudo[77498]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:25 np0005486759.ooo.test sudo[77509]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9vj1fs67/privsep.sock
Oct 14 08:33:25 np0005486759.ooo.test sudo[77509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:26 np0005486759.ooo.test sudo[77509]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:33:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:33:26 np0005486759.ooo.test podman[77514]: 2025-10-14 08:33:26.217852322 +0000 UTC m=+0.082728251 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, architecture=x86_64, config_id=tripleo_step4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9)
Oct 14 08:33:26 np0005486759.ooo.test podman[77516]: 2025-10-14 08:33:26.262578196 +0000 UTC m=+0.123727310 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step4)
Oct 14 08:33:26 np0005486759.ooo.test podman[77516]: 2025-10-14 08:33:26.276721483 +0000 UTC m=+0.137870537 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.buildah.version=1.33.12, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-type=git, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44)
Oct 14 08:33:26 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:33:26 np0005486759.ooo.test podman[77514]: 2025-10-14 08:33:26.28825817 +0000 UTC m=+0.153134119 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, container_name=ovn_metadata_agent)
Oct 14 08:33:26 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:33:26 np0005486759.ooo.test sudo[77567]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdcdzd4l1/privsep.sock
Oct 14 08:33:26 np0005486759.ooo.test sudo[77567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:26 np0005486759.ooo.test sudo[77567]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:27 np0005486759.ooo.test sudo[77583]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph0d39g5i/privsep.sock
Oct 14 08:33:27 np0005486759.ooo.test sudo[77583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:27 np0005486759.ooo.test sudo[77583]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:27 np0005486759.ooo.test sudo[77595]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb0tmjg8t/privsep.sock
Oct 14 08:33:27 np0005486759.ooo.test sudo[77595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:28 np0005486759.ooo.test sudo[77595]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:28 np0005486759.ooo.test sudo[77606]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk84kw715/privsep.sock
Oct 14 08:33:28 np0005486759.ooo.test sudo[77606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:29 np0005486759.ooo.test sudo[77606]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:29 np0005486759.ooo.test sudo[77617]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9szgoup2/privsep.sock
Oct 14 08:33:29 np0005486759.ooo.test sudo[77617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:30 np0005486759.ooo.test sudo[77617]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:30 np0005486759.ooo.test sudo[77628]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk5_wpbjt/privsep.sock
Oct 14 08:33:30 np0005486759.ooo.test sudo[77628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:31 np0005486759.ooo.test sudo[77628]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:33:31 np0005486759.ooo.test podman[77633]: 2025-10-14 08:33:31.434860394 +0000 UTC m=+0.077801569 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, version=17.1.9, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:33:31 np0005486759.ooo.test sudo[77662]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6r8bp97_/privsep.sock
Oct 14 08:33:31 np0005486759.ooo.test sudo[77662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:31 np0005486759.ooo.test podman[77633]: 2025-10-14 08:33:31.833929363 +0000 UTC m=+0.476870478 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, release=1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:33:31 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:33:32 np0005486759.ooo.test sudo[77662]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:32 np0005486759.ooo.test sudo[77676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2qxv0r45/privsep.sock
Oct 14 08:33:32 np0005486759.ooo.test sudo[77676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:33 np0005486759.ooo.test sudo[77676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:33 np0005486759.ooo.test sudo[77690]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpap_48mff/privsep.sock
Oct 14 08:33:33 np0005486759.ooo.test sudo[77690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:33 np0005486759.ooo.test sudo[77690]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:34 np0005486759.ooo.test sudo[77701]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ytsprp2/privsep.sock
Oct 14 08:33:34 np0005486759.ooo.test sudo[77701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:34 np0005486759.ooo.test sudo[77701]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:35 np0005486759.ooo.test sudo[77712]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn8697l8o/privsep.sock
Oct 14 08:33:35 np0005486759.ooo.test sudo[77712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:35 np0005486759.ooo.test sudo[77712]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:35 np0005486759.ooo.test sudo[77723]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbn9jz5wv/privsep.sock
Oct 14 08:33:35 np0005486759.ooo.test sudo[77723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:36 np0005486759.ooo.test sudo[77723]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:36 np0005486759.ooo.test sudo[77734]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpctpakbvw/privsep.sock
Oct 14 08:33:36 np0005486759.ooo.test sudo[77734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:37 np0005486759.ooo.test sudo[77734]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:37 np0005486759.ooo.test sudo[77745]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpama58rht/privsep.sock
Oct 14 08:33:37 np0005486759.ooo.test sudo[77745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:38 np0005486759.ooo.test sudo[77745]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: tmp-crun.1DWYlR.mount: Deactivated successfully.
Oct 14 08:33:38 np0005486759.ooo.test podman[77755]: 2025-10-14 08:33:38.435872164 +0000 UTC m=+0.097949256 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, 
vcs-type=git, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team)
Oct 14 08:33:38 np0005486759.ooo.test sudo[77791]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuj86ls67/privsep.sock
Oct 14 08:33:38 np0005486759.ooo.test sudo[77791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:33:38 np0005486759.ooo.test podman[77755]: 2025-10-14 08:33:38.674662958 +0000 UTC m=+0.336740090 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.buildah.version=1.33.12, 
config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1)
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:33:38 np0005486759.ooo.test podman[77805]: 2025-10-14 08:33:38.709121799 +0000 UTC m=+0.060706288 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, batch=17.1_20250721.1, release=2, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 08:33:38 np0005486759.ooo.test podman[77805]: 2025-10-14 08:33:38.745372106 +0000 UTC m=+0.096956605 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, release=2, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.9, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vcs-type=git)
Oct 14 08:33:38 np0005486759.ooo.test podman[77794]: 2025-10-14 08:33:38.758088312 +0000 UTC m=+0.120734135 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:33:38 np0005486759.ooo.test podman[77794]: 2025-10-14 08:33:38.778199597 +0000 UTC m=+0.140845430 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, tcib_managed=true, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:33:38 np0005486759.ooo.test podman[77793]: 2025-10-14 08:33:38.856198802 +0000 UTC m=+0.223633964 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, config_id=tripleo_step3, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, release=1, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:33:38 np0005486759.ooo.test podman[77793]: 2025-10-14 08:33:38.867410661 +0000 UTC m=+0.234845823 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, version=17.1.9, architecture=x86_64, release=1, tcib_managed=true, io.openshift.expose-services=, build-date=2025-07-21T13:27:15)
Oct 14 08:33:38 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:33:39 np0005486759.ooo.test sudo[77791]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:39 np0005486759.ooo.test sudo[77867]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp79ficf5i/privsep.sock
Oct 14 08:33:39 np0005486759.ooo.test sudo[77867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:40 np0005486759.ooo.test sudo[77867]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:40 np0005486759.ooo.test sudo[77878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnf13ze3k/privsep.sock
Oct 14 08:33:40 np0005486759.ooo.test sudo[77878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:41 np0005486759.ooo.test sudo[77878]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:41 np0005486759.ooo.test sudo[77889]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq8603u0u/privsep.sock
Oct 14 08:33:41 np0005486759.ooo.test sudo[77889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:41 np0005486759.ooo.test sudo[77889]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:42 np0005486759.ooo.test sudo[77900]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprc6c6_0u/privsep.sock
Oct 14 08:33:42 np0005486759.ooo.test sudo[77900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:42 np0005486759.ooo.test sudo[77900]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:43 np0005486759.ooo.test sudo[77911]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqaffiu2o/privsep.sock
Oct 14 08:33:43 np0005486759.ooo.test sudo[77911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:43 np0005486759.ooo.test sudo[77911]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:43 np0005486759.ooo.test sudo[77928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj2pgk7yw/privsep.sock
Oct 14 08:33:43 np0005486759.ooo.test sudo[77928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:44 np0005486759.ooo.test sudo[77928]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:44 np0005486759.ooo.test sudo[77939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy5lm75h8/privsep.sock
Oct 14 08:33:44 np0005486759.ooo.test sudo[77939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:45 np0005486759.ooo.test sudo[77939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:45 np0005486759.ooo.test sudo[77950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4nx2qd8v/privsep.sock
Oct 14 08:33:45 np0005486759.ooo.test sudo[77950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:46 np0005486759.ooo.test sudo[77950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:46 np0005486759.ooo.test sudo[77961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9c277kbb/privsep.sock
Oct 14 08:33:46 np0005486759.ooo.test sudo[77961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:47 np0005486759.ooo.test sudo[77961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:47 np0005486759.ooo.test sudo[77972]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo34k8m06/privsep.sock
Oct 14 08:33:47 np0005486759.ooo.test sudo[77972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:47 np0005486759.ooo.test sudo[77972]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:48 np0005486759.ooo.test sudo[77983]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2dxr9ak7/privsep.sock
Oct 14 08:33:48 np0005486759.ooo.test sudo[77983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:48 np0005486759.ooo.test sudo[77983]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:49 np0005486759.ooo.test sudo[78000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprgkmla1c/privsep.sock
Oct 14 08:33:49 np0005486759.ooo.test sudo[78000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:49 np0005486759.ooo.test sudo[78000]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:49 np0005486759.ooo.test sudo[78011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7zl_5db4/privsep.sock
Oct 14 08:33:49 np0005486759.ooo.test sudo[78011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:50 np0005486759.ooo.test sudo[78011]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:50 np0005486759.ooo.test sudo[78022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp78juo9ej/privsep.sock
Oct 14 08:33:50 np0005486759.ooo.test sudo[78022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:51 np0005486759.ooo.test sudo[78022]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: tmp-crun.ZnBp1Q.mount: Deactivated successfully.
Oct 14 08:33:51 np0005486759.ooo.test podman[78028]: 2025-10-14 08:33:51.472989365 +0000 UTC m=+0.098195344 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, release=1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Oct 14 08:33:51 np0005486759.ooo.test podman[78028]: 2025-10-14 08:33:51.479447066 +0000 UTC m=+0.104652995 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, container_name=logrotate_crond, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1, 
io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20250721.1)
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:33:51 np0005486759.ooo.test podman[78030]: 2025-10-14 08:33:51.530528424 +0000 UTC m=+0.149455107 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git)
Oct 14 08:33:51 np0005486759.ooo.test podman[78030]: 2025-10-14 08:33:51.572331763 +0000 UTC m=+0.191258436 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:33:51 np0005486759.ooo.test podman[78030]: unhealthy
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:33:51 np0005486759.ooo.test sudo[78085]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2h2hfcc_/privsep.sock
Oct 14 08:33:51 np0005486759.ooo.test sudo[78085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:51 np0005486759.ooo.test podman[78029]: 2025-10-14 08:33:51.629984516 +0000 UTC m=+0.252384218 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1, build-date=2025-07-21T14:45:33, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute)
Oct 14 08:33:51 np0005486759.ooo.test podman[78029]: 2025-10-14 08:33:51.64331104 +0000 UTC m=+0.265710712 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 08:33:51 np0005486759.ooo.test podman[78029]: unhealthy
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:33:51 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:33:52 np0005486759.ooo.test sudo[78085]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:52 np0005486759.ooo.test systemd[1]: tmp-crun.Dsb6FD.mount: Deactivated successfully.
Oct 14 08:33:52 np0005486759.ooo.test sudo[78103]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphr2amqhl/privsep.sock
Oct 14 08:33:52 np0005486759.ooo.test sudo[78103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:53 np0005486759.ooo.test sudo[78103]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:53 np0005486759.ooo.test sudo[78114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa6xbqtdj/privsep.sock
Oct 14 08:33:53 np0005486759.ooo.test sudo[78114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:53 np0005486759.ooo.test sudo[78114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:54 np0005486759.ooo.test sudo[78130]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9kligf74/privsep.sock
Oct 14 08:33:54 np0005486759.ooo.test sudo[78130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:54 np0005486759.ooo.test sudo[78130]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:54 np0005486759.ooo.test sudo[78142]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn0m09efn/privsep.sock
Oct 14 08:33:54 np0005486759.ooo.test sudo[78142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:55 np0005486759.ooo.test sudo[78142]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:55 np0005486759.ooo.test sudo[78153]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeysf83az/privsep.sock
Oct 14 08:33:55 np0005486759.ooo.test sudo[78153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:33:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:33:56 np0005486759.ooo.test podman[78158]: 2025-10-14 08:33:56.416211304 +0000 UTC m=+0.048133387 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, release=1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, build-date=2025-07-21T13:28:44)
Oct 14 08:33:56 np0005486759.ooo.test podman[78158]: 2025-10-14 08:33:56.439319093 +0000 UTC m=+0.071241196 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, version=17.1.9, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:33:56 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:33:56 np0005486759.ooo.test sudo[78153]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:56 np0005486759.ooo.test podman[78157]: 2025-10-14 08:33:56.486230621 +0000 UTC m=+0.114342986 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:33:56 np0005486759.ooo.test podman[78157]: 2025-10-14 08:33:56.539559649 +0000 UTC m=+0.167671984 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, architecture=x86_64, version=17.1.9, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:33:56 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:33:56 np0005486759.ooo.test sudo[78209]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_wplx6kc/privsep.sock
Oct 14 08:33:56 np0005486759.ooo.test sudo[78209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:57 np0005486759.ooo.test sudo[78209]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:57 np0005486759.ooo.test sudo[78220]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp019j1tg4/privsep.sock
Oct 14 08:33:57 np0005486759.ooo.test sudo[78220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:58 np0005486759.ooo.test sudo[78220]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:58 np0005486759.ooo.test sudo[78231]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7xj_hjl2/privsep.sock
Oct 14 08:33:58 np0005486759.ooo.test sudo[78231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:59 np0005486759.ooo.test sudo[78231]: pam_unix(sudo:session): session closed for user root
Oct 14 08:33:59 np0005486759.ooo.test sudo[78244]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnh6g17i5/privsep.sock
Oct 14 08:33:59 np0005486759.ooo.test sudo[78244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:33:59 np0005486759.ooo.test sudo[78244]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:00 np0005486759.ooo.test sudo[78259]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpauvz78te/privsep.sock
Oct 14 08:34:00 np0005486759.ooo.test sudo[78259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:00 np0005486759.ooo.test sudo[78259]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:01 np0005486759.ooo.test sudo[78270]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppbqcpbhs/privsep.sock
Oct 14 08:34:01 np0005486759.ooo.test sudo[78270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:01 np0005486759.ooo.test sudo[78270]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:01 np0005486759.ooo.test sudo[78281]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo_oftaa0/privsep.sock
Oct 14 08:34:01 np0005486759.ooo.test sudo[78281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:34:01 np0005486759.ooo.test podman[78283]: 2025-10-14 08:34:01.979981586 +0000 UTC m=+0.053675859 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, version=17.1.9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12)
Oct 14 08:34:02 np0005486759.ooo.test podman[78283]: 2025-10-14 08:34:02.364882223 +0000 UTC m=+0.438576526 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:34:02 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:34:02 np0005486759.ooo.test sudo[78281]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:02 np0005486759.ooo.test sudo[78315]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp30278a9d/privsep.sock
Oct 14 08:34:02 np0005486759.ooo.test sudo[78315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:03 np0005486759.ooo.test sudo[78315]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:03 np0005486759.ooo.test sudo[78326]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzvkrnw3j/privsep.sock
Oct 14 08:34:03 np0005486759.ooo.test sudo[78326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:04 np0005486759.ooo.test sudo[78326]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:04 np0005486759.ooo.test sudo[78337]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdj2mivwx/privsep.sock
Oct 14 08:34:04 np0005486759.ooo.test sudo[78337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:05 np0005486759.ooo.test sudo[78337]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:05 np0005486759.ooo.test sudo[78354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdumn_zfk/privsep.sock
Oct 14 08:34:05 np0005486759.ooo.test sudo[78354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:05 np0005486759.ooo.test sudo[78354]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:06 np0005486759.ooo.test sudo[78365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0k3_k1vq/privsep.sock
Oct 14 08:34:06 np0005486759.ooo.test sudo[78365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:06 np0005486759.ooo.test sudo[78365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:07 np0005486759.ooo.test sudo[78376]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx6py1cpt/privsep.sock
Oct 14 08:34:07 np0005486759.ooo.test sudo[78376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:07 np0005486759.ooo.test sudo[78376]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:07 np0005486759.ooo.test sudo[78387]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq6ui6iz8/privsep.sock
Oct 14 08:34:07 np0005486759.ooo.test sudo[78387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:08 np0005486759.ooo.test sudo[78387]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:08 np0005486759.ooo.test sudo[78398]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8j9j9eu4/privsep.sock
Oct 14 08:34:08 np0005486759.ooo.test sudo[78398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:34:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:34:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:34:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:34:08 np0005486759.ooo.test systemd[1]: tmp-crun.cWSoCW.mount: Deactivated successfully.
Oct 14 08:34:09 np0005486759.ooo.test podman[78401]: 2025-10-14 08:34:09.025592467 +0000 UTC m=+0.189772921 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, 
build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-collectd, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3)
Oct 14 08:34:09 np0005486759.ooo.test podman[78401]: 2025-10-14 08:34:09.036440314 +0000 UTC m=+0.200620818 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 08:34:09 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:34:09 np0005486759.ooo.test podman[78435]: 2025-10-14 08:34:09.085682925 +0000 UTC m=+0.139005932 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, vcs-type=git, container_name=iscsid)
Oct 14 08:34:09 np0005486759.ooo.test podman[78402]: 2025-10-14 08:34:08.937267031 +0000 UTC m=+0.103225510 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=)
Oct 14 08:34:09 np0005486759.ooo.test podman[78400]: 2025-10-14 08:34:08.989919357 +0000 UTC m=+0.159310122 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, vcs-type=git, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, container_name=nova_compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9)
Oct 14 08:34:09 np0005486759.ooo.test podman[78435]: 2025-10-14 08:34:09.118804904 +0000 UTC m=+0.172127871 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, container_name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:34:09 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:34:09 np0005486759.ooo.test podman[78402]: 2025-10-14 08:34:09.150222292 +0000 UTC m=+0.316180761 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
container_name=metrics_qdr, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:34:09 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:34:09 np0005486759.ooo.test podman[78400]: 2025-10-14 08:34:09.172489524 +0000 UTC m=+0.341880259 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9)
Oct 14 08:34:09 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:34:09 np0005486759.ooo.test sudo[78398]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:09 np0005486759.ooo.test sudo[78499]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkbfskco_/privsep.sock
Oct 14 08:34:09 np0005486759.ooo.test sudo[78499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:10 np0005486759.ooo.test sudo[78499]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:10 np0005486759.ooo.test sudo[78516]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_6fdhnq_/privsep.sock
Oct 14 08:34:10 np0005486759.ooo.test sudo[78516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:11 np0005486759.ooo.test sudo[78516]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:11 np0005486759.ooo.test sudo[78527]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuw1tqynf/privsep.sock
Oct 14 08:34:11 np0005486759.ooo.test sudo[78527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:12 np0005486759.ooo.test sudo[78527]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:12 np0005486759.ooo.test sudo[78538]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp18q74w9g/privsep.sock
Oct 14 08:34:12 np0005486759.ooo.test sudo[78538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:12 np0005486759.ooo.test sudo[78538]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:13 np0005486759.ooo.test sudo[78549]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppfa34sc9/privsep.sock
Oct 14 08:34:13 np0005486759.ooo.test sudo[78549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:13 np0005486759.ooo.test sudo[78549]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:14 np0005486759.ooo.test sudo[78560]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8e673tsk/privsep.sock
Oct 14 08:34:14 np0005486759.ooo.test sudo[78560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:14 np0005486759.ooo.test sudo[78560]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:15 np0005486759.ooo.test sudo[78571]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnw_7u4fq/privsep.sock
Oct 14 08:34:15 np0005486759.ooo.test sudo[78571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:15 np0005486759.ooo.test sudo[78571]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:15 np0005486759.ooo.test sudo[78588]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps16tt63r/privsep.sock
Oct 14 08:34:15 np0005486759.ooo.test sudo[78588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:16 np0005486759.ooo.test sudo[78588]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:16 np0005486759.ooo.test sudo[78599]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe7pb3qkr/privsep.sock
Oct 14 08:34:16 np0005486759.ooo.test sudo[78599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:17 np0005486759.ooo.test sudo[78599]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:17 np0005486759.ooo.test sudo[78610]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj7xp92d2/privsep.sock
Oct 14 08:34:17 np0005486759.ooo.test sudo[78610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:18 np0005486759.ooo.test sudo[78610]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:18 np0005486759.ooo.test sudo[78621]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfgkjr299/privsep.sock
Oct 14 08:34:18 np0005486759.ooo.test sudo[78621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:18 np0005486759.ooo.test sudo[78621]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:19 np0005486759.ooo.test sudo[78632]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2qfubtbr/privsep.sock
Oct 14 08:34:19 np0005486759.ooo.test sudo[78632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:19 np0005486759.ooo.test sudo[78632]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:19 np0005486759.ooo.test sudo[78643]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgwy_5zp1/privsep.sock
Oct 14 08:34:19 np0005486759.ooo.test sudo[78643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:20 np0005486759.ooo.test sudo[78643]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:20 np0005486759.ooo.test sudo[78654]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnfcgsbhp/privsep.sock
Oct 14 08:34:20 np0005486759.ooo.test sudo[78654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:21 np0005486759.ooo.test sudo[78654]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:21 np0005486759.ooo.test sudo[78671]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptx0x7hp0/privsep.sock
Oct 14 08:34:21 np0005486759.ooo.test sudo[78671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:34:21 np0005486759.ooo.test podman[78675]: 2025-10-14 08:34:21.825033908 +0000 UTC m=+0.070058879 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, release=1, architecture=x86_64, 
io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, tcib_managed=true)
Oct 14 08:34:21 np0005486759.ooo.test podman[78675]: 2025-10-14 08:34:21.837756844 +0000 UTC m=+0.082781815 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi)
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: tmp-crun.TO9qkN.mount: Deactivated successfully.
Oct 14 08:34:21 np0005486759.ooo.test podman[78675]: unhealthy
Oct 14 08:34:21 np0005486759.ooo.test podman[78674]: 2025-10-14 08:34:21.845812645 +0000 UTC m=+0.088881945 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public)
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:34:21 np0005486759.ooo.test podman[78674]: 2025-10-14 08:34:21.854846766 +0000 UTC m=+0.097916076 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20250721.1, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12)
Oct 14 08:34:21 np0005486759.ooo.test podman[78674]: unhealthy
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:34:21 np0005486759.ooo.test podman[78673]: 2025-10-14 08:34:21.891686281 +0000 UTC m=+0.137706342 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-cron, tcib_managed=true, batch=17.1_20250721.1)
Oct 14 08:34:21 np0005486759.ooo.test podman[78673]: 2025-10-14 08:34:21.897011136 +0000 UTC m=+0.143031207 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Oct 14 08:34:21 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:34:22 np0005486759.ooo.test sudo[78671]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:22 np0005486759.ooo.test sudo[78741]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe5ar99rt/privsep.sock
Oct 14 08:34:22 np0005486759.ooo.test sudo[78741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:23 np0005486759.ooo.test sudo[78741]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:23 np0005486759.ooo.test sudo[78752]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzdd4spyk/privsep.sock
Oct 14 08:34:23 np0005486759.ooo.test sudo[78752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:23 np0005486759.ooo.test sudo[78752]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:24 np0005486759.ooo.test sudo[78763]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprd0paer4/privsep.sock
Oct 14 08:34:24 np0005486759.ooo.test sudo[78763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:24 np0005486759.ooo.test sudo[78763]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:25 np0005486759.ooo.test sudo[78774]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqb3jjw3i/privsep.sock
Oct 14 08:34:25 np0005486759.ooo.test sudo[78774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:25 np0005486759.ooo.test sudo[78774]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:25 np0005486759.ooo.test sudo[78785]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwdieejd8/privsep.sock
Oct 14 08:34:25 np0005486759.ooo.test sudo[78785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:26 np0005486759.ooo.test sudo[78785]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:34:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:34:26 np0005486759.ooo.test systemd[1]: tmp-crun.T9DXp3.mount: Deactivated successfully.
Oct 14 08:34:26 np0005486759.ooo.test podman[78797]: 2025-10-14 08:34:26.657692711 +0000 UTC m=+0.068363446 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, version=17.1.9, container_name=ovn_controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, release=1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:34:26 np0005486759.ooo.test podman[78797]: 2025-10-14 08:34:26.703456094 +0000 UTC m=+0.114126839 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, tcib_managed=true, release=1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:34:26 np0005486759.ooo.test podman[78794]: 2025-10-14 08:34:26.709013897 +0000 UTC m=+0.120967563 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, release=1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:34:26 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:34:26 np0005486759.ooo.test podman[78794]: 2025-10-14 08:34:26.759742784 +0000 UTC m=+0.171696410 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, vcs-type=git)
Oct 14 08:34:26 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:34:26 np0005486759.ooo.test sudo[78849]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmple83bwij/privsep.sock
Oct 14 08:34:26 np0005486759.ooo.test sudo[78849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:27 np0005486759.ooo.test sudo[78849]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:27 np0005486759.ooo.test systemd[1]: tmp-crun.Hwo234.mount: Deactivated successfully.
Oct 14 08:34:27 np0005486759.ooo.test sudo[78860]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiyreazsc/privsep.sock
Oct 14 08:34:27 np0005486759.ooo.test sudo[78860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:28 np0005486759.ooo.test sudo[78860]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:28 np0005486759.ooo.test sudo[78871]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp92226olp/privsep.sock
Oct 14 08:34:28 np0005486759.ooo.test sudo[78871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:28 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:34:28 np0005486759.ooo.test recover_tripleo_nova_virtqemud[78874]: 47951
Oct 14 08:34:28 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:34:28 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:34:29 np0005486759.ooo.test sudo[78871]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:29 np0005486759.ooo.test sudo[78884]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3cwbbefp/privsep.sock
Oct 14 08:34:29 np0005486759.ooo.test sudo[78884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:30 np0005486759.ooo.test sudo[78884]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:30 np0005486759.ooo.test sudo[78895]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6bctxbd7/privsep.sock
Oct 14 08:34:30 np0005486759.ooo.test sudo[78895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:30 np0005486759.ooo.test sudo[78895]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:31 np0005486759.ooo.test sudo[78906]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphbe4iio6/privsep.sock
Oct 14 08:34:31 np0005486759.ooo.test sudo[78906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:31 np0005486759.ooo.test sudo[78906]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:32 np0005486759.ooo.test sudo[78923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1walhnoc/privsep.sock
Oct 14 08:34:32 np0005486759.ooo.test sudo[78923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:32 np0005486759.ooo.test sudo[78923]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:34:32 np0005486759.ooo.test podman[78928]: 2025-10-14 08:34:32.792727153 +0000 UTC m=+0.084811638 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, release=1, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, container_name=nova_migration_target, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:34:32 np0005486759.ooo.test sudo[78954]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbqry426n/privsep.sock
Oct 14 08:34:32 np0005486759.ooo.test sudo[78954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:33 np0005486759.ooo.test podman[78928]: 2025-10-14 08:34:33.172774368 +0000 UTC m=+0.464858883 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, release=1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1)
Oct 14 08:34:33 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:34:33 np0005486759.ooo.test sudo[78954]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:33 np0005486759.ooo.test sudo[78968]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6jmiu4tk/privsep.sock
Oct 14 08:34:33 np0005486759.ooo.test sudo[78968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:34 np0005486759.ooo.test sudo[78968]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:34 np0005486759.ooo.test sudo[78979]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2tjgeusy/privsep.sock
Oct 14 08:34:34 np0005486759.ooo.test sudo[78979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:35 np0005486759.ooo.test sudo[78979]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:35 np0005486759.ooo.test sudo[78990]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj_b448ws/privsep.sock
Oct 14 08:34:35 np0005486759.ooo.test sudo[78990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:36 np0005486759.ooo.test sudo[78990]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:36 np0005486759.ooo.test sudo[79001]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbxioug2b/privsep.sock
Oct 14 08:34:36 np0005486759.ooo.test sudo[79001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:37 np0005486759.ooo.test sudo[79001]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:37 np0005486759.ooo.test sudo[79018]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwon1velk/privsep.sock
Oct 14 08:34:37 np0005486759.ooo.test sudo[79018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:38 np0005486759.ooo.test sudo[79018]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:38 np0005486759.ooo.test sudo[79029]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyv80vxag/privsep.sock
Oct 14 08:34:38 np0005486759.ooo.test sudo[79029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:38 np0005486759.ooo.test sudo[79029]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:39 np0005486759.ooo.test sudo[79040]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6rsmndjk/privsep.sock
Oct 14 08:34:39 np0005486759.ooo.test sudo[79040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:34:39 np0005486759.ooo.test podman[79042]: 2025-10-14 08:34:39.349702322 +0000 UTC m=+0.084911250 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, container_name=iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:34:39 np0005486759.ooo.test podman[79042]: 2025-10-14 08:34:39.357313709 +0000 UTC m=+0.092522637 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public)
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:34:39 np0005486759.ooo.test podman[79043]: 2025-10-14 08:34:39.392846753 +0000 UTC m=+0.121979762 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vcs-type=git, container_name=nova_compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20250721.1, release=1, vendor=Red Hat, Inc.)
Oct 14 08:34:39 np0005486759.ooo.test podman[79043]: 2025-10-14 08:34:39.448189314 +0000 UTC m=+0.177322313 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, config_id=tripleo_step5, distribution-scope=public, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_compute, tcib_managed=true)
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:34:39 np0005486759.ooo.test podman[79044]: 2025-10-14 08:34:39.451646131 +0000 UTC m=+0.181357439 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Oct 14 08:34:39 np0005486759.ooo.test podman[79045]: 2025-10-14 08:34:39.520080999 +0000 UTC m=+0.243502641 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, release=1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Oct 14 08:34:39 np0005486759.ooo.test podman[79044]: 2025-10-14 08:34:39.536233111 +0000 UTC m=+0.265944399 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.9, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, release=2, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:34:39 np0005486759.ooo.test podman[79045]: 2025-10-14 08:34:39.69121919 +0000 UTC m=+0.414640892 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team)
Oct 14 08:34:39 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:34:39 np0005486759.ooo.test sudo[79040]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:40 np0005486759.ooo.test sudo[79143]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8y66pcl5/privsep.sock
Oct 14 08:34:40 np0005486759.ooo.test sudo[79143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:40 np0005486759.ooo.test systemd[1]: tmp-crun.nOAz2a.mount: Deactivated successfully.
Oct 14 08:34:40 np0005486759.ooo.test sudo[79143]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:40 np0005486759.ooo.test sudo[79154]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgtsuor45/privsep.sock
Oct 14 08:34:40 np0005486759.ooo.test sudo[79154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:41 np0005486759.ooo.test sudo[79154]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:41 np0005486759.ooo.test sudo[79165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8drt47m0/privsep.sock
Oct 14 08:34:41 np0005486759.ooo.test sudo[79165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:42 np0005486759.ooo.test sudo[79165]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:42 np0005486759.ooo.test sudo[79181]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfyif68qp/privsep.sock
Oct 14 08:34:42 np0005486759.ooo.test sudo[79181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:43 np0005486759.ooo.test sudo[79181]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:43 np0005486759.ooo.test sudo[79193]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi_3tmvyh/privsep.sock
Oct 14 08:34:43 np0005486759.ooo.test sudo[79193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:44 np0005486759.ooo.test sudo[79193]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:44 np0005486759.ooo.test sudo[79204]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6ce2g9lo/privsep.sock
Oct 14 08:34:44 np0005486759.ooo.test sudo[79204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:44 np0005486759.ooo.test sudo[79204]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:45 np0005486759.ooo.test sudo[79215]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe56olr45/privsep.sock
Oct 14 08:34:45 np0005486759.ooo.test sudo[79215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:45 np0005486759.ooo.test sudo[79215]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:46 np0005486759.ooo.test sudo[79226]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo3r49wgo/privsep.sock
Oct 14 08:34:46 np0005486759.ooo.test sudo[79226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:46 np0005486759.ooo.test sudo[79226]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:46 np0005486759.ooo.test sudo[79237]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_dkyun3c/privsep.sock
Oct 14 08:34:46 np0005486759.ooo.test sudo[79237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:47 np0005486759.ooo.test sudo[79237]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:47 np0005486759.ooo.test sudo[79248]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpncva4bjp/privsep.sock
Oct 14 08:34:47 np0005486759.ooo.test sudo[79248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:48 np0005486759.ooo.test sudo[79248]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:48 np0005486759.ooo.test sudo[79265]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgx118nj0/privsep.sock
Oct 14 08:34:48 np0005486759.ooo.test sudo[79265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:49 np0005486759.ooo.test sudo[79265]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:49 np0005486759.ooo.test sudo[79276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiwle9mn6/privsep.sock
Oct 14 08:34:49 np0005486759.ooo.test sudo[79276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:49 np0005486759.ooo.test sudo[79276]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:50 np0005486759.ooo.test sudo[79287]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzel2hybe/privsep.sock
Oct 14 08:34:50 np0005486759.ooo.test sudo[79287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:50 np0005486759.ooo.test sudo[79287]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:51 np0005486759.ooo.test sudo[79298]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2hc2k0ls/privsep.sock
Oct 14 08:34:51 np0005486759.ooo.test sudo[79298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:51 np0005486759.ooo.test sudo[79298]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:51 np0005486759.ooo.test sudo[79309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4rc4uxy1/privsep.sock
Oct 14 08:34:52 np0005486759.ooo.test sudo[79309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: tmp-crun.5Yg1ad.mount: Deactivated successfully.
Oct 14 08:34:52 np0005486759.ooo.test podman[79313]: 2025-10-14 08:34:52.09211326 +0000 UTC m=+0.069206452 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, 
name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 08:34:52 np0005486759.ooo.test podman[79312]: 2025-10-14 08:34:52.146334317 +0000 UTC m=+0.123352326 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Oct 14 08:34:52 np0005486759.ooo.test podman[79310]: 2025-10-14 08:34:52.122244617 +0000 UTC m=+0.097991117 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, io.buildah.version=1.33.12, version=17.1.9, 
config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52)
Oct 14 08:34:52 np0005486759.ooo.test podman[79313]: 2025-10-14 08:34:52.17604784 +0000 UTC m=+0.153141102 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:34:52 np0005486759.ooo.test podman[79313]: unhealthy
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:34:52 np0005486759.ooo.test podman[79310]: 2025-10-14 08:34:52.205172526 +0000 UTC m=+0.180918946 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:34:52 np0005486759.ooo.test sshd[23797]: Received disconnect from 192.168.122.100 port 53598:11: disconnected by user
Oct 14 08:34:52 np0005486759.ooo.test sshd[23797]: Disconnected from user zuul 192.168.122.100 port 53598
Oct 14 08:34:52 np0005486759.ooo.test podman[79312]: 2025-10-14 08:34:52.22945395 +0000 UTC m=+0.206471989 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, tcib_managed=true, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public)
Oct 14 08:34:52 np0005486759.ooo.test sshd[23794]: pam_unix(sshd:session): session closed for user zuul
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: session-11.scope: Deactivated successfully.
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: session-11.scope: Consumed 2.879s CPU time.
Oct 14 08:34:52 np0005486759.ooo.test systemd-logind[759]: Session 11 logged out. Waiting for processes to exit.
Oct 14 08:34:52 np0005486759.ooo.test podman[79312]: unhealthy
Oct 14 08:34:52 np0005486759.ooo.test systemd-logind[759]: Removed session 11.
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:34:52 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:34:52 np0005486759.ooo.test sudo[79309]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:52 np0005486759.ooo.test sudo[79375]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5d0921ba/privsep.sock
Oct 14 08:34:52 np0005486759.ooo.test sudo[79375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:53 np0005486759.ooo.test systemd[1]: tmp-crun.x5otFR.mount: Deactivated successfully.
Oct 14 08:34:53 np0005486759.ooo.test sudo[79375]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:53 np0005486759.ooo.test sudo[79392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfehbfj5w/privsep.sock
Oct 14 08:34:53 np0005486759.ooo.test sudo[79392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:54 np0005486759.ooo.test sudo[79392]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:54 np0005486759.ooo.test sudo[79403]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7sro4r6g/privsep.sock
Oct 14 08:34:54 np0005486759.ooo.test sudo[79403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:55 np0005486759.ooo.test sudo[79403]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:55 np0005486759.ooo.test sudo[79414]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvs9h8w04/privsep.sock
Oct 14 08:34:55 np0005486759.ooo.test sudo[79414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:55 np0005486759.ooo.test sudo[79414]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:56 np0005486759.ooo.test sudo[79425]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7uj3filh/privsep.sock
Oct 14 08:34:56 np0005486759.ooo.test sudo[79425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:56 np0005486759.ooo.test sudo[79425]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:34:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:34:56 np0005486759.ooo.test systemd[1]: tmp-crun.c8QkCu.mount: Deactivated successfully.
Oct 14 08:34:56 np0005486759.ooo.test podman[79432]: 2025-10-14 08:34:56.967241824 +0000 UTC m=+0.085106988 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Oct 14 08:34:56 np0005486759.ooo.test podman[79432]: 2025-10-14 08:34:56.995607206 +0000 UTC m=+0.113472310 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, container_name=ovn_controller, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20250721.1)
Oct 14 08:34:57 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:34:57 np0005486759.ooo.test podman[79429]: 2025-10-14 08:34:57.067577363 +0000 UTC m=+0.187137409 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:34:57 np0005486759.ooo.test podman[79429]: 2025-10-14 08:34:57.102067655 +0000 UTC m=+0.221627731 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, 
release=1, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:34:57 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:34:57 np0005486759.ooo.test sudo[79480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl3s3m00z/privsep.sock
Oct 14 08:34:57 np0005486759.ooo.test sudo[79480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:57 np0005486759.ooo.test sudo[79480]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:57 np0005486759.ooo.test sudo[79491]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2gvhgv7n/privsep.sock
Oct 14 08:34:58 np0005486759.ooo.test sudo[79491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:58 np0005486759.ooo.test sudo[79491]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:58 np0005486759.ooo.test sudo[79505]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6hktjfjc/privsep.sock
Oct 14 08:34:58 np0005486759.ooo.test sudo[79505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:34:59 np0005486759.ooo.test sudo[79505]: pam_unix(sudo:session): session closed for user root
Oct 14 08:34:59 np0005486759.ooo.test sudo[79519]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb5oiazwa/privsep.sock
Oct 14 08:34:59 np0005486759.ooo.test sudo[79519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:00 np0005486759.ooo.test sudo[79519]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:00 np0005486759.ooo.test sudo[79530]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7_17j9rj/privsep.sock
Oct 14 08:35:00 np0005486759.ooo.test sudo[79530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:01 np0005486759.ooo.test sudo[79530]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:01 np0005486759.ooo.test sudo[79541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeaoc1qms/privsep.sock
Oct 14 08:35:01 np0005486759.ooo.test sudo[79541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:02 np0005486759.ooo.test sudo[79541]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:02 np0005486759.ooo.test sudo[79552]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2eiir7fi/privsep.sock
Oct 14 08:35:02 np0005486759.ooo.test sudo[79552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:02 np0005486759.ooo.test sudo[79552]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:03 np0005486759.ooo.test sudo[79563]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjddfccq2/privsep.sock
Oct 14 08:35:03 np0005486759.ooo.test sudo[79563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:35:03 np0005486759.ooo.test systemd[1]: tmp-crun.C1kn0l.mount: Deactivated successfully.
Oct 14 08:35:03 np0005486759.ooo.test podman[79565]: 2025-10-14 08:35:03.292166158 +0000 UTC m=+0.069897173 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target)
Oct 14 08:35:03 np0005486759.ooo.test podman[79565]: 2025-10-14 08:35:03.667506727 +0000 UTC m=+0.445237792 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, container_name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9)
Oct 14 08:35:03 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:35:03 np0005486759.ooo.test sudo[79563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:04 np0005486759.ooo.test sudo[79599]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaeyb6icd/privsep.sock
Oct 14 08:35:04 np0005486759.ooo.test sudo[79599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:04 np0005486759.ooo.test sudo[79599]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:04 np0005486759.ooo.test sudo[79614]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpawjpj__c/privsep.sock
Oct 14 08:35:04 np0005486759.ooo.test sudo[79614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:05 np0005486759.ooo.test sudo[79614]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:05 np0005486759.ooo.test sudo[79625]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbekfmhsx/privsep.sock
Oct 14 08:35:05 np0005486759.ooo.test sudo[79625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:06 np0005486759.ooo.test sudo[79625]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:06 np0005486759.ooo.test sudo[79636]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjl1yypa9/privsep.sock
Oct 14 08:35:06 np0005486759.ooo.test sudo[79636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:07 np0005486759.ooo.test sudo[79636]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:07 np0005486759.ooo.test sudo[79647]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7slio664/privsep.sock
Oct 14 08:35:07 np0005486759.ooo.test sudo[79647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:08 np0005486759.ooo.test sudo[79647]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:08 np0005486759.ooo.test sudo[79658]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5luh85ev/privsep.sock
Oct 14 08:35:08 np0005486759.ooo.test sudo[79658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:08 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:35:08 np0005486759.ooo.test recover_tripleo_nova_virtqemud[79661]: 47951
Oct 14 08:35:08 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:35:08 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:35:08 np0005486759.ooo.test sudo[79658]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:09 np0005486759.ooo.test sudo[79671]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprlg6rqg3/privsep.sock
Oct 14 08:35:09 np0005486759.ooo.test sudo[79671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:09 np0005486759.ooo.test sudo[79671]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:35:09 np0005486759.ooo.test podman[79686]: 2025-10-14 08:35:09.864417084 +0000 UTC m=+0.074661432 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b)
Oct 14 08:35:09 np0005486759.ooo.test podman[79696]: 2025-10-14 08:35:09.873171126 +0000 UTC m=+0.072838525 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, config_id=tripleo_step1, distribution-scope=public)
Oct 14 08:35:09 np0005486759.ooo.test podman[79686]: 2025-10-14 08:35:09.879146342 +0000 UTC m=+0.089390700 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
build-date=2025-07-21T13:04:03, container_name=collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, tcib_managed=true, release=2)
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: tmp-crun.PJDalD.mount: Deactivated successfully.
Oct 14 08:35:09 np0005486759.ooo.test podman[79684]: 2025-10-14 08:35:09.92830777 +0000 UTC m=+0.141376576 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:35:09 np0005486759.ooo.test podman[79683]: 2025-10-14 08:35:09.966139326 +0000 UTC m=+0.182812664 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, version=17.1.9, config_id=tripleo_step3, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container)
Oct 14 08:35:09 np0005486759.ooo.test podman[79683]: 2025-10-14 08:35:09.973161554 +0000 UTC m=+0.189834872 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, name=rhosp17/openstack-iscsid)
Oct 14 08:35:09 np0005486759.ooo.test podman[79684]: 2025-10-14 08:35:09.97816475 +0000 UTC m=+0.191233556 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, container_name=nova_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:35:09 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:35:10 np0005486759.ooo.test sudo[79779]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp27ocnc5r/privsep.sock
Oct 14 08:35:10 np0005486759.ooo.test sudo[79779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:10 np0005486759.ooo.test podman[79696]: 2025-10-14 08:35:10.06019519 +0000 UTC m=+0.259862559 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-type=git)
Oct 14 08:35:10 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:35:10 np0005486759.ooo.test sudo[79779]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:10 np0005486759.ooo.test sudo[79790]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp928d7dis/privsep.sock
Oct 14 08:35:10 np0005486759.ooo.test sudo[79790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:11 np0005486759.ooo.test sudo[79790]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:11 np0005486759.ooo.test sudo[79801]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa57_z81a/privsep.sock
Oct 14 08:35:11 np0005486759.ooo.test sudo[79801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:12 np0005486759.ooo.test sudo[79801]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:12 np0005486759.ooo.test sudo[79812]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp878xeua5/privsep.sock
Oct 14 08:35:12 np0005486759.ooo.test sudo[79812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:13 np0005486759.ooo.test sudo[79812]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:13 np0005486759.ooo.test sudo[79823]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3bq9zj26/privsep.sock
Oct 14 08:35:13 np0005486759.ooo.test sudo[79823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:13 np0005486759.ooo.test sudo[79823]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:14 np0005486759.ooo.test sudo[79834]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgzz_qw51/privsep.sock
Oct 14 08:35:14 np0005486759.ooo.test sudo[79834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:14 np0005486759.ooo.test sudo[79834]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:14 np0005486759.ooo.test sudo[79847]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr3dai7vp/privsep.sock
Oct 14 08:35:14 np0005486759.ooo.test sudo[79847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:15 np0005486759.ooo.test sudo[79847]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:15 np0005486759.ooo.test sudo[79862]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpphqkdrx8/privsep.sock
Oct 14 08:35:15 np0005486759.ooo.test sudo[79862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:16 np0005486759.ooo.test sudo[79862]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:16 np0005486759.ooo.test sudo[79873]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphp0a5d2p/privsep.sock
Oct 14 08:35:16 np0005486759.ooo.test sudo[79873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:17 np0005486759.ooo.test sudo[79873]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:17 np0005486759.ooo.test sudo[79884]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy4qo5kj0/privsep.sock
Oct 14 08:35:17 np0005486759.ooo.test sudo[79884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:17 np0005486759.ooo.test sudo[79884]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:18 np0005486759.ooo.test sudo[79895]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqu21mool/privsep.sock
Oct 14 08:35:18 np0005486759.ooo.test sudo[79895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:18 np0005486759.ooo.test sudo[79895]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:18 np0005486759.ooo.test sudo[79906]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2jwpdu92/privsep.sock
Oct 14 08:35:18 np0005486759.ooo.test sudo[79906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:19 np0005486759.ooo.test sudo[79906]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:19 np0005486759.ooo.test sudo[79917]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjtkfp_7b/privsep.sock
Oct 14 08:35:19 np0005486759.ooo.test sudo[79917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:20 np0005486759.ooo.test sudo[79917]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:20 np0005486759.ooo.test sudo[79934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2as4qyd0/privsep.sock
Oct 14 08:35:20 np0005486759.ooo.test sudo[79934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:21 np0005486759.ooo.test sudo[79934]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:21 np0005486759.ooo.test sudo[79945]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph8anvlcr/privsep.sock
Oct 14 08:35:21 np0005486759.ooo.test sudo[79945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:22 np0005486759.ooo.test sudo[79945]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:35:22 np0005486759.ooo.test sudo[79986]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmfbd35t6/privsep.sock
Oct 14 08:35:22 np0005486759.ooo.test sudo[79986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:22 np0005486759.ooo.test podman[79954]: 2025-10-14 08:35:22.455365863 +0000 UTC m=+0.081442203 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.33.12, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., release=1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.openshift.expose-services=)
Oct 14 08:35:22 np0005486759.ooo.test podman[79952]: 2025-10-14 08:35:22.516205545 +0000 UTC m=+0.140133928 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, 
architecture=x86_64, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, release=1)
Oct 14 08:35:22 np0005486759.ooo.test podman[79955]: 2025-10-14 08:35:22.435078053 +0000 UTC m=+0.057466988 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, release=1, vcs-type=git)
Oct 14 08:35:22 np0005486759.ooo.test podman[79954]: 2025-10-14 08:35:22.541984036 +0000 UTC m=+0.168060436 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute)
Oct 14 08:35:22 np0005486759.ooo.test podman[79954]: unhealthy
Oct 14 08:35:22 np0005486759.ooo.test podman[79952]: 2025-10-14 08:35:22.54950272 +0000 UTC m=+0.173431103 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron)
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:35:22 np0005486759.ooo.test podman[79955]: 2025-10-14 08:35:22.567341384 +0000 UTC m=+0.189730349 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:35:22 np0005486759.ooo.test podman[79955]: unhealthy
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:35:22 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:35:23 np0005486759.ooo.test sudo[79986]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:23 np0005486759.ooo.test sudo[80024]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8fzh1bqf/privsep.sock
Oct 14 08:35:23 np0005486759.ooo.test sudo[80024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:23 np0005486759.ooo.test systemd[1]: tmp-crun.40naZb.mount: Deactivated successfully.
Oct 14 08:35:23 np0005486759.ooo.test sudo[80024]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:24 np0005486759.ooo.test sudo[80035]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6h71vswi/privsep.sock
Oct 14 08:35:24 np0005486759.ooo.test sudo[80035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:24 np0005486759.ooo.test sudo[80035]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:24 np0005486759.ooo.test sudo[80046]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe8az4n2t/privsep.sock
Oct 14 08:35:24 np0005486759.ooo.test sudo[80046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:25 np0005486759.ooo.test sudo[80046]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:25 np0005486759.ooo.test sudo[80062]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp347mzugn/privsep.sock
Oct 14 08:35:25 np0005486759.ooo.test sudo[80062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:26 np0005486759.ooo.test sudo[80062]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:26 np0005486759.ooo.test sudo[80074]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkuncs67i/privsep.sock
Oct 14 08:35:26 np0005486759.ooo.test sudo[80074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:27 np0005486759.ooo.test sudo[80074]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:35:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:35:27 np0005486759.ooo.test systemd[1]: tmp-crun.UdeYXd.mount: Deactivated successfully.
Oct 14 08:35:27 np0005486759.ooo.test podman[80081]: 2025-10-14 08:35:27.325840602 +0000 UTC m=+0.067797680 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible)
Oct 14 08:35:27 np0005486759.ooo.test systemd[1]: tmp-crun.kzLNYQ.mount: Deactivated successfully.
Oct 14 08:35:27 np0005486759.ooo.test podman[80080]: 2025-10-14 08:35:27.343016195 +0000 UTC m=+0.083571909 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Oct 14 08:35:27 np0005486759.ooo.test podman[80080]: 2025-10-14 08:35:27.386571719 +0000 UTC m=+0.127127433 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
release=1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Oct 14 08:35:27 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:35:27 np0005486759.ooo.test podman[80081]: 2025-10-14 08:35:27.398454299 +0000 UTC m=+0.140411367 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, architecture=x86_64, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:35:27 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:35:27 np0005486759.ooo.test sudo[80132]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0aoiofmw/privsep.sock
Oct 14 08:35:27 np0005486759.ooo.test sudo[80132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:28 np0005486759.ooo.test sudo[80132]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:28 np0005486759.ooo.test sudo[80143]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvl4v60tr/privsep.sock
Oct 14 08:35:28 np0005486759.ooo.test sudo[80143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:28 np0005486759.ooo.test sudo[80143]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:29 np0005486759.ooo.test sudo[80154]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprt9cc6_m/privsep.sock
Oct 14 08:35:29 np0005486759.ooo.test sudo[80154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:29 np0005486759.ooo.test sudo[80154]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:30 np0005486759.ooo.test sudo[80165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_grw5flk/privsep.sock
Oct 14 08:35:30 np0005486759.ooo.test sudo[80165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:30 np0005486759.ooo.test sudo[80165]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:30 np0005486759.ooo.test sudo[80176]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn3s0tchs/privsep.sock
Oct 14 08:35:30 np0005486759.ooo.test sudo[80176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:31 np0005486759.ooo.test sudo[80176]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:31 np0005486759.ooo.test sudo[80193]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5lutsj31/privsep.sock
Oct 14 08:35:31 np0005486759.ooo.test sudo[80193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:32 np0005486759.ooo.test sudo[80193]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:32 np0005486759.ooo.test sudo[80204]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsgcrn8ye/privsep.sock
Oct 14 08:35:32 np0005486759.ooo.test sudo[80204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:33 np0005486759.ooo.test sudo[80204]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:33 np0005486759.ooo.test sudo[80215]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7dgbkq_i/privsep.sock
Oct 14 08:35:33 np0005486759.ooo.test sudo[80215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:34 np0005486759.ooo.test sudo[80215]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:35:34 np0005486759.ooo.test podman[80219]: 2025-10-14 08:35:34.208519317 +0000 UTC m=+0.056333373 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, batch=17.1_20250721.1, architecture=x86_64, managed_by=tripleo_ansible)
Oct 14 08:35:34 np0005486759.ooo.test sudo[80249]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdpu92lns/privsep.sock
Oct 14 08:35:34 np0005486759.ooo.test sudo[80249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:34 np0005486759.ooo.test podman[80219]: 2025-10-14 08:35:34.566360061 +0000 UTC m=+0.414174157 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:35:34 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:35:34 np0005486759.ooo.test sudo[80249]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:35 np0005486759.ooo.test sudo[80260]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe9_ozzmy/privsep.sock
Oct 14 08:35:35 np0005486759.ooo.test sudo[80260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:35 np0005486759.ooo.test sudo[80260]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:36 np0005486759.ooo.test sudo[80271]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6k2vlhge/privsep.sock
Oct 14 08:35:36 np0005486759.ooo.test sudo[80271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:36 np0005486759.ooo.test sudo[80271]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:37 np0005486759.ooo.test sudo[80288]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyybqh2hq/privsep.sock
Oct 14 08:35:37 np0005486759.ooo.test sudo[80288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:37 np0005486759.ooo.test sudo[80288]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:37 np0005486759.ooo.test sudo[80299]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe6114mjs/privsep.sock
Oct 14 08:35:37 np0005486759.ooo.test sudo[80299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:38 np0005486759.ooo.test sudo[80299]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:38 np0005486759.ooo.test sudo[80310]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpikcxbc4c/privsep.sock
Oct 14 08:35:38 np0005486759.ooo.test sudo[80310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:39 np0005486759.ooo.test sudo[80310]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:39 np0005486759.ooo.test sudo[80321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnvlzftcj/privsep.sock
Oct 14 08:35:39 np0005486759.ooo.test sudo[80321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:40 np0005486759.ooo.test sudo[80321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: tmp-crun.GZuugC.mount: Deactivated successfully.
Oct 14 08:35:40 np0005486759.ooo.test podman[80328]: 2025-10-14 08:35:40.340773663 +0000 UTC m=+0.082938590 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, version=17.1.9, container_name=nova_compute)
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: tmp-crun.uvM67H.mount: Deactivated successfully.
Oct 14 08:35:40 np0005486759.ooo.test podman[80327]: 2025-10-14 08:35:40.385598906 +0000 UTC m=+0.130336383 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Oct 14 08:35:40 np0005486759.ooo.test podman[80328]: 2025-10-14 08:35:40.392284834 +0000 UTC m=+0.134449721 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:35:40 np0005486759.ooo.test podman[80327]: 2025-10-14 08:35:40.421352237 +0000 UTC m=+0.166089724 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15)
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:35:40 np0005486759.ooo.test podman[80329]: 2025-10-14 08:35:40.43912156 +0000 UTC m=+0.179614545 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.9)
Oct 14 08:35:40 np0005486759.ooo.test podman[80329]: 2025-10-14 08:35:40.449300316 +0000 UTC m=+0.189793371 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true)
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:35:40 np0005486759.ooo.test sudo[80405]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp05w5tvhf/privsep.sock
Oct 14 08:35:40 np0005486759.ooo.test sudo[80405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:40 np0005486759.ooo.test podman[80330]: 2025-10-14 08:35:40.495058209 +0000 UTC m=+0.235019417 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 08:35:40 np0005486759.ooo.test podman[80330]: 2025-10-14 08:35:40.740344754 +0000 UTC m=+0.480306002 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git)
Oct 14 08:35:40 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:35:41 np0005486759.ooo.test sudo[80405]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:41 np0005486759.ooo.test sudo[80434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqsutn1sr/privsep.sock
Oct 14 08:35:41 np0005486759.ooo.test sudo[80434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:41 np0005486759.ooo.test sudo[80434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:42 np0005486759.ooo.test sudo[80451]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbl6xfi_3/privsep.sock
Oct 14 08:35:42 np0005486759.ooo.test sudo[80451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:42 np0005486759.ooo.test sudo[80451]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:43 np0005486759.ooo.test sudo[80462]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaqh1caqw/privsep.sock
Oct 14 08:35:43 np0005486759.ooo.test sudo[80462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:43 np0005486759.ooo.test sudo[80462]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:43 np0005486759.ooo.test sudo[80473]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzym99f6a/privsep.sock
Oct 14 08:35:43 np0005486759.ooo.test sudo[80473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:44 np0005486759.ooo.test sudo[80473]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:44 np0005486759.ooo.test sudo[80484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi1oa3210/privsep.sock
Oct 14 08:35:44 np0005486759.ooo.test sudo[80484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:45 np0005486759.ooo.test sudo[80484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:45 np0005486759.ooo.test sudo[80495]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpimnd3q5u/privsep.sock
Oct 14 08:35:45 np0005486759.ooo.test sudo[80495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:46 np0005486759.ooo.test sudo[80495]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:46 np0005486759.ooo.test sudo[80506]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn_8iloht/privsep.sock
Oct 14 08:35:46 np0005486759.ooo.test sudo[80506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:47 np0005486759.ooo.test sudo[80506]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:47 np0005486759.ooo.test sudo[80517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpty_yr00t/privsep.sock
Oct 14 08:35:47 np0005486759.ooo.test sudo[80517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:47 np0005486759.ooo.test sudo[80517]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:48 np0005486759.ooo.test sudo[80534]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppye6zrc_/privsep.sock
Oct 14 08:35:48 np0005486759.ooo.test sudo[80534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:48 np0005486759.ooo.test sudo[80534]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:49 np0005486759.ooo.test sudo[80545]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprvsdhj71/privsep.sock
Oct 14 08:35:49 np0005486759.ooo.test sudo[80545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:49 np0005486759.ooo.test sudo[80545]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:49 np0005486759.ooo.test sudo[80556]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_7ehwkwe/privsep.sock
Oct 14 08:35:49 np0005486759.ooo.test sudo[80556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:50 np0005486759.ooo.test sudo[80556]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:50 np0005486759.ooo.test sudo[80567]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbnzgygaw/privsep.sock
Oct 14 08:35:50 np0005486759.ooo.test sudo[80567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:51 np0005486759.ooo.test sudo[80567]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:51 np0005486759.ooo.test sudo[80578]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2kn3zvve/privsep.sock
Oct 14 08:35:51 np0005486759.ooo.test sudo[80578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:52 np0005486759.ooo.test sudo[80578]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:52 np0005486759.ooo.test sudo[80591]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5csg0r3a/privsep.sock
Oct 14 08:35:52 np0005486759.ooo.test sudo[80591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:35:52 np0005486759.ooo.test podman[80594]: 2025-10-14 08:35:52.806254989 +0000 UTC m=+0.083481835 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.9, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: tmp-crun.x3wUlH.mount: Deactivated successfully.
Oct 14 08:35:52 np0005486759.ooo.test podman[80594]: 2025-10-14 08:35:52.861516592 +0000 UTC m=+0.138743498 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:35:52 np0005486759.ooo.test podman[80594]: unhealthy
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:35:52 np0005486759.ooo.test podman[80595]: 2025-10-14 08:35:52.912792861 +0000 UTC m=+0.187202700 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:35:52 np0005486759.ooo.test podman[80595]: 2025-10-14 08:35:52.926322644 +0000 UTC m=+0.200732693 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:35:52 np0005486759.ooo.test podman[80595]: unhealthy
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:35:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:35:52 np0005486759.ooo.test podman[80593]: 2025-10-14 08:35:52.864738983 +0000 UTC m=+0.145942763 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public)
Oct 14 08:35:52 np0005486759.ooo.test podman[80593]: 2025-10-14 08:35:52.995022217 +0000 UTC m=+0.276225987 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.9, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, build-date=2025-07-21T13:07:52, name=rhosp17/openstack-cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Oct 14 08:35:53 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:35:53 np0005486759.ooo.test sudo[80591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:53 np0005486759.ooo.test sudo[80668]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcuhg4zfx/privsep.sock
Oct 14 08:35:53 np0005486759.ooo.test sudo[80668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:54 np0005486759.ooo.test sudo[80668]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:54 np0005486759.ooo.test sudo[80679]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpii3lacmr/privsep.sock
Oct 14 08:35:54 np0005486759.ooo.test sudo[80679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:55 np0005486759.ooo.test sudo[80679]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:55 np0005486759.ooo.test sudo[80690]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxz9ue26m/privsep.sock
Oct 14 08:35:55 np0005486759.ooo.test sudo[80690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:55 np0005486759.ooo.test sudo[80690]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:56 np0005486759.ooo.test sudo[80701]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpphbw81ff/privsep.sock
Oct 14 08:35:56 np0005486759.ooo.test sudo[80701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:56 np0005486759.ooo.test sudo[80701]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:57 np0005486759.ooo.test sudo[80712]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi5qucaro/privsep.sock
Oct 14 08:35:57 np0005486759.ooo.test sudo[80712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:57 np0005486759.ooo.test sudo[80712]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:35:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:35:57 np0005486759.ooo.test systemd[1]: tmp-crun.oQaD4s.mount: Deactivated successfully.
Oct 14 08:35:57 np0005486759.ooo.test podman[80719]: 2025-10-14 08:35:57.833734274 +0000 UTC m=+0.087338215 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, version=17.1.9, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team)
Oct 14 08:35:57 np0005486759.ooo.test podman[80719]: 2025-10-14 08:35:57.856313888 +0000 UTC m=+0.109917849 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, version=17.1.9, tcib_managed=true, release=1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:35:57 np0005486759.ooo.test podman[80716]: 2025-10-14 08:35:57.806000579 +0000 UTC m=+0.066068992 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, version=17.1.9, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team)
Oct 14 08:35:57 np0005486759.ooo.test podman[80716]: 2025-10-14 08:35:57.887264103 +0000 UTC m=+0.147332516 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 08:35:57 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:35:57 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:35:58 np0005486759.ooo.test sudo[80770]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnor084yt/privsep.sock
Oct 14 08:35:58 np0005486759.ooo.test sudo[80770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:58 np0005486759.ooo.test sudo[80770]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:58 np0005486759.ooo.test sudo[80787]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp44umlikn/privsep.sock
Oct 14 08:35:58 np0005486759.ooo.test sudo[80787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:35:59 np0005486759.ooo.test sudo[80787]: pam_unix(sudo:session): session closed for user root
Oct 14 08:35:59 np0005486759.ooo.test sudo[80798]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4k7kta_b/privsep.sock
Oct 14 08:35:59 np0005486759.ooo.test sudo[80798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:00 np0005486759.ooo.test sudo[80798]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:00 np0005486759.ooo.test sudo[80809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvo80vlbk/privsep.sock
Oct 14 08:36:00 np0005486759.ooo.test sudo[80809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:01 np0005486759.ooo.test sudo[80809]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:01 np0005486759.ooo.test sudo[80820]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdo_htwdw/privsep.sock
Oct 14 08:36:01 np0005486759.ooo.test sudo[80820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:01 np0005486759.ooo.test sudo[80820]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:02 np0005486759.ooo.test sudo[80831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7kvsoaf8/privsep.sock
Oct 14 08:36:02 np0005486759.ooo.test sudo[80831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:02 np0005486759.ooo.test sudo[80831]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:02 np0005486759.ooo.test sudo[80842]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprkvwdbqz/privsep.sock
Oct 14 08:36:02 np0005486759.ooo.test sudo[80842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:03 np0005486759.ooo.test sudo[80842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:03 np0005486759.ooo.test sudo[80858]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqtmqmzah/privsep.sock
Oct 14 08:36:03 np0005486759.ooo.test sudo[80858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:04 np0005486759.ooo.test sudo[80858]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:04 np0005486759.ooo.test sudo[80870]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnly6o5lm/privsep.sock
Oct 14 08:36:04 np0005486759.ooo.test sudo[80870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:36:04 np0005486759.ooo.test systemd[1]: tmp-crun.2bdXBH.mount: Deactivated successfully.
Oct 14 08:36:04 np0005486759.ooo.test podman[80872]: 2025-10-14 08:36:04.719484427 +0000 UTC m=+0.074288819 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4)
Oct 14 08:36:05 np0005486759.ooo.test podman[80872]: 2025-10-14 08:36:05.093356257 +0000 UTC m=+0.448160649 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.9, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20250721.1, tcib_managed=true, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64)
Oct 14 08:36:05 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:36:05 np0005486759.ooo.test sudo[80870]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:05 np0005486759.ooo.test sudo[80905]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnltm9_rn/privsep.sock
Oct 14 08:36:05 np0005486759.ooo.test sudo[80905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:06 np0005486759.ooo.test sudo[80905]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:06 np0005486759.ooo.test sudo[80916]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsbj3ayrq/privsep.sock
Oct 14 08:36:06 np0005486759.ooo.test sudo[80916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:06 np0005486759.ooo.test sudo[80916]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:07 np0005486759.ooo.test sudo[80927]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp12xvy_0/privsep.sock
Oct 14 08:36:07 np0005486759.ooo.test sudo[80927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:07 np0005486759.ooo.test sudo[80927]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:07 np0005486759.ooo.test sudo[80938]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbe5mph2e/privsep.sock
Oct 14 08:36:07 np0005486759.ooo.test sudo[80938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:08 np0005486759.ooo.test sudo[80938]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:08 np0005486759.ooo.test sudo[80951]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnkgwpzrm/privsep.sock
Oct 14 08:36:08 np0005486759.ooo.test sudo[80951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:09 np0005486759.ooo.test sudo[80951]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:09 np0005486759.ooo.test sudo[80966]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpisitw1ht/privsep.sock
Oct 14 08:36:09 np0005486759.ooo.test sudo[80966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:10 np0005486759.ooo.test sudo[80966]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:10 np0005486759.ooo.test sudo[80977]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2uudda3z/privsep.sock
Oct 14 08:36:10 np0005486759.ooo.test sudo[80977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:36:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:36:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:36:10 np0005486759.ooo.test podman[80980]: 2025-10-14 08:36:10.762289439 +0000 UTC m=+0.123501533 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, release=1, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64)
Oct 14 08:36:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:36:10 np0005486759.ooo.test podman[80980]: 2025-10-14 08:36:10.800678597 +0000 UTC m=+0.161890701 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, version=17.1.9, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:36:10 np0005486759.ooo.test podman[80981]: 2025-10-14 08:36:10.814115345 +0000 UTC m=+0.170176728 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:36:10 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:36:10 np0005486759.ooo.test podman[80981]: 2025-10-14 08:36:10.826301146 +0000 UTC m=+0.182362569 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:36:10 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:36:10 np0005486759.ooo.test podman[80979]: 2025-10-14 08:36:10.739995093 +0000 UTC m=+0.103638213 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, 
vendor=Red Hat, Inc., container_name=iscsid, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:36:10 np0005486759.ooo.test podman[81025]: 2025-10-14 08:36:10.895040489 +0000 UTC m=+0.111428656 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Oct 14 08:36:10 np0005486759.ooo.test podman[80979]: 2025-10-14 08:36:10.918570073 +0000 UTC m=+0.282213203 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-type=git, version=17.1.9, architecture=x86_64)
Oct 14 08:36:10 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:36:11 np0005486759.ooo.test podman[81025]: 2025-10-14 08:36:11.0913122 +0000 UTC m=+0.307700377 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, release=1, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, 
io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:36:11 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:36:11 np0005486759.ooo.test sudo[80977]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:11 np0005486759.ooo.test sudo[81079]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr5urq5ui/privsep.sock
Oct 14 08:36:11 np0005486759.ooo.test sudo[81079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:12 np0005486759.ooo.test sudo[81079]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:12 np0005486759.ooo.test sudo[81090]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6s12euc8/privsep.sock
Oct 14 08:36:12 np0005486759.ooo.test sudo[81090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:12 np0005486759.ooo.test sudo[81090]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:13 np0005486759.ooo.test sudo[81101]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf1xvzosu/privsep.sock
Oct 14 08:36:13 np0005486759.ooo.test sudo[81101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:13 np0005486759.ooo.test sudo[81101]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:13 np0005486759.ooo.test sudo[81112]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkt93b36_/privsep.sock
Oct 14 08:36:13 np0005486759.ooo.test sudo[81112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:14 np0005486759.ooo.test sudo[81112]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:14 np0005486759.ooo.test sudo[81129]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy6d46f3s/privsep.sock
Oct 14 08:36:14 np0005486759.ooo.test sudo[81129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:15 np0005486759.ooo.test sudo[81129]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:15 np0005486759.ooo.test sudo[81140]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg57af8o9/privsep.sock
Oct 14 08:36:15 np0005486759.ooo.test sudo[81140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:16 np0005486759.ooo.test sudo[81140]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:16 np0005486759.ooo.test sudo[81151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppjcv5e38/privsep.sock
Oct 14 08:36:16 np0005486759.ooo.test sudo[81151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:17 np0005486759.ooo.test sudo[81151]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:17 np0005486759.ooo.test sudo[81162]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj4batbnp/privsep.sock
Oct 14 08:36:17 np0005486759.ooo.test sudo[81162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:17 np0005486759.ooo.test sudo[81162]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:18 np0005486759.ooo.test sudo[81173]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpor54saxa/privsep.sock
Oct 14 08:36:18 np0005486759.ooo.test sudo[81173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:18 np0005486759.ooo.test sudo[81173]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:19 np0005486759.ooo.test sudo[81184]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwxynjumw/privsep.sock
Oct 14 08:36:19 np0005486759.ooo.test sudo[81184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:19 np0005486759.ooo.test sudo[81184]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:19 np0005486759.ooo.test sudo[81200]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbef3xr2x/privsep.sock
Oct 14 08:36:19 np0005486759.ooo.test sudo[81200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:20 np0005486759.ooo.test sudo[81200]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:20 np0005486759.ooo.test sudo[81212]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnvg_ewly/privsep.sock
Oct 14 08:36:20 np0005486759.ooo.test sudo[81212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:21 np0005486759.ooo.test sudo[81212]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:21 np0005486759.ooo.test sudo[81223]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ajea3ex/privsep.sock
Oct 14 08:36:21 np0005486759.ooo.test sudo[81223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:22 np0005486759.ooo.test sudo[81223]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:22 np0005486759.ooo.test sudo[81234]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0iie0rmn/privsep.sock
Oct 14 08:36:22 np0005486759.ooo.test sudo[81234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:23 np0005486759.ooo.test sudo[81234]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:36:23 np0005486759.ooo.test podman[81240]: 2025-10-14 08:36:23.200247881 +0000 UTC m=+0.068085634 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, 
container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true)
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: tmp-crun.wRE70F.mount: Deactivated successfully.
Oct 14 08:36:23 np0005486759.ooo.test podman[81242]: 2025-10-14 08:36:23.270141731 +0000 UTC m=+0.131956376 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:36:23 np0005486759.ooo.test podman[81241]: 2025-10-14 08:36:23.322785413 +0000 UTC m=+0.186378274 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, 
batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:36:23 np0005486759.ooo.test podman[81241]: 2025-10-14 08:36:23.333721603 +0000 UTC m=+0.197314434 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, container_name=ceilometer_agent_compute)
Oct 14 08:36:23 np0005486759.ooo.test podman[81241]: unhealthy
Oct 14 08:36:23 np0005486759.ooo.test podman[81242]: 2025-10-14 08:36:23.343623043 +0000 UTC m=+0.205437738 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:36:23 np0005486759.ooo.test podman[81242]: unhealthy
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:36:23 np0005486759.ooo.test podman[81240]: 2025-10-14 08:36:23.38618365 +0000 UTC m=+0.254021493 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, version=17.1.9, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:36:23 np0005486759.ooo.test sudo[81298]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgu2zi_p0/privsep.sock
Oct 14 08:36:23 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:36:23 np0005486759.ooo.test sudo[81298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:24 np0005486759.ooo.test sudo[81298]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:24 np0005486759.ooo.test sudo[81309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe9ghta3v/privsep.sock
Oct 14 08:36:24 np0005486759.ooo.test sudo[81309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:24 np0005486759.ooo.test sudo[81309]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:25 np0005486759.ooo.test sudo[81322]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph16guau_/privsep.sock
Oct 14 08:36:25 np0005486759.ooo.test sudo[81322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:25 np0005486759.ooo.test sudo[81322]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:25 np0005486759.ooo.test sudo[81337]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcu8bi8lz/privsep.sock
Oct 14 08:36:25 np0005486759.ooo.test sudo[81337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:26 np0005486759.ooo.test sudo[81337]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:26 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:36:26 np0005486759.ooo.test recover_tripleo_nova_virtqemud[81344]: 47951
Oct 14 08:36:26 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:36:26 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:36:26 np0005486759.ooo.test sudo[81350]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb75ja0ud/privsep.sock
Oct 14 08:36:26 np0005486759.ooo.test sudo[81350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:27 np0005486759.ooo.test sudo[81350]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:27 np0005486759.ooo.test sudo[81361]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyzs87ejq/privsep.sock
Oct 14 08:36:27 np0005486759.ooo.test sudo[81361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:28 np0005486759.ooo.test sudo[81361]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:36:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:36:28 np0005486759.ooo.test podman[81367]: 2025-10-14 08:36:28.350861376 +0000 UTC m=+0.068178137 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public)
Oct 14 08:36:28 np0005486759.ooo.test podman[81368]: 2025-10-14 08:36:28.361931011 +0000 UTC m=+0.073935847 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, batch=17.1_20250721.1, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:36:28 np0005486759.ooo.test podman[81368]: 2025-10-14 08:36:28.381239203 +0000 UTC m=+0.093244059 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1)
Oct 14 08:36:28 np0005486759.ooo.test podman[81367]: 2025-10-14 08:36:28.389757649 +0000 UTC m=+0.107074380 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, version=17.1.9, container_name=ovn_metadata_agent, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:36:28 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:36:28 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:36:28 np0005486759.ooo.test sudo[81419]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvq6bb2lh/privsep.sock
Oct 14 08:36:28 np0005486759.ooo.test sudo[81419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:29 np0005486759.ooo.test sudo[81419]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:29 np0005486759.ooo.test sudo[81430]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxzori9g4/privsep.sock
Oct 14 08:36:29 np0005486759.ooo.test sudo[81430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:29 np0005486759.ooo.test sudo[81430]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:30 np0005486759.ooo.test sudo[81441]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8w9wlduv/privsep.sock
Oct 14 08:36:30 np0005486759.ooo.test sudo[81441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:30 np0005486759.ooo.test sudo[81441]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:31 np0005486759.ooo.test sudo[81458]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_hdc0u74/privsep.sock
Oct 14 08:36:31 np0005486759.ooo.test sudo[81458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:31 np0005486759.ooo.test sudo[81458]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:32 np0005486759.ooo.test sudo[81469]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5q0ho5kz/privsep.sock
Oct 14 08:36:32 np0005486759.ooo.test sudo[81469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:32 np0005486759.ooo.test sudo[81469]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:32 np0005486759.ooo.test sudo[81480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkaxhauqx/privsep.sock
Oct 14 08:36:32 np0005486759.ooo.test sudo[81480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:33 np0005486759.ooo.test sudo[81480]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:33 np0005486759.ooo.test sudo[81491]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp_amc9cm/privsep.sock
Oct 14 08:36:33 np0005486759.ooo.test sudo[81491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:34 np0005486759.ooo.test sudo[81491]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:34 np0005486759.ooo.test sudo[81502]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwtqvk0ju/privsep.sock
Oct 14 08:36:34 np0005486759.ooo.test sudo[81502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:35 np0005486759.ooo.test sudo[81502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:36:35 np0005486759.ooo.test podman[81507]: 2025-10-14 08:36:35.435032298 +0000 UTC m=+0.076988182 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
container_name=nova_migration_target, vcs-type=git, architecture=x86_64, release=1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team)
Oct 14 08:36:35 np0005486759.ooo.test sudo[81535]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkne_kg38/privsep.sock
Oct 14 08:36:35 np0005486759.ooo.test sudo[81535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:35 np0005486759.ooo.test podman[81507]: 2025-10-14 08:36:35.80571599 +0000 UTC m=+0.447671874 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, 
name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:36:35 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:36:36 np0005486759.ooo.test sudo[81535]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:36 np0005486759.ooo.test sudo[81553]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt5oaw2yc/privsep.sock
Oct 14 08:36:36 np0005486759.ooo.test sudo[81553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:37 np0005486759.ooo.test sudo[81553]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:37 np0005486759.ooo.test sudo[81564]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0tudfas4/privsep.sock
Oct 14 08:36:37 np0005486759.ooo.test sudo[81564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:37 np0005486759.ooo.test sudo[81564]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:38 np0005486759.ooo.test sudo[81575]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8b8oz9x6/privsep.sock
Oct 14 08:36:38 np0005486759.ooo.test sudo[81575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:38 np0005486759.ooo.test sudo[81575]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:38 np0005486759.ooo.test sudo[81586]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpij79r81m/privsep.sock
Oct 14 08:36:39 np0005486759.ooo.test sudo[81586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:39 np0005486759.ooo.test sudo[81586]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:39 np0005486759.ooo.test sudo[81597]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp589hvml5/privsep.sock
Oct 14 08:36:39 np0005486759.ooo.test sudo[81597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:40 np0005486759.ooo.test sudo[81597]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:40 np0005486759.ooo.test sudo[81608]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7zvv1px5/privsep.sock
Oct 14 08:36:40 np0005486759.ooo.test sudo[81608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:36:41 np0005486759.ooo.test sudo[81608]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: tmp-crun.ZXOdzu.mount: Deactivated successfully.
Oct 14 08:36:41 np0005486759.ooo.test podman[81616]: 2025-10-14 08:36:41.46084576 +0000 UTC m=+0.084178966 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, architecture=x86_64, build-date=2025-07-21T14:48:37)
Oct 14 08:36:41 np0005486759.ooo.test podman[81616]: 2025-10-14 08:36:41.485309433 +0000 UTC m=+0.108642629 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, release=1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team)
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:36:41 np0005486759.ooo.test podman[81617]: 2025-10-14 08:36:41.435551181 +0000 UTC m=+0.060835129 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.33.12, container_name=collectd, version=17.1.9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, 
name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:36:41 np0005486759.ooo.test podman[81615]: 2025-10-14 08:36:41.550058193 +0000 UTC m=+0.180459529 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, release=1, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Oct 14 08:36:41 np0005486759.ooo.test podman[81617]: 2025-10-14 08:36:41.567339611 +0000 UTC m=+0.192623609 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-collectd, release=2, batch=17.1_20250721.1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, container_name=collectd, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:36:41 np0005486759.ooo.test podman[81615]: 2025-10-14 08:36:41.583297109 +0000 UTC m=+0.213698455 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, version=17.1.9, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, 
managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:36:41 np0005486759.ooo.test sudo[81699]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbgl4is15/privsep.sock
Oct 14 08:36:41 np0005486759.ooo.test sudo[81699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:41 np0005486759.ooo.test podman[81623]: 2025-10-14 08:36:41.65385039 +0000 UTC m=+0.278501257 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, tcib_managed=true, version=17.1.9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:36:41 np0005486759.ooo.test podman[81623]: 2025-10-14 08:36:41.843468194 +0000 UTC m=+0.468119091 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:36:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:36:42 np0005486759.ooo.test sudo[81699]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:42 np0005486759.ooo.test systemd[1]: tmp-crun.ceONMy.mount: Deactivated successfully.
Oct 14 08:36:42 np0005486759.ooo.test sudo[81729]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpasy1asbi/privsep.sock
Oct 14 08:36:42 np0005486759.ooo.test sudo[81729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:43 np0005486759.ooo.test sudo[81729]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:43 np0005486759.ooo.test sudo[81740]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp30_0deaw/privsep.sock
Oct 14 08:36:43 np0005486759.ooo.test sudo[81740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:43 np0005486759.ooo.test sudo[81740]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:44 np0005486759.ooo.test sudo[81751]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm864mecj/privsep.sock
Oct 14 08:36:44 np0005486759.ooo.test sudo[81751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:44 np0005486759.ooo.test sudo[81751]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:45 np0005486759.ooo.test sudo[81762]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9axb7xee/privsep.sock
Oct 14 08:36:45 np0005486759.ooo.test sudo[81762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:45 np0005486759.ooo.test sudo[81762]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:45 np0005486759.ooo.test sudo[81773]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplgehxve9/privsep.sock
Oct 14 08:36:45 np0005486759.ooo.test sudo[81773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:46 np0005486759.ooo.test sudo[81773]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:46 np0005486759.ooo.test sudo[81786]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvm_2ujh_/privsep.sock
Oct 14 08:36:46 np0005486759.ooo.test sudo[81786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:47 np0005486759.ooo.test sudo[81786]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:47 np0005486759.ooo.test sudo[81801]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcvqgf_ge/privsep.sock
Oct 14 08:36:47 np0005486759.ooo.test sudo[81801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:48 np0005486759.ooo.test sudo[81801]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:48 np0005486759.ooo.test sudo[81812]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl904r5s5/privsep.sock
Oct 14 08:36:48 np0005486759.ooo.test sudo[81812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:48 np0005486759.ooo.test sudo[81812]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:49 np0005486759.ooo.test sudo[81823]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn0oqve2m/privsep.sock
Oct 14 08:36:49 np0005486759.ooo.test sudo[81823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:49 np0005486759.ooo.test sudo[81823]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:49 np0005486759.ooo.test sudo[81834]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsri1dsuo/privsep.sock
Oct 14 08:36:49 np0005486759.ooo.test sudo[81834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:50 np0005486759.ooo.test sudo[81834]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:50 np0005486759.ooo.test sudo[81845]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw2lilwho/privsep.sock
Oct 14 08:36:50 np0005486759.ooo.test sudo[81845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:51 np0005486759.ooo.test sudo[81845]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:51 np0005486759.ooo.test sudo[81856]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkhjp9i03/privsep.sock
Oct 14 08:36:51 np0005486759.ooo.test sudo[81856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:52 np0005486759.ooo.test sudo[81856]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:52 np0005486759.ooo.test sudo[81873]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph8l2s6wk/privsep.sock
Oct 14 08:36:52 np0005486759.ooo.test sudo[81873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:53 np0005486759.ooo.test sudo[81873]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:53 np0005486759.ooo.test sudo[81884]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyka_9oul/privsep.sock
Oct 14 08:36:53 np0005486759.ooo.test sudo[81884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: tmp-crun.TakK8h.mount: Deactivated successfully.
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: tmp-crun.vWOzdY.mount: Deactivated successfully.
Oct 14 08:36:53 np0005486759.ooo.test podman[81887]: 2025-10-14 08:36:53.450033797 +0000 UTC m=+0.075893538 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, 
com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc.)
Oct 14 08:36:53 np0005486759.ooo.test podman[81887]: 2025-10-14 08:36:53.4632586 +0000 UTC m=+0.089118321 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47)
Oct 14 08:36:53 np0005486759.ooo.test podman[81887]: unhealthy
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:36:53 np0005486759.ooo.test podman[81886]: 2025-10-14 08:36:53.45172325 +0000 UTC m=+0.084955181 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.9, build-date=2025-07-21T14:45:33, 
distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:36:53 np0005486759.ooo.test podman[81913]: 2025-10-14 08:36:53.509790201 +0000 UTC m=+0.063038017 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, release=1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, config_id=tripleo_step4, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack 
Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc.)
Oct 14 08:36:53 np0005486759.ooo.test podman[81913]: 2025-10-14 08:36:53.515009214 +0000 UTC m=+0.068257080 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:36:53 np0005486759.ooo.test podman[81886]: 2025-10-14 08:36:53.536408161 +0000 UTC m=+0.169640082 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, batch=17.1_20250721.1)
Oct 14 08:36:53 np0005486759.ooo.test podman[81886]: unhealthy
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:36:53 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:36:53 np0005486759.ooo.test sudo[81884]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:54 np0005486759.ooo.test sudo[81950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz5fpurph/privsep.sock
Oct 14 08:36:54 np0005486759.ooo.test sudo[81950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:54 np0005486759.ooo.test sudo[81950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:55 np0005486759.ooo.test sudo[81961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph4nz5xby/privsep.sock
Oct 14 08:36:55 np0005486759.ooo.test sudo[81961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:55 np0005486759.ooo.test sudo[81961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:55 np0005486759.ooo.test sudo[81972]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjww2_wl5/privsep.sock
Oct 14 08:36:55 np0005486759.ooo.test sudo[81972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:56 np0005486759.ooo.test sudo[81972]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:56 np0005486759.ooo.test sudo[81983]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp068b0lp/privsep.sock
Oct 14 08:36:56 np0005486759.ooo.test sudo[81983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:57 np0005486759.ooo.test sudo[81983]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:57 np0005486759.ooo.test sudo[81997]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgzw5wjrw/privsep.sock
Oct 14 08:36:57 np0005486759.ooo.test sudo[81997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:58 np0005486759.ooo.test sudo[81997]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:58 np0005486759.ooo.test sudo[82011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi96t7ndg/privsep.sock
Oct 14 08:36:58 np0005486759.ooo.test sudo[82011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:58 np0005486759.ooo.test sudo[82011]: pam_unix(sudo:session): session closed for user root
Oct 14 08:36:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:36:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:36:58 np0005486759.ooo.test podman[82018]: 2025-10-14 08:36:58.98339386 +0000 UTC m=+0.073799292 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:36:59 np0005486759.ooo.test podman[82018]: 2025-10-14 08:36:59.006667976 +0000 UTC m=+0.097073408 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:36:59 np0005486759.ooo.test podman[82015]: 2025-10-14 08:36:59.018860977 +0000 UTC m=+0.111797398 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, release=1, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=)
Oct 14 08:36:59 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:36:59 np0005486759.ooo.test podman[82015]: 2025-10-14 08:36:59.085093012 +0000 UTC m=+0.178029493 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, release=1, vcs-type=git, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:36:59 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:36:59 np0005486759.ooo.test sudo[82069]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb4rhi2nq/privsep.sock
Oct 14 08:36:59 np0005486759.ooo.test sudo[82069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:36:59 np0005486759.ooo.test sudo[82069]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:00 np0005486759.ooo.test sudo[82080]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpefqhi702/privsep.sock
Oct 14 08:37:00 np0005486759.ooo.test sudo[82080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:00 np0005486759.ooo.test sudo[82080]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:00 np0005486759.ooo.test sudo[82091]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprcndsctp/privsep.sock
Oct 14 08:37:00 np0005486759.ooo.test sudo[82091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:01 np0005486759.ooo.test sudo[82091]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:01 np0005486759.ooo.test sudo[82102]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxvix7zwe/privsep.sock
Oct 14 08:37:01 np0005486759.ooo.test sudo[82102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:02 np0005486759.ooo.test sudo[82102]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:02 np0005486759.ooo.test sudo[82114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpunf5chsv/privsep.sock
Oct 14 08:37:02 np0005486759.ooo.test sudo[82114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:03 np0005486759.ooo.test sudo[82114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:03 np0005486759.ooo.test sudo[82131]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx6lx4z43/privsep.sock
Oct 14 08:37:03 np0005486759.ooo.test sudo[82131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:04 np0005486759.ooo.test sudo[82131]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:04 np0005486759.ooo.test sudo[82142]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpybwf6l6c/privsep.sock
Oct 14 08:37:04 np0005486759.ooo.test sudo[82142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:04 np0005486759.ooo.test sudo[82142]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:05 np0005486759.ooo.test sudo[82153]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz__j5vin/privsep.sock
Oct 14 08:37:05 np0005486759.ooo.test sudo[82153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:05 np0005486759.ooo.test sudo[82153]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:05 np0005486759.ooo.test sudo[82164]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpepzbymiv/privsep.sock
Oct 14 08:37:05 np0005486759.ooo.test sudo[82164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:37:06 np0005486759.ooo.test systemd[1]: tmp-crun.WMTq2Q.mount: Deactivated successfully.
Oct 14 08:37:06 np0005486759.ooo.test podman[82165]: 2025-10-14 08:37:06.018225293 +0000 UTC m=+0.066736152 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, release=1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:37:06 np0005486759.ooo.test podman[82165]: 2025-10-14 08:37:06.389462612 +0000 UTC m=+0.437973471 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37)
Oct 14 08:37:06 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:37:06 np0005486759.ooo.test sudo[82164]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:06 np0005486759.ooo.test sudo[82198]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd85nnstf/privsep.sock
Oct 14 08:37:06 np0005486759.ooo.test sudo[82198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:07 np0005486759.ooo.test sudo[82198]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:07 np0005486759.ooo.test sudo[82209]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkl9rfa4c/privsep.sock
Oct 14 08:37:07 np0005486759.ooo.test sudo[82209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:08 np0005486759.ooo.test sudo[82209]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:08 np0005486759.ooo.test sudo[82226]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6vmou00g/privsep.sock
Oct 14 08:37:08 np0005486759.ooo.test sudo[82226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:09 np0005486759.ooo.test sudo[82226]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:09 np0005486759.ooo.test sudo[82237]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp66z7amvt/privsep.sock
Oct 14 08:37:09 np0005486759.ooo.test sudo[82237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:09 np0005486759.ooo.test sudo[82237]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:10 np0005486759.ooo.test sudo[82248]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwfnk1hor/privsep.sock
Oct 14 08:37:10 np0005486759.ooo.test sudo[82248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:10 np0005486759.ooo.test sudo[82248]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:11 np0005486759.ooo.test sudo[82259]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxo0qp27o/privsep.sock
Oct 14 08:37:11 np0005486759.ooo.test sudo[82259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:11 np0005486759.ooo.test sudo[82259]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:37:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:37:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:37:11 np0005486759.ooo.test systemd[1]: tmp-crun.XHsTUM.mount: Deactivated successfully.
Oct 14 08:37:11 np0005486759.ooo.test podman[82265]: 2025-10-14 08:37:11.811552924 +0000 UTC m=+0.154980504 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, io.openshift.expose-services=, release=1, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15)
Oct 14 08:37:11 np0005486759.ooo.test podman[82265]: 2025-10-14 08:37:11.852486841 +0000 UTC m=+0.195914471 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc.)
Oct 14 08:37:11 np0005486759.ooo.test sudo[82325]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg7irq1d2/privsep.sock
Oct 14 08:37:11 np0005486759.ooo.test sudo[82325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:37:11 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:37:11 np0005486759.ooo.test podman[82266]: 2025-10-14 08:37:11.762127323 +0000 UTC m=+0.102870430 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, container_name=nova_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1)
Oct 14 08:37:12 np0005486759.ooo.test podman[82327]: 2025-10-14 08:37:11.995991817 +0000 UTC m=+0.100653441 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
distribution-scope=public, release=1, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:37:12 np0005486759.ooo.test podman[82266]: 2025-10-14 08:37:12.055277396 +0000 UTC m=+0.396020503 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:37:12 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:37:12 np0005486759.ooo.test podman[82267]: 2025-10-14 08:37:12.066175036 +0000 UTC m=+0.404085574 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, release=2, container_name=collectd, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:37:12 np0005486759.ooo.test podman[82267]: 2025-10-14 08:37:12.157411222 +0000 UTC m=+0.495321780 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, release=2, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Oct 14 08:37:12 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:37:12 np0005486759.ooo.test podman[82327]: 2025-10-14 08:37:12.250568697 +0000 UTC m=+0.355230341 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.9, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, distribution-scope=public, container_name=metrics_qdr)
Oct 14 08:37:12 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:37:12 np0005486759.ooo.test sudo[82325]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:12 np0005486759.ooo.test sudo[82370]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz25vjbsg/privsep.sock
Oct 14 08:37:12 np0005486759.ooo.test sudo[82370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:13 np0005486759.ooo.test sudo[82370]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:13 np0005486759.ooo.test sudo[82384]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptjpjenk0/privsep.sock
Oct 14 08:37:13 np0005486759.ooo.test sudo[82384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:14 np0005486759.ooo.test sudo[82384]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:14 np0005486759.ooo.test sudo[82398]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo8hlr7yb/privsep.sock
Oct 14 08:37:14 np0005486759.ooo.test sudo[82398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:15 np0005486759.ooo.test sudo[82398]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:15 np0005486759.ooo.test sudo[82409]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps9cesdd0/privsep.sock
Oct 14 08:37:15 np0005486759.ooo.test sudo[82409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:16 np0005486759.ooo.test sudo[82409]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:16 np0005486759.ooo.test sudo[82420]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcu4u3djj/privsep.sock
Oct 14 08:37:16 np0005486759.ooo.test sudo[82420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:16 np0005486759.ooo.test sudo[82420]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:17 np0005486759.ooo.test sudo[82431]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprh4u5ugv/privsep.sock
Oct 14 08:37:17 np0005486759.ooo.test sudo[82431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:17 np0005486759.ooo.test sudo[82431]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:17 np0005486759.ooo.test sudo[82442]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx4lz5uxf/privsep.sock
Oct 14 08:37:17 np0005486759.ooo.test sudo[82442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:18 np0005486759.ooo.test sudo[82442]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:18 np0005486759.ooo.test sudo[82453]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsiq85wor/privsep.sock
Oct 14 08:37:18 np0005486759.ooo.test sudo[82453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:19 np0005486759.ooo.test sudo[82453]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:19 np0005486759.ooo.test sudo[82470]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprzyxc1te/privsep.sock
Oct 14 08:37:19 np0005486759.ooo.test sudo[82470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:20 np0005486759.ooo.test sudo[82470]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:20 np0005486759.ooo.test sudo[82481]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpakbi2b2b/privsep.sock
Oct 14 08:37:20 np0005486759.ooo.test sudo[82481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:20 np0005486759.ooo.test sudo[82481]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:21 np0005486759.ooo.test sudo[82492]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph0snnhgs/privsep.sock
Oct 14 08:37:21 np0005486759.ooo.test sudo[82492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:21 np0005486759.ooo.test sudo[82492]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:22 np0005486759.ooo.test sudo[82503]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuryuqaec/privsep.sock
Oct 14 08:37:22 np0005486759.ooo.test sudo[82503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:22 np0005486759.ooo.test sudo[82503]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:22 np0005486759.ooo.test sudo[82514]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7ca47842/privsep.sock
Oct 14 08:37:22 np0005486759.ooo.test sudo[82514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:23 np0005486759.ooo.test sudo[82514]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: tmp-crun.YyYzCj.mount: Deactivated successfully.
Oct 14 08:37:23 np0005486759.ooo.test podman[82519]: 2025-10-14 08:37:23.614151921 +0000 UTC m=+0.065373209 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1)
Oct 14 08:37:23 np0005486759.ooo.test podman[82527]: 2025-10-14 08:37:23.645919733 +0000 UTC m=+0.085478538 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:37:23 np0005486759.ooo.test podman[82519]: 2025-10-14 08:37:23.653262861 +0000 UTC m=+0.104484189 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, release=1, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:37:23 np0005486759.ooo.test podman[82519]: unhealthy
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:37:23 np0005486759.ooo.test podman[82527]: 2025-10-14 08:37:23.708406101 +0000 UTC m=+0.147964926 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=1)
Oct 14 08:37:23 np0005486759.ooo.test podman[82527]: unhealthy
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:37:23 np0005486759.ooo.test podman[82521]: 2025-10-14 08:37:23.720922942 +0000 UTC m=+0.167193556 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.expose-services=, release=1)
Oct 14 08:37:23 np0005486759.ooo.test podman[82521]: 2025-10-14 08:37:23.731206462 +0000 UTC m=+0.177477086 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp17/openstack-cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:37:23 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:37:23 np0005486759.ooo.test sudo[82579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8li3rsr3/privsep.sock
Oct 14 08:37:23 np0005486759.ooo.test sudo[82579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:24 np0005486759.ooo.test sudo[82579]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:24 np0005486759.ooo.test sudo[82596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_dywkw_s/privsep.sock
Oct 14 08:37:24 np0005486759.ooo.test sudo[82596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:25 np0005486759.ooo.test sudo[82596]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:25 np0005486759.ooo.test sudo[82607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjmo8fp8x/privsep.sock
Oct 14 08:37:25 np0005486759.ooo.test sudo[82607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:26 np0005486759.ooo.test sudo[82607]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:26 np0005486759.ooo.test sudo[82618]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn7ydw9rl/privsep.sock
Oct 14 08:37:26 np0005486759.ooo.test sudo[82618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:27 np0005486759.ooo.test sudo[82618]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:27 np0005486759.ooo.test sudo[82629]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0_6bog9m/privsep.sock
Oct 14 08:37:27 np0005486759.ooo.test sudo[82629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:27 np0005486759.ooo.test sudo[82629]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:28 np0005486759.ooo.test sudo[82640]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpizinbin6/privsep.sock
Oct 14 08:37:28 np0005486759.ooo.test sudo[82640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:28 np0005486759.ooo.test sudo[82640]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:29 np0005486759.ooo.test sudo[82651]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpssvbevce/privsep.sock
Oct 14 08:37:29 np0005486759.ooo.test sudo[82651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:37:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:37:29 np0005486759.ooo.test systemd[1]: tmp-crun.tClVNP.mount: Deactivated successfully.
Oct 14 08:37:29 np0005486759.ooo.test podman[82652]: 2025-10-14 08:37:29.201476358 +0000 UTC m=+0.073852875 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:37:29 np0005486759.ooo.test podman[82652]: 2025-10-14 08:37:29.233421583 +0000 UTC m=+0.105798110 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Oct 14 08:37:29 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:37:29 np0005486759.ooo.test podman[82654]: 2025-10-14 08:37:29.271690337 +0000 UTC m=+0.139344447 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public)
Oct 14 08:37:29 np0005486759.ooo.test podman[82654]: 2025-10-14 08:37:29.294491108 +0000 UTC m=+0.162145288 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=ovn_controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:37:29 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:37:29 np0005486759.ooo.test sudo[82651]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:29 np0005486759.ooo.test sudo[82716]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk3b0rc5d/privsep.sock
Oct 14 08:37:29 np0005486759.ooo.test sudo[82716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:30 np0005486759.ooo.test sudo[82716]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:30 np0005486759.ooo.test sudo[82728]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj4ighb1y/privsep.sock
Oct 14 08:37:30 np0005486759.ooo.test sudo[82728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:31 np0005486759.ooo.test sudo[82728]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:31 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:37:31 np0005486759.ooo.test recover_tripleo_nova_virtqemud[82735]: 47951
Oct 14 08:37:31 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:37:31 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:37:31 np0005486759.ooo.test sudo[82741]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2w0d22xz/privsep.sock
Oct 14 08:37:31 np0005486759.ooo.test sudo[82741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:32 np0005486759.ooo.test sudo[82741]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:32 np0005486759.ooo.test sudo[82752]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6s_tnr32/privsep.sock
Oct 14 08:37:32 np0005486759.ooo.test sudo[82752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:33 np0005486759.ooo.test sudo[82752]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:33 np0005486759.ooo.test sudo[82763]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8fcz7nyo/privsep.sock
Oct 14 08:37:33 np0005486759.ooo.test sudo[82763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:34 np0005486759.ooo.test sudo[82763]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:34 np0005486759.ooo.test sudo[82774]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjjerkl9s/privsep.sock
Oct 14 08:37:34 np0005486759.ooo.test sudo[82774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:35 np0005486759.ooo.test sudo[82774]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:35 np0005486759.ooo.test sudo[82790]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgvkuzlrr/privsep.sock
Oct 14 08:37:35 np0005486759.ooo.test sudo[82790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:35 np0005486759.ooo.test sudo[82790]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:36 np0005486759.ooo.test sudo[82802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpitov7o92/privsep.sock
Oct 14 08:37:36 np0005486759.ooo.test sudo[82802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:36 np0005486759.ooo.test sudo[82802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:37:36 np0005486759.ooo.test systemd[1]: tmp-crun.TEFe3O.mount: Deactivated successfully.
Oct 14 08:37:36 np0005486759.ooo.test podman[82808]: 2025-10-14 08:37:36.819865251 +0000 UTC m=+0.067948480 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:37:36 np0005486759.ooo.test sudo[82835]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptz7lbyio/privsep.sock
Oct 14 08:37:36 np0005486759.ooo.test sudo[82835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:37 np0005486759.ooo.test podman[82808]: 2025-10-14 08:37:37.19847577 +0000 UTC m=+0.446559059 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37)
Oct 14 08:37:37 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:37:37 np0005486759.ooo.test sudo[82835]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:37 np0005486759.ooo.test sudo[82847]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_irlnhv7/privsep.sock
Oct 14 08:37:37 np0005486759.ooo.test sudo[82847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:38 np0005486759.ooo.test sudo[82847]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:38 np0005486759.ooo.test sudo[82858]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp94bdz2sc/privsep.sock
Oct 14 08:37:38 np0005486759.ooo.test sudo[82858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:39 np0005486759.ooo.test sudo[82858]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:39 np0005486759.ooo.test sudo[82869]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv6cvztar/privsep.sock
Oct 14 08:37:39 np0005486759.ooo.test sudo[82869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:40 np0005486759.ooo.test sudo[82869]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:40 np0005486759.ooo.test sudo[82882]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6ubov9s7/privsep.sock
Oct 14 08:37:40 np0005486759.ooo.test sudo[82882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:41 np0005486759.ooo.test sudo[82882]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:41 np0005486759.ooo.test sudo[82897]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppcfcrgj3/privsep.sock
Oct 14 08:37:41 np0005486759.ooo.test sudo[82897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:41 np0005486759.ooo.test sudo[82897]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:37:41 np0005486759.ooo.test podman[82901]: 2025-10-14 08:37:41.986099174 +0000 UTC m=+0.073579787 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=iscsid, version=17.1.9, batch=17.1_20250721.1, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Oct 14 08:37:41 np0005486759.ooo.test podman[82901]: 2025-10-14 08:37:41.991557984 +0000 UTC m=+0.079038417 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64)
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:37:42 np0005486759.ooo.test sudo[82928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo1v3b9ey/privsep.sock
Oct 14 08:37:42 np0005486759.ooo.test sudo[82928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: tmp-crun.hrySSH.mount: Deactivated successfully.
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:37:42 np0005486759.ooo.test podman[82930]: 2025-10-14 08:37:42.291268812 +0000 UTC m=+0.095521501 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, config_id=tripleo_step5, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true)
Oct 14 08:37:42 np0005486759.ooo.test podman[82930]: 2025-10-14 08:37:42.303252045 +0000 UTC m=+0.107504804 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, version=17.1.9, io.openshift.expose-services=, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, batch=17.1_20250721.1, vcs-type=git, managed_by=tripleo_ansible)
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:37:42 np0005486759.ooo.test podman[82951]: 2025-10-14 08:37:42.366134336 +0000 UTC m=+0.067491056 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, release=1, batch=17.1_20250721.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64)
Oct 14 08:37:42 np0005486759.ooo.test podman[82947]: 2025-10-14 08:37:42.369030627 +0000 UTC m=+0.070655515 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-07-21T13:04:03, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, release=2, vendor=Red Hat, Inc., vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:37:42 np0005486759.ooo.test podman[82947]: 2025-10-14 08:37:42.453300915 +0000 UTC m=+0.154925793 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd)
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:37:42 np0005486759.ooo.test podman[82951]: 2025-10-14 08:37:42.563961257 +0000 UTC m=+0.265317947 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, architecture=x86_64)
Oct 14 08:37:42 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:37:42 np0005486759.ooo.test sudo[82928]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:43 np0005486759.ooo.test sudo[83014]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphn0344_s/privsep.sock
Oct 14 08:37:43 np0005486759.ooo.test sudo[83014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:43 np0005486759.ooo.test sudo[83014]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:44 np0005486759.ooo.test sudo[83025]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwm2lxgfb/privsep.sock
Oct 14 08:37:44 np0005486759.ooo.test sudo[83025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:44 np0005486759.ooo.test sudo[83025]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:44 np0005486759.ooo.test sudo[83036]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqojwfahm/privsep.sock
Oct 14 08:37:44 np0005486759.ooo.test sudo[83036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:45 np0005486759.ooo.test sudo[83036]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:45 np0005486759.ooo.test sudo[83048]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp048n7020/privsep.sock
Oct 14 08:37:45 np0005486759.ooo.test sudo[83048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:46 np0005486759.ooo.test sudo[83048]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:46 np0005486759.ooo.test sudo[83065]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbzl2dgwa/privsep.sock
Oct 14 08:37:46 np0005486759.ooo.test sudo[83065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:47 np0005486759.ooo.test sudo[83065]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:47 np0005486759.ooo.test sudo[83076]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw8nvjlap/privsep.sock
Oct 14 08:37:47 np0005486759.ooo.test sudo[83076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:48 np0005486759.ooo.test sudo[83076]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:48 np0005486759.ooo.test sudo[83087]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp54f71hp4/privsep.sock
Oct 14 08:37:48 np0005486759.ooo.test sudo[83087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:49 np0005486759.ooo.test sudo[83087]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:49 np0005486759.ooo.test sudo[83098]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp84p1qn81/privsep.sock
Oct 14 08:37:49 np0005486759.ooo.test sudo[83098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:49 np0005486759.ooo.test sudo[83098]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:50 np0005486759.ooo.test sudo[83109]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphig_3t3l/privsep.sock
Oct 14 08:37:50 np0005486759.ooo.test sudo[83109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:50 np0005486759.ooo.test sudo[83109]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:51 np0005486759.ooo.test sudo[83120]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6bv3pfkf/privsep.sock
Oct 14 08:37:51 np0005486759.ooo.test sudo[83120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:51 np0005486759.ooo.test sudo[83120]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:52 np0005486759.ooo.test sudo[83137]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoz7eo6jx/privsep.sock
Oct 14 08:37:52 np0005486759.ooo.test sudo[83137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:52 np0005486759.ooo.test sudo[83137]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:52 np0005486759.ooo.test sudo[83148]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9gvi3toi/privsep.sock
Oct 14 08:37:52 np0005486759.ooo.test sudo[83148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:53 np0005486759.ooo.test sudo[83148]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:53 np0005486759.ooo.test sudo[83159]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpze3ldbf7/privsep.sock
Oct 14 08:37:53 np0005486759.ooo.test sudo[83159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: tmp-crun.3Y5XbO.mount: Deactivated successfully.
Oct 14 08:37:53 np0005486759.ooo.test podman[83162]: 2025-10-14 08:37:53.83429454 +0000 UTC m=+0.095224001 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, vcs-type=git, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1)
Oct 14 08:37:53 np0005486759.ooo.test podman[83162]: 2025-10-14 08:37:53.846886512 +0000 UTC m=+0.107815983 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack 
TripleO Team, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, managed_by=tripleo_ansible)
Oct 14 08:37:53 np0005486759.ooo.test podman[83162]: unhealthy
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: tmp-crun.SDv7U6.mount: Deactivated successfully.
Oct 14 08:37:53 np0005486759.ooo.test podman[83161]: 2025-10-14 08:37:53.937036405 +0000 UTC m=+0.197771500 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:37:53 np0005486759.ooo.test podman[83186]: 2025-10-14 08:37:53.969119975 +0000 UTC m=+0.137074476 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-cron, release=1, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, version=17.1.9, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12)
Oct 14 08:37:53 np0005486759.ooo.test podman[83161]: 2025-10-14 08:37:53.980340375 +0000 UTC m=+0.241075430 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 08:37:53 np0005486759.ooo.test podman[83161]: unhealthy
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:37:53 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:37:54 np0005486759.ooo.test podman[83186]: 2025-10-14 08:37:54.002790005 +0000 UTC m=+0.170744486 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, release=1)
Oct 14 08:37:54 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:37:54 np0005486759.ooo.test sudo[83159]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:54 np0005486759.ooo.test sudo[83225]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpivspfebc/privsep.sock
Oct 14 08:37:54 np0005486759.ooo.test sudo[83225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:55 np0005486759.ooo.test sudo[83225]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:55 np0005486759.ooo.test sudo[83236]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpewcfw8ea/privsep.sock
Oct 14 08:37:55 np0005486759.ooo.test sudo[83236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:55 np0005486759.ooo.test sudo[83236]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:56 np0005486759.ooo.test sudo[83248]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmyud80wd/privsep.sock
Oct 14 08:37:56 np0005486759.ooo.test sudo[83248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:56 np0005486759.ooo.test sudo[83248]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:57 np0005486759.ooo.test sudo[83265]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyzsoek5e/privsep.sock
Oct 14 08:37:57 np0005486759.ooo.test sudo[83265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:57 np0005486759.ooo.test sudo[83265]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:58 np0005486759.ooo.test sudo[83276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr9ahhsgu/privsep.sock
Oct 14 08:37:58 np0005486759.ooo.test sudo[83276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:58 np0005486759.ooo.test sudo[83276]: pam_unix(sudo:session): session closed for user root
Oct 14 08:37:59 np0005486759.ooo.test sudo[83287]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprkpmf6cx/privsep.sock
Oct 14 08:37:59 np0005486759.ooo.test sudo[83287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:37:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:37:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:37:59 np0005486759.ooo.test podman[83290]: 2025-10-14 08:37:59.465379881 +0000 UTC m=+0.087582213 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9)
Oct 14 08:37:59 np0005486759.ooo.test podman[83291]: 2025-10-14 08:37:59.519785798 +0000 UTC m=+0.141422212 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.33.12)
Oct 14 08:37:59 np0005486759.ooo.test podman[83291]: 2025-10-14 08:37:59.568522008 +0000 UTC m=+0.190158462 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:37:59 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:37:59 np0005486759.ooo.test podman[83290]: 2025-10-14 08:37:59.623545284 +0000 UTC m=+0.245747676 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:37:59 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:37:59 np0005486759.ooo.test sudo[83287]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:00 np0005486759.ooo.test sudo[83346]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7kemwplp/privsep.sock
Oct 14 08:38:00 np0005486759.ooo.test sudo[83346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:00 np0005486759.ooo.test sudo[83346]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:00 np0005486759.ooo.test sudo[83357]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk0xi7zk9/privsep.sock
Oct 14 08:38:00 np0005486759.ooo.test sudo[83357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:01 np0005486759.ooo.test sudo[83357]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:01 np0005486759.ooo.test sudo[83368]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjywb5f4c/privsep.sock
Oct 14 08:38:01 np0005486759.ooo.test sudo[83368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:02 np0005486759.ooo.test sudo[83368]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:02 np0005486759.ooo.test sudo[83385]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp60i8uv0r/privsep.sock
Oct 14 08:38:02 np0005486759.ooo.test sudo[83385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:03 np0005486759.ooo.test sudo[83385]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:03 np0005486759.ooo.test sudo[83396]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2bucz5y5/privsep.sock
Oct 14 08:38:03 np0005486759.ooo.test sudo[83396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:04 np0005486759.ooo.test sudo[83396]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:04 np0005486759.ooo.test sudo[83407]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6nj7zzh8/privsep.sock
Oct 14 08:38:04 np0005486759.ooo.test sudo[83407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:05 np0005486759.ooo.test sudo[83407]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:05 np0005486759.ooo.test sudo[83418]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptftfotgm/privsep.sock
Oct 14 08:38:05 np0005486759.ooo.test sudo[83418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:05 np0005486759.ooo.test sudo[83418]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:06 np0005486759.ooo.test sudo[83429]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ppk4gsa/privsep.sock
Oct 14 08:38:06 np0005486759.ooo.test sudo[83429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:06 np0005486759.ooo.test sudo[83429]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:07 np0005486759.ooo.test sudo[83440]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkpdtr_an/privsep.sock
Oct 14 08:38:07 np0005486759.ooo.test sudo[83440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:38:07 np0005486759.ooo.test podman[83443]: 2025-10-14 08:38:07.437076884 +0000 UTC m=+0.064078239 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, container_name=nova_migration_target, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, config_id=tripleo_step4, version=17.1.9)
Oct 14 08:38:07 np0005486759.ooo.test sudo[83440]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:07 np0005486759.ooo.test podman[83443]: 2025-10-14 08:38:07.820533844 +0000 UTC m=+0.447535169 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:38:07 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:38:07 np0005486759.ooo.test sudo[83480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp05vvy18f/privsep.sock
Oct 14 08:38:07 np0005486759.ooo.test sudo[83480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:08 np0005486759.ooo.test sudo[83480]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:08 np0005486759.ooo.test sudo[83491]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr0bknrc4/privsep.sock
Oct 14 08:38:08 np0005486759.ooo.test sudo[83491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:09 np0005486759.ooo.test sudo[83491]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:09 np0005486759.ooo.test sudo[83502]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn7x70poo/privsep.sock
Oct 14 08:38:09 np0005486759.ooo.test sudo[83502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:10 np0005486759.ooo.test sudo[83502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:10 np0005486759.ooo.test sudo[83513]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg9m97qx6/privsep.sock
Oct 14 08:38:10 np0005486759.ooo.test sudo[83513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:11 np0005486759.ooo.test sudo[83513]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:11 np0005486759.ooo.test sudo[83524]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwwhvpgla/privsep.sock
Oct 14 08:38:11 np0005486759.ooo.test sudo[83524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:12 np0005486759.ooo.test sudo[83524]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: tmp-crun.1qXqKO.mount: Deactivated successfully.
Oct 14 08:38:12 np0005486759.ooo.test podman[83528]: 2025-10-14 08:38:12.203644421 +0000 UTC m=+0.090275637 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1, tcib_managed=true, vcs-type=git, version=17.1.9, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container)
Oct 14 08:38:12 np0005486759.ooo.test podman[83528]: 2025-10-14 08:38:12.241700318 +0000 UTC m=+0.128331544 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:38:12 np0005486759.ooo.test sudo[83555]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzpvy6n_j/privsep.sock
Oct 14 08:38:12 np0005486759.ooo.test sudo[83555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:38:12 np0005486759.ooo.test podman[83556]: 2025-10-14 08:38:12.445439573 +0000 UTC m=+0.073444843 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:38:12 np0005486759.ooo.test podman[83556]: 2025-10-14 08:38:12.468515762 +0000 UTC m=+0.096521092 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5)
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:38:12 np0005486759.ooo.test podman[83583]: 2025-10-14 08:38:12.567608503 +0000 UTC m=+0.078795479 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, batch=17.1_20250721.1, architecture=x86_64)
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:38:12 np0005486759.ooo.test podman[83583]: 2025-10-14 08:38:12.6034224 +0000 UTC m=+0.114609406 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, container_name=collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, release=2)
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:38:12 np0005486759.ooo.test podman[83602]: 2025-10-14 08:38:12.682680312 +0000 UTC m=+0.080825542 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, release=1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, config_id=tripleo_step1)
Oct 14 08:38:12 np0005486759.ooo.test podman[83602]: 2025-10-14 08:38:12.909926289 +0000 UTC m=+0.308071519 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:38:12 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:38:13 np0005486759.ooo.test sudo[83555]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:13 np0005486759.ooo.test sudo[83645]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfrhkshrp/privsep.sock
Oct 14 08:38:13 np0005486759.ooo.test sudo[83645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:13 np0005486759.ooo.test sudo[83645]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:14 np0005486759.ooo.test sudo[83656]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5e3siik5/privsep.sock
Oct 14 08:38:14 np0005486759.ooo.test sudo[83656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:14 np0005486759.ooo.test sudo[83656]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:14 np0005486759.ooo.test sudo[83667]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi7f36ocm/privsep.sock
Oct 14 08:38:14 np0005486759.ooo.test sudo[83667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:15 np0005486759.ooo.test sudo[83667]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:15 np0005486759.ooo.test sudo[83678]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeswjjs6k/privsep.sock
Oct 14 08:38:15 np0005486759.ooo.test sudo[83678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:16 np0005486759.ooo.test sudo[83678]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:16 np0005486759.ooo.test sudo[83689]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwh67j07h/privsep.sock
Oct 14 08:38:16 np0005486759.ooo.test sudo[83689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:17 np0005486759.ooo.test sudo[83689]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:17 np0005486759.ooo.test sudo[83700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp08pker9a/privsep.sock
Oct 14 08:38:17 np0005486759.ooo.test sudo[83700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:18 np0005486759.ooo.test sudo[83700]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:18 np0005486759.ooo.test sudo[83713]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiyf9jesk/privsep.sock
Oct 14 08:38:18 np0005486759.ooo.test sudo[83713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:18 np0005486759.ooo.test sudo[83713]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:19 np0005486759.ooo.test sudo[83728]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdzxppeyk/privsep.sock
Oct 14 08:38:19 np0005486759.ooo.test sudo[83728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:19 np0005486759.ooo.test sudo[83728]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:19 np0005486759.ooo.test sudo[83739]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdn9cl3hx/privsep.sock
Oct 14 08:38:19 np0005486759.ooo.test sudo[83739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:20 np0005486759.ooo.test sudo[83739]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:20 np0005486759.ooo.test sudo[83750]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyqvzzjmt/privsep.sock
Oct 14 08:38:20 np0005486759.ooo.test sudo[83750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:21 np0005486759.ooo.test sudo[83750]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:21 np0005486759.ooo.test sudo[83761]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg9r8lzmv/privsep.sock
Oct 14 08:38:21 np0005486759.ooo.test sudo[83761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:22 np0005486759.ooo.test sudo[83761]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:22 np0005486759.ooo.test sudo[83772]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm_wzxgmp/privsep.sock
Oct 14 08:38:22 np0005486759.ooo.test sudo[83772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:22 np0005486759.ooo.test sudo[83772]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:23 np0005486759.ooo.test sudo[83783]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoj7hosz_/privsep.sock
Oct 14 08:38:23 np0005486759.ooo.test sudo[83783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:23 np0005486759.ooo.test sudo[83783]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:38:23 np0005486759.ooo.test systemd[1]: tmp-crun.Em0ygO.mount: Deactivated successfully.
Oct 14 08:38:23 np0005486759.ooo.test podman[83794]: 2025-10-14 08:38:23.992782378 +0000 UTC m=+0.069625612 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4)
Oct 14 08:38:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:38:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:38:24 np0005486759.ooo.test podman[83794]: 2025-10-14 08:38:24.036802931 +0000 UTC m=+0.113646165 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.expose-services=)
Oct 14 08:38:24 np0005486759.ooo.test podman[83794]: unhealthy
Oct 14 08:38:24 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:38:24 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:38:24 np0005486759.ooo.test podman[83814]: 2025-10-14 08:38:24.097003549 +0000 UTC m=+0.065498755 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, release=1, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1)
Oct 14 08:38:24 np0005486759.ooo.test podman[83814]: 2025-10-14 08:38:24.11212192 +0000 UTC m=+0.080617126 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Oct 14 08:38:24 np0005486759.ooo.test podman[83814]: unhealthy
Oct 14 08:38:24 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:38:24 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:38:24 np0005486759.ooo.test podman[83815]: 2025-10-14 08:38:24.157558537 +0000 UTC m=+0.119063984 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, release=1)
Oct 14 08:38:24 np0005486759.ooo.test podman[83815]: 2025-10-14 08:38:24.165496765 +0000 UTC m=+0.127002212 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, release=1, architecture=x86_64, io.openshift.expose-services=)
Oct 14 08:38:24 np0005486759.ooo.test sudo[83854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpba_cyiu0/privsep.sock
Oct 14 08:38:24 np0005486759.ooo.test sudo[83854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:24 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:38:24 np0005486759.ooo.test sudo[83854]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:24 np0005486759.ooo.test sudo[83865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu9qsy16k/privsep.sock
Oct 14 08:38:24 np0005486759.ooo.test sudo[83865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:25 np0005486759.ooo.test sudo[83865]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:25 np0005486759.ooo.test sudo[83876]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfdqii4b8/privsep.sock
Oct 14 08:38:25 np0005486759.ooo.test sudo[83876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:26 np0005486759.ooo.test sudo[83876]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:26 np0005486759.ooo.test sudo[83887]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1gupjwqp/privsep.sock
Oct 14 08:38:26 np0005486759.ooo.test sudo[83887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:27 np0005486759.ooo.test sudo[83887]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:27 np0005486759.ooo.test sudo[83898]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp75gb67ve/privsep.sock
Oct 14 08:38:27 np0005486759.ooo.test sudo[83898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:28 np0005486759.ooo.test sudo[83898]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:28 np0005486759.ooo.test sudo[83909]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb69p6fdx/privsep.sock
Oct 14 08:38:28 np0005486759.ooo.test sudo[83909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:28 np0005486759.ooo.test sudo[83909]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:29 np0005486759.ooo.test sudo[83922]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzrx1tiuo/privsep.sock
Oct 14 08:38:29 np0005486759.ooo.test sudo[83922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:29 np0005486759.ooo.test sudo[83922]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:38:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:38:29 np0005486759.ooo.test podman[83933]: 2025-10-14 08:38:29.852886031 +0000 UTC m=+0.060038323 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ovn-controller, release=1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:38:29 np0005486759.ooo.test podman[83932]: 2025-10-14 08:38:29.863001737 +0000 UTC m=+0.068816438 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team)
Oct 14 08:38:29 np0005486759.ooo.test podman[83932]: 2025-10-14 08:38:29.888170592 +0000 UTC m=+0.093985283 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:38:29 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:38:29 np0005486759.ooo.test podman[83933]: 2025-10-14 08:38:29.941388912 +0000 UTC m=+0.148541174 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, release=1, vcs-type=git, version=17.1.9, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:38:29 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:38:30 np0005486759.ooo.test sudo[83982]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprnrhrkwn/privsep.sock
Oct 14 08:38:30 np0005486759.ooo.test sudo[83982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:30 np0005486759.ooo.test sudo[83982]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:30 np0005486759.ooo.test sudo[83993]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj_1x2ntw/privsep.sock
Oct 14 08:38:30 np0005486759.ooo.test sudo[83993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:31 np0005486759.ooo.test sudo[83993]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:31 np0005486759.ooo.test sudo[84004]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp12uyykrb/privsep.sock
Oct 14 08:38:31 np0005486759.ooo.test sudo[84004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:32 np0005486759.ooo.test sudo[84004]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:32 np0005486759.ooo.test sudo[84015]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfsjff15e/privsep.sock
Oct 14 08:38:32 np0005486759.ooo.test sudo[84015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:33 np0005486759.ooo.test sudo[84015]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:33 np0005486759.ooo.test sudo[84027]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp55i168m9/privsep.sock
Oct 14 08:38:33 np0005486759.ooo.test sudo[84027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:34 np0005486759.ooo.test sudo[84027]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:34 np0005486759.ooo.test sudo[84040]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfae6wjib/privsep.sock
Oct 14 08:38:34 np0005486759.ooo.test sudo[84040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:35 np0005486759.ooo.test sudo[84040]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:35 np0005486759.ooo.test sudo[84056]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpucl5kcyh/privsep.sock
Oct 14 08:38:35 np0005486759.ooo.test sudo[84056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:36 np0005486759.ooo.test sudo[84056]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:36 np0005486759.ooo.test sudo[84067]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4dpg7skp/privsep.sock
Oct 14 08:38:36 np0005486759.ooo.test sudo[84067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:36 np0005486759.ooo.test sudo[84067]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:37 np0005486759.ooo.test sudo[84078]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyeb4j0ni/privsep.sock
Oct 14 08:38:37 np0005486759.ooo.test sudo[84078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:37 np0005486759.ooo.test sudo[84078]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:38:37 np0005486759.ooo.test podman[84083]: 2025-10-14 08:38:37.960440181 +0000 UTC m=+0.105334206 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, build-date=2025-07-21T14:48:37, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:38:38 np0005486759.ooo.test sudo[84109]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaz8hoixe/privsep.sock
Oct 14 08:38:38 np0005486759.ooo.test sudo[84109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:38 np0005486759.ooo.test podman[84083]: 2025-10-14 08:38:38.306249067 +0000 UTC m=+0.451143082 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.9)
Oct 14 08:38:38 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:38:38 np0005486759.ooo.test sudo[84109]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:38 np0005486759.ooo.test sudo[84122]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmputqqeln4/privsep.sock
Oct 14 08:38:38 np0005486759.ooo.test sudo[84122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:39 np0005486759.ooo.test sudo[84122]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:39 np0005486759.ooo.test sudo[84133]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp10cckf03/privsep.sock
Oct 14 08:38:39 np0005486759.ooo.test sudo[84133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:40 np0005486759.ooo.test sudo[84133]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:40 np0005486759.ooo.test sudo[84150]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8g7hojj3/privsep.sock
Oct 14 08:38:40 np0005486759.ooo.test sudo[84150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:41 np0005486759.ooo.test sudo[84150]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:41 np0005486759.ooo.test sudo[84161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ruvcuci/privsep.sock
Oct 14 08:38:41 np0005486759.ooo.test sudo[84161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:42 np0005486759.ooo.test sudo[84161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:38:42 np0005486759.ooo.test sudo[84183]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4xqgvsl8/privsep.sock
Oct 14 08:38:42 np0005486759.ooo.test sudo[84183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:42 np0005486759.ooo.test systemd[1]: tmp-crun.SnyUzZ.mount: Deactivated successfully.
Oct 14 08:38:42 np0005486759.ooo.test podman[84169]: 2025-10-14 08:38:42.452140806 +0000 UTC m=+0.085610772 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-type=git, release=1, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, container_name=iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12)
Oct 14 08:38:42 np0005486759.ooo.test podman[84169]: 2025-10-14 08:38:42.461076155 +0000 UTC m=+0.094546121 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, release=1, managed_by=tripleo_ansible)
Oct 14 08:38:42 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:38:42 np0005486759.ooo.test sudo[84183]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:38:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:38:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:38:43 np0005486759.ooo.test podman[84197]: 2025-10-14 08:38:43.089561746 +0000 UTC m=+0.077280830 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, release=1, version=17.1.9, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public)
Oct 14 08:38:43 np0005486759.ooo.test podman[84196]: 2025-10-14 08:38:43.150491377 +0000 UTC m=+0.141219285 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, container_name=collectd, io.openshift.expose-services=, version=17.1.9, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:38:43 np0005486759.ooo.test podman[84196]: 2025-10-14 08:38:43.185888811 +0000 UTC m=+0.176616669 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, container_name=collectd, io.openshift.expose-services=, distribution-scope=public, release=2, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, tcib_managed=true)
Oct 14 08:38:43 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:38:43 np0005486759.ooo.test podman[84195]: 2025-10-14 08:38:43.198799993 +0000 UTC m=+0.189657505 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, distribution-scope=public, batch=17.1_20250721.1, config_id=tripleo_step5, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:38:43 np0005486759.ooo.test podman[84195]: 2025-10-14 08:38:43.218118956 +0000 UTC m=+0.208976468 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step5, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Oct 14 08:38:43 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:38:43 np0005486759.ooo.test sudo[84271]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph5nrntrx/privsep.sock
Oct 14 08:38:43 np0005486759.ooo.test sudo[84271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:43 np0005486759.ooo.test podman[84197]: 2025-10-14 08:38:43.299651769 +0000 UTC m=+0.287370843 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr)
Oct 14 08:38:43 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:38:43 np0005486759.ooo.test systemd[1]: tmp-crun.w5lyiA.mount: Deactivated successfully.
Oct 14 08:38:43 np0005486759.ooo.test sudo[84271]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:44 np0005486759.ooo.test sudo[84282]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2qu_vfqu/privsep.sock
Oct 14 08:38:44 np0005486759.ooo.test sudo[84282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:44 np0005486759.ooo.test sudo[84282]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:44 np0005486759.ooo.test sudo[84293]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_06w9wt1/privsep.sock
Oct 14 08:38:44 np0005486759.ooo.test sudo[84293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:45 np0005486759.ooo.test sudo[84293]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:45 np0005486759.ooo.test sudo[84310]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6zunzcon/privsep.sock
Oct 14 08:38:45 np0005486759.ooo.test sudo[84310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:46 np0005486759.ooo.test sudo[84310]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:46 np0005486759.ooo.test sudo[84321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp79ld_jcd/privsep.sock
Oct 14 08:38:46 np0005486759.ooo.test sudo[84321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:47 np0005486759.ooo.test sudo[84321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:47 np0005486759.ooo.test sudo[84332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxabvvepn/privsep.sock
Oct 14 08:38:47 np0005486759.ooo.test sudo[84332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:48 np0005486759.ooo.test sudo[84332]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:48 np0005486759.ooo.test sudo[84343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd1zxlbia/privsep.sock
Oct 14 08:38:48 np0005486759.ooo.test sudo[84343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:48 np0005486759.ooo.test sudo[84343]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:49 np0005486759.ooo.test sudo[84354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp6x0u2s1/privsep.sock
Oct 14 08:38:49 np0005486759.ooo.test sudo[84354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:49 np0005486759.ooo.test sudo[84354]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:49 np0005486759.ooo.test sudo[84365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd2nqb24a/privsep.sock
Oct 14 08:38:49 np0005486759.ooo.test sudo[84365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:50 np0005486759.ooo.test sudo[84365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:50 np0005486759.ooo.test sudo[84378]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfo6vbnnn/privsep.sock
Oct 14 08:38:50 np0005486759.ooo.test sudo[84378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:51 np0005486759.ooo.test sudo[84378]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:51 np0005486759.ooo.test sudo[84393]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8rqk4xn5/privsep.sock
Oct 14 08:38:51 np0005486759.ooo.test sudo[84393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:52 np0005486759.ooo.test sudo[84393]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:52 np0005486759.ooo.test sudo[84404]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp45xxu_ja/privsep.sock
Oct 14 08:38:52 np0005486759.ooo.test sudo[84404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:53 np0005486759.ooo.test sudo[84404]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:53 np0005486759.ooo.test sudo[84415]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7g0vl3hz/privsep.sock
Oct 14 08:38:53 np0005486759.ooo.test sudo[84415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:54 np0005486759.ooo.test sudo[84415]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: tmp-crun.qbfIiJ.mount: Deactivated successfully.
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:38:54 np0005486759.ooo.test podman[84422]: 2025-10-14 08:38:54.259622016 +0000 UTC m=+0.080720999 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64)
Oct 14 08:38:54 np0005486759.ooo.test podman[84421]: 2025-10-14 08:38:54.266807519 +0000 UTC m=+0.087170029 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=ceilometer_agent_compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, batch=17.1_20250721.1, release=1)
Oct 14 08:38:54 np0005486759.ooo.test podman[84422]: 2025-10-14 08:38:54.272394124 +0000 UTC m=+0.093493137 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, version=17.1.9, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git)
Oct 14 08:38:54 np0005486759.ooo.test podman[84422]: unhealthy
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:38:54 np0005486759.ooo.test podman[84421]: 2025-10-14 08:38:54.285292556 +0000 UTC m=+0.105655036 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4)
Oct 14 08:38:54 np0005486759.ooo.test podman[84421]: unhealthy
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:38:54 np0005486759.ooo.test podman[84446]: 2025-10-14 08:38:54.377991257 +0000 UTC m=+0.115899205 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:38:54 np0005486759.ooo.test podman[84446]: 2025-10-14 08:38:54.383683835 +0000 UTC m=+0.121591693 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:38:54 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:38:54 np0005486759.ooo.test sudo[84480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcxp35tj2/privsep.sock
Oct 14 08:38:54 np0005486759.ooo.test sudo[84480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:54 np0005486759.ooo.test sudo[84480]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:55 np0005486759.ooo.test sudo[84491]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp66n3kepb/privsep.sock
Oct 14 08:38:55 np0005486759.ooo.test sudo[84491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:55 np0005486759.ooo.test sudo[84491]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:56 np0005486759.ooo.test sudo[84502]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj1q7yal5/privsep.sock
Oct 14 08:38:56 np0005486759.ooo.test sudo[84502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:56 np0005486759.ooo.test sudo[84502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:57 np0005486759.ooo.test sudo[84551]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprebi5qw3/privsep.sock
Oct 14 08:38:57 np0005486759.ooo.test sudo[84551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:57 np0005486759.ooo.test sudo[84625]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsryrizbrcoxmmvuzbwhrriwxnktobjw ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431136.7855217-136405-128548503603487/AnsiballZ_setup.py
Oct 14 08:38:57 np0005486759.ooo.test sudo[84625]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:38:57 np0005486759.ooo.test sudo[84551]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:57 np0005486759.ooo.test python3[84627]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 08:38:57 np0005486759.ooo.test sudo[84625]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:57 np0005486759.ooo.test sudo[84679]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muhlplrmgbglykjvcanytpiiroqeqxzz ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431136.7855217-136405-128548503603487/AnsiballZ_dnf.py
Oct 14 08:38:57 np0005486759.ooo.test sudo[84679]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:38:57 np0005486759.ooo.test sudo[84685]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmd4s3htm/privsep.sock
Oct 14 08:38:57 np0005486759.ooo.test sudo[84685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:58 np0005486759.ooo.test python3[84683]: ansible-ansible.legacy.dnf Invoked with name=['crudini'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Oct 14 08:38:58 np0005486759.ooo.test sudo[84685]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:58 np0005486759.ooo.test sudo[84698]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1kma9uos/privsep.sock
Oct 14 08:38:58 np0005486759.ooo.test sudo[84698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:38:59 np0005486759.ooo.test sudo[84698]: pam_unix(sudo:session): session closed for user root
Oct 14 08:38:59 np0005486759.ooo.test sudo[84709]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsqlpjj3f/privsep.sock
Oct 14 08:38:59 np0005486759.ooo.test sudo[84709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:00 np0005486759.ooo.test sudo[84709]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:39:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:39:00 np0005486759.ooo.test systemd[1]: tmp-crun.wkXDeQ.mount: Deactivated successfully.
Oct 14 08:39:00 np0005486759.ooo.test podman[84716]: 2025-10-14 08:39:00.128739618 +0000 UTC m=+0.062149989 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, name=rhosp17/openstack-ovn-controller, tcib_managed=true, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=, batch=17.1_20250721.1)
Oct 14 08:39:00 np0005486759.ooo.test podman[84716]: 2025-10-14 08:39:00.173397431 +0000 UTC m=+0.106807812 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ovn-controller, release=1, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, com.redhat.component=openstack-ovn-controller-container)
Oct 14 08:39:00 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:39:00 np0005486759.ooo.test systemd[1]: tmp-crun.wkhfpo.mount: Deactivated successfully.
Oct 14 08:39:00 np0005486759.ooo.test podman[84715]: 2025-10-14 08:39:00.206019648 +0000 UTC m=+0.139742409 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, version=17.1.9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:39:00 np0005486759.ooo.test podman[84715]: 2025-10-14 08:39:00.244149068 +0000 UTC m=+0.177871829 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9)
Oct 14 08:39:00 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:39:00 np0005486759.ooo.test sudo[84768]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm6yf4j01/privsep.sock
Oct 14 08:39:00 np0005486759.ooo.test sudo[84768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:00 np0005486759.ooo.test sudo[84679]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:00 np0005486759.ooo.test sudo[84768]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:01 np0005486759.ooo.test sudo[84793]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8iry0aoy/privsep.sock
Oct 14 08:39:01 np0005486759.ooo.test sudo[84793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:01 np0005486759.ooo.test sudo[84793]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:01 np0005486759.ooo.test sudo[84809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6f9x068t/privsep.sock
Oct 14 08:39:01 np0005486759.ooo.test sudo[84809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:02 np0005486759.ooo.test sudo[84809]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:02 np0005486759.ooo.test sudo[84821]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_ciwdrpb/privsep.sock
Oct 14 08:39:02 np0005486759.ooo.test sudo[84821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:03 np0005486759.ooo.test sudo[84821]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:03 np0005486759.ooo.test sudo[84832]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbn2r9sax/privsep.sock
Oct 14 08:39:03 np0005486759.ooo.test sudo[84832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:04 np0005486759.ooo.test sudo[84832]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:04 np0005486759.ooo.test sudo[84843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5_f7s7e4/privsep.sock
Oct 14 08:39:04 np0005486759.ooo.test sudo[84843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:04 np0005486759.ooo.test sudo[84843]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:05 np0005486759.ooo.test sudo[84854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe93jepr6/privsep.sock
Oct 14 08:39:05 np0005486759.ooo.test sudo[84854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:05 np0005486759.ooo.test sudo[84854]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:06 np0005486759.ooo.test sudo[84865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd_d5fvq1/privsep.sock
Oct 14 08:39:06 np0005486759.ooo.test sudo[84865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:06 np0005486759.ooo.test sudo[84865]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:06 np0005486759.ooo.test sudo[84876]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyit7m0o3/privsep.sock
Oct 14 08:39:06 np0005486759.ooo.test sudo[84876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:07 np0005486759.ooo.test sudo[84876]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:07 np0005486759.ooo.test sudo[84893]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi5ijcq9g/privsep.sock
Oct 14 08:39:07 np0005486759.ooo.test sudo[84893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:08 np0005486759.ooo.test sudo[84893]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:39:08 np0005486759.ooo.test systemd[1]: tmp-crun.PTA37Z.mount: Deactivated successfully.
Oct 14 08:39:08 np0005486759.ooo.test podman[84898]: 2025-10-14 08:39:08.440471397 +0000 UTC m=+0.073873056 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Oct 14 08:39:08 np0005486759.ooo.test sudo[84927]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcup7kgoa/privsep.sock
Oct 14 08:39:08 np0005486759.ooo.test sudo[84927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:08 np0005486759.ooo.test podman[84898]: 2025-10-14 08:39:08.800543427 +0000 UTC m=+0.433945036 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:39:08 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:39:09 np0005486759.ooo.test sudo[84927]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:09 np0005486759.ooo.test sudo[84939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfb1vwfps/privsep.sock
Oct 14 08:39:09 np0005486759.ooo.test sudo[84939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:10 np0005486759.ooo.test sudo[84939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:10 np0005486759.ooo.test sudo[84950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0fioop1x/privsep.sock
Oct 14 08:39:10 np0005486759.ooo.test sudo[84950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:10 np0005486759.ooo.test sudo[84950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:11 np0005486759.ooo.test sudo[84961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyo0nsmqo/privsep.sock
Oct 14 08:39:11 np0005486759.ooo.test sudo[84961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:11 np0005486759.ooo.test sudo[84961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:11 np0005486759.ooo.test sudo[84972]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzj7kqket/privsep.sock
Oct 14 08:39:11 np0005486759.ooo.test sudo[84972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:12 np0005486759.ooo.test sudo[84972]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:39:12 np0005486759.ooo.test podman[84978]: 2025-10-14 08:39:12.648431841 +0000 UTC m=+0.092591439 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, release=1, io.buildah.version=1.33.12)
Oct 14 08:39:12 np0005486759.ooo.test podman[84978]: 2025-10-14 08:39:12.659414244 +0000 UTC m=+0.103573792 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:39:12 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:39:12 np0005486759.ooo.test sudo[85007]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd22k098r/privsep.sock
Oct 14 08:39:12 np0005486759.ooo.test sudo[85007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:39:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:39:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:39:13 np0005486759.ooo.test sudo[85007]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:13 np0005486759.ooo.test podman[85012]: 2025-10-14 08:39:13.432612249 +0000 UTC m=+0.061337754 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., config_id=tripleo_step5, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:39:13 np0005486759.ooo.test podman[85012]: 2025-10-14 08:39:13.450224098 +0000 UTC m=+0.078949613 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:39:13 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:39:13 np0005486759.ooo.test podman[85013]: 2025-10-14 08:39:13.494866541 +0000 UTC m=+0.119505289 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, tcib_managed=true, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
build-date=2025-07-21T13:04:03, version=17.1.9, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:39:13 np0005486759.ooo.test podman[85013]: 2025-10-14 08:39:13.532475903 +0000 UTC m=+0.157114681 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, 
vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:39:13 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:39:13 np0005486759.ooo.test podman[85019]: 2025-10-14 08:39:13.547904965 +0000 UTC m=+0.168702062 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, vcs-type=git, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, distribution-scope=public, release=1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:39:13 np0005486759.ooo.test sudo[85089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkosy23fl/privsep.sock
Oct 14 08:39:13 np0005486759.ooo.test sudo[85089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:13 np0005486759.ooo.test podman[85019]: 2025-10-14 08:39:13.715468162 +0000 UTC m=+0.336265289 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, release=1)
Oct 14 08:39:13 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:39:14 np0005486759.ooo.test sudo[85089]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:14 np0005486759.ooo.test sudo[85100]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprt4q_ku2/privsep.sock
Oct 14 08:39:14 np0005486759.ooo.test sudo[85100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:15 np0005486759.ooo.test sudo[85100]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:15 np0005486759.ooo.test sudo[85111]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptvzw3bp5/privsep.sock
Oct 14 08:39:15 np0005486759.ooo.test sudo[85111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:15 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:39:15 np0005486759.ooo.test recover_tripleo_nova_virtqemud[85114]: 47951
Oct 14 08:39:15 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:39:15 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:39:15 np0005486759.ooo.test sudo[85111]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:16 np0005486759.ooo.test sudo[85124]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppse08q5a/privsep.sock
Oct 14 08:39:16 np0005486759.ooo.test sudo[85124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:16 np0005486759.ooo.test sudo[85124]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:17 np0005486759.ooo.test sudo[85135]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7ha4n2_s/privsep.sock
Oct 14 08:39:17 np0005486759.ooo.test sudo[85135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:17 np0005486759.ooo.test sudo[85135]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:17 np0005486759.ooo.test sudo[85146]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpucazb4qe/privsep.sock
Oct 14 08:39:17 np0005486759.ooo.test sudo[85146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:18 np0005486759.ooo.test sudo[85146]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:18 np0005486759.ooo.test sudo[85163]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfuzzfp8c/privsep.sock
Oct 14 08:39:18 np0005486759.ooo.test sudo[85163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:19 np0005486759.ooo.test sudo[85163]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:19 np0005486759.ooo.test sudo[85174]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpurkktxu9/privsep.sock
Oct 14 08:39:19 np0005486759.ooo.test sudo[85174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:20 np0005486759.ooo.test sudo[85174]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:20 np0005486759.ooo.test sudo[85185]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpooy683cz/privsep.sock
Oct 14 08:39:20 np0005486759.ooo.test sudo[85185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:21 np0005486759.ooo.test sudo[85185]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:21 np0005486759.ooo.test sudo[85196]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmput16glat/privsep.sock
Oct 14 08:39:21 np0005486759.ooo.test sudo[85196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:21 np0005486759.ooo.test sudo[85196]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:22 np0005486759.ooo.test sudo[85207]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph5j2yy49/privsep.sock
Oct 14 08:39:22 np0005486759.ooo.test sudo[85207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:22 np0005486759.ooo.test sudo[85207]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:23 np0005486759.ooo.test sudo[85218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvndvwizo/privsep.sock
Oct 14 08:39:23 np0005486759.ooo.test sudo[85218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:23 np0005486759.ooo.test sudo[85218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:23 np0005486759.ooo.test sudo[85235]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg98x39o8/privsep.sock
Oct 14 08:39:23 np0005486759.ooo.test sudo[85235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:39:24 np0005486759.ooo.test podman[85239]: 2025-10-14 08:39:24.44208276 +0000 UTC m=+0.068812448 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.expose-services=, release=1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T15:29:47, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:39:24 np0005486759.ooo.test podman[85239]: 2025-10-14 08:39:24.462408103 +0000 UTC m=+0.089137731 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=)
Oct 14 08:39:24 np0005486759.ooo.test podman[85239]: unhealthy
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:39:24 np0005486759.ooo.test podman[85238]: 2025-10-14 08:39:24.501002577 +0000 UTC m=+0.128454387 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, managed_by=tripleo_ansible, release=1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33)
Oct 14 08:39:24 np0005486759.ooo.test podman[85238]: 2025-10-14 08:39:24.516450109 +0000 UTC m=+0.143901929 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, maintainer=OpenStack TripleO Team)
Oct 14 08:39:24 np0005486759.ooo.test podman[85238]: unhealthy
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:39:24 np0005486759.ooo.test podman[85265]: 2025-10-14 08:39:24.570167465 +0000 UTC m=+0.112415728 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20250721.1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, version=17.1.9)
Oct 14 08:39:24 np0005486759.ooo.test sudo[85235]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:24 np0005486759.ooo.test podman[85265]: 2025-10-14 08:39:24.604415763 +0000 UTC m=+0.146664026 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, release=1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-cron)
Oct 14 08:39:24 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:39:24 np0005486759.ooo.test sudo[85307]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv9frwxtg/privsep.sock
Oct 14 08:39:24 np0005486759.ooo.test sudo[85307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:25 np0005486759.ooo.test sudo[85307]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:25 np0005486759.ooo.test sudo[85318]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1ulwu7z2/privsep.sock
Oct 14 08:39:25 np0005486759.ooo.test sudo[85318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:26 np0005486759.ooo.test sudo[85318]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:26 np0005486759.ooo.test sudo[85329]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3xih1wi8/privsep.sock
Oct 14 08:39:26 np0005486759.ooo.test sudo[85329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:27 np0005486759.ooo.test sudo[85329]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:27 np0005486759.ooo.test sudo[85340]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp07qxez_s/privsep.sock
Oct 14 08:39:27 np0005486759.ooo.test sudo[85340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:28 np0005486759.ooo.test sudo[85340]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:28 np0005486759.ooo.test sudo[85351]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpulliv_69/privsep.sock
Oct 14 08:39:28 np0005486759.ooo.test sudo[85351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:29 np0005486759.ooo.test sudo[85351]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:29 np0005486759.ooo.test sudo[85368]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc_up2pbm/privsep.sock
Oct 14 08:39:29 np0005486759.ooo.test sudo[85368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:29 np0005486759.ooo.test sudo[85368]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:30 np0005486759.ooo.test sudo[85379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjqhma2ks/privsep.sock
Oct 14 08:39:30 np0005486759.ooo.test sudo[85379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:39:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:39:30 np0005486759.ooo.test podman[85382]: 2025-10-14 08:39:30.437802162 +0000 UTC m=+0.071255243 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, version=17.1.9, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 08:39:30 np0005486759.ooo.test podman[85383]: 2025-10-14 08:39:30.497854165 +0000 UTC m=+0.126437485 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Oct 14 08:39:30 np0005486759.ooo.test podman[85383]: 2025-10-14 08:39:30.524460595 +0000 UTC m=+0.153043875 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T13:28:44, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, container_name=ovn_controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct 14 08:39:30 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:39:30 np0005486759.ooo.test podman[85382]: 2025-10-14 08:39:30.579032576 +0000 UTC m=+0.212485637 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:39:30 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:39:30 np0005486759.ooo.test sudo[85379]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:30 np0005486759.ooo.test sudo[85439]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzvzajd9_/privsep.sock
Oct 14 08:39:30 np0005486759.ooo.test sudo[85439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:31 np0005486759.ooo.test sudo[85439]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:31 np0005486759.ooo.test sudo[85450]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0qgxhln4/privsep.sock
Oct 14 08:39:31 np0005486759.ooo.test sudo[85450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:32 np0005486759.ooo.test sudo[85450]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:32 np0005486759.ooo.test sudo[85461]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp13z5a7mz/privsep.sock
Oct 14 08:39:32 np0005486759.ooo.test sudo[85461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:33 np0005486759.ooo.test sudo[85461]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:33 np0005486759.ooo.test sudo[85472]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvfz90u9m/privsep.sock
Oct 14 08:39:33 np0005486759.ooo.test sudo[85472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:34 np0005486759.ooo.test sudo[85472]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:34 np0005486759.ooo.test sudo[85489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphpd4etf_/privsep.sock
Oct 14 08:39:34 np0005486759.ooo.test sudo[85489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:35 np0005486759.ooo.test sudo[85489]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:35 np0005486759.ooo.test sudo[85500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcgvh3y1k/privsep.sock
Oct 14 08:39:35 np0005486759.ooo.test sudo[85500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:36 np0005486759.ooo.test sudo[85500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:36 np0005486759.ooo.test sudo[85511]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt03b9hi_/privsep.sock
Oct 14 08:39:36 np0005486759.ooo.test sudo[85511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:36 np0005486759.ooo.test sudo[85511]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:37 np0005486759.ooo.test sudo[85522]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4cpksjio/privsep.sock
Oct 14 08:39:37 np0005486759.ooo.test sudo[85522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:37 np0005486759.ooo.test sudo[85522]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:38 np0005486759.ooo.test sudo[85533]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnnp4594a/privsep.sock
Oct 14 08:39:38 np0005486759.ooo.test sudo[85533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:38 np0005486759.ooo.test sudo[85533]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:38 np0005486759.ooo.test sudo[85544]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0lly202k/privsep.sock
Oct 14 08:39:38 np0005486759.ooo.test sudo[85544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:39:39 np0005486759.ooo.test podman[85546]: 2025-10-14 08:39:39.005856446 +0000 UTC m=+0.084942061 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, version=17.1.9, container_name=nova_migration_target, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-07-21T14:48:37, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 08:39:39 np0005486759.ooo.test podman[85546]: 2025-10-14 08:39:39.35548125 +0000 UTC m=+0.434566815 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, tcib_managed=true, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:39:39 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:39:39 np0005486759.ooo.test sudo[85544]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:39 np0005486759.ooo.test sudo[85584]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1_crqzuj/privsep.sock
Oct 14 08:39:39 np0005486759.ooo.test sudo[85584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:40 np0005486759.ooo.test sudo[85584]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:40 np0005486759.ooo.test sudo[85596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmtnswvwt/privsep.sock
Oct 14 08:39:40 np0005486759.ooo.test sudo[85596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:41 np0005486759.ooo.test sudo[85596]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:41 np0005486759.ooo.test sudo[85607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4i669pz_/privsep.sock
Oct 14 08:39:41 np0005486759.ooo.test sudo[85607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:42 np0005486759.ooo.test sudo[85607]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:42 np0005486759.ooo.test sudo[85618]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx4v51xvw/privsep.sock
Oct 14 08:39:42 np0005486759.ooo.test sudo[85618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:42 np0005486759.ooo.test sudo[85618]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:39:42 np0005486759.ooo.test podman[85622]: 2025-10-14 08:39:42.985712685 +0000 UTC m=+0.079636544 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=iscsid, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, tcib_managed=true)
Oct 14 08:39:42 np0005486759.ooo.test podman[85622]: 2025-10-14 08:39:42.99741214 +0000 UTC m=+0.091335969 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, version=17.1.9, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, tcib_managed=true, release=1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:39:43 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:39:43 np0005486759.ooo.test sudo[85649]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa_6vqrp6/privsep.sock
Oct 14 08:39:43 np0005486759.ooo.test sudo[85649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:43 np0005486759.ooo.test sudo[85649]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:39:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:39:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:39:43 np0005486759.ooo.test podman[85656]: 2025-10-14 08:39:43.930058999 +0000 UTC m=+0.070860901 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, release=2, version=17.1.9, build-date=2025-07-21T13:04:03)
Oct 14 08:39:43 np0005486759.ooo.test systemd[1]: tmp-crun.mchQJc.mount: Deactivated successfully.
Oct 14 08:39:44 np0005486759.ooo.test podman[85657]: 2025-10-14 08:39:44.001921 +0000 UTC m=+0.140394509 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1)
Oct 14 08:39:44 np0005486759.ooo.test podman[85656]: 2025-10-14 08:39:44.014842654 +0000 UTC m=+0.155644606 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=2, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1)
Oct 14 08:39:44 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:39:44 np0005486759.ooo.test podman[85654]: 2025-10-14 08:39:44.107198774 +0000 UTC m=+0.249566656 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-type=git, container_name=nova_compute, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9)
Oct 14 08:39:44 np0005486759.ooo.test podman[85654]: 2025-10-14 08:39:44.138323575 +0000 UTC m=+0.280691497 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, container_name=nova_compute)
Oct 14 08:39:44 np0005486759.ooo.test sudo[85727]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpybegg__b/privsep.sock
Oct 14 08:39:44 np0005486759.ooo.test sudo[85727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:44 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:39:44 np0005486759.ooo.test podman[85657]: 2025-10-14 08:39:44.224766741 +0000 UTC m=+0.363240280 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, distribution-scope=public)
Oct 14 08:39:44 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:39:44 np0005486759.ooo.test sudo[85727]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:44 np0005486759.ooo.test systemd[1]: tmp-crun.NapIQq.mount: Deactivated successfully.
Oct 14 08:39:44 np0005486759.ooo.test sudo[85743]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjnklyfql/privsep.sock
Oct 14 08:39:44 np0005486759.ooo.test sudo[85743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:45 np0005486759.ooo.test sudo[85743]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:45 np0005486759.ooo.test sudo[85758]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ryqhl1n/privsep.sock
Oct 14 08:39:45 np0005486759.ooo.test sudo[85758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:46 np0005486759.ooo.test sudo[85758]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:46 np0005486759.ooo.test sudo[85769]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj985mzfk/privsep.sock
Oct 14 08:39:46 np0005486759.ooo.test sudo[85769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:47 np0005486759.ooo.test sudo[85769]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:47 np0005486759.ooo.test sudo[85780]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpny_z5nf2/privsep.sock
Oct 14 08:39:47 np0005486759.ooo.test sudo[85780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:48 np0005486759.ooo.test sudo[85780]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:48 np0005486759.ooo.test sudo[85791]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8pc_cub7/privsep.sock
Oct 14 08:39:48 np0005486759.ooo.test sudo[85791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:48 np0005486759.ooo.test sudo[85791]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:49 np0005486759.ooo.test sudo[85802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpovoub3i1/privsep.sock
Oct 14 08:39:49 np0005486759.ooo.test sudo[85802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:49 np0005486759.ooo.test sudo[85802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:50 np0005486759.ooo.test sudo[85813]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsevtg1id/privsep.sock
Oct 14 08:39:50 np0005486759.ooo.test sudo[85813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:50 np0005486759.ooo.test sudo[85813]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:50 np0005486759.ooo.test sudo[85830]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7bpbd9t8/privsep.sock
Oct 14 08:39:50 np0005486759.ooo.test sudo[85830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:51 np0005486759.ooo.test sudo[85830]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:51 np0005486759.ooo.test sudo[85841]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsm8of08m/privsep.sock
Oct 14 08:39:51 np0005486759.ooo.test sudo[85841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:52 np0005486759.ooo.test sudo[85841]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:52 np0005486759.ooo.test sudo[85852]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_t2q7u_c/privsep.sock
Oct 14 08:39:52 np0005486759.ooo.test sudo[85852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:53 np0005486759.ooo.test sudo[85852]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:53 np0005486759.ooo.test sudo[85863]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpskjozfyx/privsep.sock
Oct 14 08:39:53 np0005486759.ooo.test sudo[85863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:54 np0005486759.ooo.test sudo[85863]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:54 np0005486759.ooo.test sudo[85874]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5b6zft39/privsep.sock
Oct 14 08:39:54 np0005486759.ooo.test sudo[85874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:54 np0005486759.ooo.test sudo[85874]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:39:55 np0005486759.ooo.test podman[85880]: 2025-10-14 08:39:55.084788089 +0000 UTC m=+0.066087752 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, version=17.1.9, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, release=1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c)
Oct 14 08:39:55 np0005486759.ooo.test podman[85881]: 2025-10-14 08:39:55.143006495 +0000 UTC m=+0.121798240 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, version=17.1.9, release=1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1)
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: tmp-crun.vvpAXg.mount: Deactivated successfully.
Oct 14 08:39:55 np0005486759.ooo.test podman[85882]: 2025-10-14 08:39:55.193632154 +0000 UTC m=+0.171274793 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1)
Oct 14 08:39:55 np0005486759.ooo.test podman[85882]: 2025-10-14 08:39:55.205239106 +0000 UTC m=+0.182881735 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:39:55 np0005486759.ooo.test podman[85882]: unhealthy
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:39:55 np0005486759.ooo.test sudo[85939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu6msc7v1/privsep.sock
Oct 14 08:39:55 np0005486759.ooo.test podman[85880]: 2025-10-14 08:39:55.217387545 +0000 UTC m=+0.198687208 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, name=rhosp17/openstack-cron, release=1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:39:55 np0005486759.ooo.test sudo[85939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:39:55 np0005486759.ooo.test podman[85881]: 2025-10-14 08:39:55.272207195 +0000 UTC m=+0.250998980 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git)
Oct 14 08:39:55 np0005486759.ooo.test podman[85881]: unhealthy
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:39:55 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:39:55 np0005486759.ooo.test sudo[85939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:56 np0005486759.ooo.test sudo[85956]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6n_5xccb/privsep.sock
Oct 14 08:39:56 np0005486759.ooo.test sudo[85956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:56 np0005486759.ooo.test sudo[85956]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:56 np0005486759.ooo.test sudo[85968]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpshf6v1jo/privsep.sock
Oct 14 08:39:56 np0005486759.ooo.test sudo[85968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:57 np0005486759.ooo.test sudo[85968]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:57 np0005486759.ooo.test sudo[85979]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl1wnm5pm/privsep.sock
Oct 14 08:39:57 np0005486759.ooo.test sudo[85979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:58 np0005486759.ooo.test sudo[85979]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:58 np0005486759.ooo.test sudo[85990]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphhuwder0/privsep.sock
Oct 14 08:39:58 np0005486759.ooo.test sudo[85990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:39:59 np0005486759.ooo.test sudo[85990]: pam_unix(sudo:session): session closed for user root
Oct 14 08:39:59 np0005486759.ooo.test sudo[86001]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpctv6o95v/privsep.sock
Oct 14 08:39:59 np0005486759.ooo.test sudo[86001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:00 np0005486759.ooo.test sudo[86001]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:00 np0005486759.ooo.test sudo[86012]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm7aexh6_/privsep.sock
Oct 14 08:40:00 np0005486759.ooo.test sudo[86012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:40:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:40:00 np0005486759.ooo.test systemd[1]: tmp-crun.CIxu1l.mount: Deactivated successfully.
Oct 14 08:40:00 np0005486759.ooo.test podman[86014]: 2025-10-14 08:40:00.717622045 +0000 UTC m=+0.086072336 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4)
Oct 14 08:40:00 np0005486759.ooo.test podman[86014]: 2025-10-14 08:40:00.753591757 +0000 UTC m=+0.122042118 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:40:00 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:40:00 np0005486759.ooo.test podman[86015]: 2025-10-14 08:40:00.781361253 +0000 UTC m=+0.147192463 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_id=tripleo_step4, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 08:40:00 np0005486759.ooo.test podman[86015]: 2025-10-14 08:40:00.826587483 +0000 UTC m=+0.192418663 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=1, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:40:00 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:40:01 np0005486759.ooo.test sudo[86012]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:01 np0005486759.ooo.test sudo[86075]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkojisfil/privsep.sock
Oct 14 08:40:01 np0005486759.ooo.test sudo[86075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:01 np0005486759.ooo.test systemd[1]: tmp-crun.D8flHG.mount: Deactivated successfully.
Oct 14 08:40:02 np0005486759.ooo.test sudo[86075]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:02 np0005486759.ooo.test sudo[86087]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3ykjo3xj/privsep.sock
Oct 14 08:40:02 np0005486759.ooo.test sudo[86087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:03 np0005486759.ooo.test sudo[86087]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:03 np0005486759.ooo.test sudo[86098]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg46355sc/privsep.sock
Oct 14 08:40:03 np0005486759.ooo.test sudo[86098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:03 np0005486759.ooo.test sudo[86098]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:04 np0005486759.ooo.test sudo[86109]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeuzk30td/privsep.sock
Oct 14 08:40:04 np0005486759.ooo.test sudo[86109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:04 np0005486759.ooo.test sudo[86109]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:05 np0005486759.ooo.test sudo[86120]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi32s1toz/privsep.sock
Oct 14 08:40:05 np0005486759.ooo.test sudo[86120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:05 np0005486759.ooo.test sudo[86120]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:05 np0005486759.ooo.test sudo[86131]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdp1bcpvs/privsep.sock
Oct 14 08:40:05 np0005486759.ooo.test sudo[86131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:06 np0005486759.ooo.test sudo[86131]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:06 np0005486759.ooo.test sudo[86144]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmh6g6rsz/privsep.sock
Oct 14 08:40:06 np0005486759.ooo.test sudo[86144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:07 np0005486759.ooo.test sudo[86144]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:07 np0005486759.ooo.test sudo[86159]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp29knr2ti/privsep.sock
Oct 14 08:40:07 np0005486759.ooo.test sudo[86159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:08 np0005486759.ooo.test sudo[86159]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:08 np0005486759.ooo.test sudo[86170]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3hu3yp5c/privsep.sock
Oct 14 08:40:08 np0005486759.ooo.test sudo[86170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:09 np0005486759.ooo.test sudo[86170]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:09 np0005486759.ooo.test sudo[86181]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj_sfunf2/privsep.sock
Oct 14 08:40:09 np0005486759.ooo.test sudo[86181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:09 np0005486759.ooo.test sudo[86181]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:40:09 np0005486759.ooo.test systemd[1]: tmp-crun.2OfAzW.mount: Deactivated successfully.
Oct 14 08:40:10 np0005486759.ooo.test podman[86185]: 2025-10-14 08:40:10.001525992 +0000 UTC m=+0.088233327 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12)
Oct 14 08:40:10 np0005486759.ooo.test sudo[86214]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5_c3sxlr/privsep.sock
Oct 14 08:40:10 np0005486759.ooo.test sudo[86214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:10 np0005486759.ooo.test podman[86185]: 2025-10-14 08:40:10.415537567 +0000 UTC m=+0.502244942 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:40:10 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:40:10 np0005486759.ooo.test sudo[86214]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:11 np0005486759.ooo.test sudo[86226]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuk0i6kgs/privsep.sock
Oct 14 08:40:11 np0005486759.ooo.test sudo[86226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:11 np0005486759.ooo.test sudo[86226]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:11 np0005486759.ooo.test sudo[86237]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsjn3l8ot/privsep.sock
Oct 14 08:40:11 np0005486759.ooo.test sudo[86237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:12 np0005486759.ooo.test sudo[86237]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:12 np0005486759.ooo.test sudo[86254]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdx3622di/privsep.sock
Oct 14 08:40:12 np0005486759.ooo.test sudo[86254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:40:13 np0005486759.ooo.test systemd[1]: tmp-crun.p3ZqFg.mount: Deactivated successfully.
Oct 14 08:40:13 np0005486759.ooo.test podman[86258]: 2025-10-14 08:40:13.424511616 +0000 UTC m=+0.056080049 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, version=17.1.9)
Oct 14 08:40:13 np0005486759.ooo.test sudo[86254]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:13 np0005486759.ooo.test podman[86258]: 2025-10-14 08:40:13.437128377 +0000 UTC m=+0.068696800 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, io.buildah.version=1.33.12, 
name=rhosp17/openstack-iscsid, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:40:13 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:40:13 np0005486759.ooo.test sudo[86283]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxbo3z3r_/privsep.sock
Oct 14 08:40:13 np0005486759.ooo.test sudo[86283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:14 np0005486759.ooo.test sudo[86283]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:40:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:40:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:40:14 np0005486759.ooo.test podman[86289]: 2025-10-14 08:40:14.337255562 +0000 UTC m=+0.078070241 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T14:48:37, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Oct 14 08:40:14 np0005486759.ooo.test podman[86289]: 2025-10-14 08:40:14.378424468 +0000 UTC m=+0.119239117 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team)
Oct 14 08:40:14 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:40:14 np0005486759.ooo.test podman[86291]: 2025-10-14 08:40:14.391856944 +0000 UTC m=+0.124792939 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, version=17.1.9, release=1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:40:14 np0005486759.ooo.test systemd[1]: tmp-crun.xN4g9o.mount: Deactivated successfully.
Oct 14 08:40:14 np0005486759.ooo.test podman[86290]: 2025-10-14 08:40:14.44204252 +0000 UTC m=+0.178617698 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, name=rhosp17/openstack-collectd, version=17.1.9, com.redhat.component=openstack-collectd-container, tcib_managed=true, build-date=2025-07-21T13:04:03, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, distribution-scope=public)
Oct 14 08:40:14 np0005486759.ooo.test podman[86290]: 2025-10-14 08:40:14.48041969 +0000 UTC m=+0.216994838 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, tcib_managed=true, version=17.1.9, vcs-type=git)
Oct 14 08:40:14 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:40:14 np0005486759.ooo.test sudo[86367]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2gn8o4vq/privsep.sock
Oct 14 08:40:14 np0005486759.ooo.test sudo[86367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:14 np0005486759.ooo.test podman[86291]: 2025-10-14 08:40:14.604327761 +0000 UTC m=+0.337263726 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd)
Oct 14 08:40:14 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:40:15 np0005486759.ooo.test sudo[86367]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:15 np0005486759.ooo.test sudo[86378]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph78h6qql/privsep.sock
Oct 14 08:40:15 np0005486759.ooo.test sudo[86378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:16 np0005486759.ooo.test sudo[86378]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:16 np0005486759.ooo.test sudo[86389]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptdct7ddr/privsep.sock
Oct 14 08:40:16 np0005486759.ooo.test sudo[86389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:16 np0005486759.ooo.test sudo[86389]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:17 np0005486759.ooo.test sudo[86400]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqay4pnyb/privsep.sock
Oct 14 08:40:17 np0005486759.ooo.test sudo[86400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:17 np0005486759.ooo.test sudo[86400]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:17 np0005486759.ooo.test sudo[86417]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1a5d70bt/privsep.sock
Oct 14 08:40:17 np0005486759.ooo.test sudo[86417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:18 np0005486759.ooo.test sudo[86417]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:18 np0005486759.ooo.test sudo[86428]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq_n9jf2e/privsep.sock
Oct 14 08:40:18 np0005486759.ooo.test sudo[86428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:19 np0005486759.ooo.test sudo[86428]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:19 np0005486759.ooo.test sudo[86439]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx2o9mmbv/privsep.sock
Oct 14 08:40:19 np0005486759.ooo.test sudo[86439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:20 np0005486759.ooo.test sudo[86439]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:20 np0005486759.ooo.test sudo[86450]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6cwmdj0o/privsep.sock
Oct 14 08:40:20 np0005486759.ooo.test sudo[86450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:21 np0005486759.ooo.test sudo[86450]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:21 np0005486759.ooo.test sudo[86461]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf2evsmnc/privsep.sock
Oct 14 08:40:21 np0005486759.ooo.test sudo[86461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:21 np0005486759.ooo.test sudo[86461]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:22 np0005486759.ooo.test sudo[86472]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps_dvivcz/privsep.sock
Oct 14 08:40:22 np0005486759.ooo.test sudo[86472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:22 np0005486759.ooo.test sudo[86472]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:23 np0005486759.ooo.test sudo[86488]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmqywudo4/privsep.sock
Oct 14 08:40:23 np0005486759.ooo.test sudo[86488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:23 np0005486759.ooo.test sudo[86488]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:23 np0005486759.ooo.test sudo[86500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfbmj0fg8/privsep.sock
Oct 14 08:40:23 np0005486759.ooo.test sudo[86500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:24 np0005486759.ooo.test sudo[86500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:24 np0005486759.ooo.test sudo[86511]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp58dr3edw/privsep.sock
Oct 14 08:40:24 np0005486759.ooo.test sudo[86511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:25 np0005486759.ooo.test sudo[86511]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:40:25 np0005486759.ooo.test podman[86516]: 2025-10-14 08:40:25.343488371 +0000 UTC m=+0.079312739 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=logrotate_crond, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12)
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:40:25 np0005486759.ooo.test podman[86516]: 2025-10-14 08:40:25.386231477 +0000 UTC m=+0.122055825 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, tcib_managed=true, container_name=logrotate_crond, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9)
Oct 14 08:40:25 np0005486759.ooo.test podman[86545]: 2025-10-14 08:40:25.416829806 +0000 UTC m=+0.056638268 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4)
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:40:25 np0005486759.ooo.test podman[86545]: 2025-10-14 08:40:25.453505593 +0000 UTC m=+0.093314035 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:40:25 np0005486759.ooo.test podman[86545]: unhealthy
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:40:25 np0005486759.ooo.test sudo[86578]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnfoxxw8z/privsep.sock
Oct 14 08:40:25 np0005486759.ooo.test sudo[86578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:25 np0005486759.ooo.test podman[86518]: 2025-10-14 08:40:25.39147352 +0000 UTC m=+0.124967806 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO 
Team, tcib_managed=true, release=1, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:40:25 np0005486759.ooo.test podman[86518]: 2025-10-14 08:40:25.52729713 +0000 UTC m=+0.260791366 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9)
Oct 14 08:40:25 np0005486759.ooo.test podman[86518]: unhealthy
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:40:25 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:40:26 np0005486759.ooo.test sudo[86578]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:26 np0005486759.ooo.test sudo[86592]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxqpiz2kp/privsep.sock
Oct 14 08:40:26 np0005486759.ooo.test sudo[86592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:26 np0005486759.ooo.test sudo[86592]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:27 np0005486759.ooo.test sudo[86603]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd81ocixb/privsep.sock
Oct 14 08:40:27 np0005486759.ooo.test sudo[86603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:27 np0005486759.ooo.test sudo[86603]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:27 np0005486759.ooo.test sudo[86614]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphd8gdxl3/privsep.sock
Oct 14 08:40:27 np0005486759.ooo.test sudo[86614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:28 np0005486759.ooo.test sudo[86614]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:28 np0005486759.ooo.test sudo[86631]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmglwd_g1/privsep.sock
Oct 14 08:40:28 np0005486759.ooo.test sudo[86631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:29 np0005486759.ooo.test sudo[86631]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:29 np0005486759.ooo.test sudo[86642]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyqqfdo3_/privsep.sock
Oct 14 08:40:29 np0005486759.ooo.test sudo[86642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:30 np0005486759.ooo.test sudo[86642]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:30 np0005486759.ooo.test sudo[86653]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1szbhk2j/privsep.sock
Oct 14 08:40:30 np0005486759.ooo.test sudo[86653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:31 np0005486759.ooo.test sudo[86653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:40:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:40:31 np0005486759.ooo.test podman[86657]: 2025-10-14 08:40:31.189746399 +0000 UTC m=+0.069671671 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:40:31 np0005486759.ooo.test podman[86657]: 2025-10-14 08:40:31.222254327 +0000 UTC m=+0.102179599 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20250721.1)
Oct 14 08:40:31 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:40:31 np0005486759.ooo.test podman[86660]: 2025-10-14 08:40:31.314016722 +0000 UTC m=+0.190222328 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, build-date=2025-07-21T13:28:44, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:40:31 np0005486759.ooo.test podman[86660]: 2025-10-14 08:40:31.365364334 +0000 UTC m=+0.241569940 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=ovn_controller, vcs-type=git, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:40:31 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:40:31 np0005486759.ooo.test sudo[86709]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9jrtaf19/privsep.sock
Oct 14 08:40:31 np0005486759.ooo.test sudo[86709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:32 np0005486759.ooo.test sudo[86709]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:32 np0005486759.ooo.test sudo[86720]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb6h47zcd/privsep.sock
Oct 14 08:40:32 np0005486759.ooo.test sudo[86720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:32 np0005486759.ooo.test sudo[86720]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:33 np0005486759.ooo.test sudo[86731]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6in_7a9k/privsep.sock
Oct 14 08:40:33 np0005486759.ooo.test sudo[86731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:33 np0005486759.ooo.test sudo[86731]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:34 np0005486759.ooo.test sudo[86748]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn7ux2jnl/privsep.sock
Oct 14 08:40:34 np0005486759.ooo.test sudo[86748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:34 np0005486759.ooo.test sudo[86748]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:34 np0005486759.ooo.test sudo[86759]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpksyhda6n/privsep.sock
Oct 14 08:40:34 np0005486759.ooo.test sudo[86759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:35 np0005486759.ooo.test sudo[86759]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:35 np0005486759.ooo.test sudo[86866]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwuelmelwmpxbhftczcaawnullauzaue ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431235.0784655-138076-10546470033324/AnsiballZ_tempfile.py
Oct 14 08:40:35 np0005486759.ooo.test sudo[86866]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:40:35 np0005486759.ooo.test python3[86870]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Oct 14 08:40:35 np0005486759.ooo.test sudo[86866]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:35 np0005486759.ooo.test sudo[86890]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpna_bj71g/privsep.sock
Oct 14 08:40:35 np0005486759.ooo.test sudo[86890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:36 np0005486759.ooo.test sudo[86967]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dejuexjwtqpocnelorhwyszcnlwyidfq ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431235.965416-138121-63770922546946/AnsiballZ_copy.py
Oct 14 08:40:36 np0005486759.ooo.test sudo[86967]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:40:36 np0005486759.ooo.test python3[86969]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.h0cadudqtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:40:36 np0005486759.ooo.test sudo[86967]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:36 np0005486759.ooo.test sudo[86890]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:36 np0005486759.ooo.test sudo[87005]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf9l0l1h2/privsep.sock
Oct 14 08:40:36 np0005486759.ooo.test sudo[87005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:36 np0005486759.ooo.test sudo[87069]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxtxwzgbhtyvdzfverhaljrsftqgrodk ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431236.6549065-138156-10373240549169/AnsiballZ_blockinfile.py
Oct 14 08:40:36 np0005486759.ooo.test sudo[87069]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:40:37 np0005486759.ooo.test python3[87071]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.h0cadudqtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:40:37 np0005486759.ooo.test sudo[87069]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:37 np0005486759.ooo.test sudo[87005]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:37 np0005486759.ooo.test sudo[87094]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk3q2ibyl/privsep.sock
Oct 14 08:40:37 np0005486759.ooo.test sudo[87094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:37 np0005486759.ooo.test sudo[87171]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xffunczpheohxrveqlkwybktmqmlxlxv ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431237.6962113-138194-111553996889197/AnsiballZ_blockinfile.py
Oct 14 08:40:37 np0005486759.ooo.test sudo[87171]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:40:38 np0005486759.ooo.test sudo[87094]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:38 np0005486759.ooo.test python3[87173]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.h0cadudqtmphosts insertbefore=BOF block=192.168.122.109 np0005486761.ooo.test np0005486761
                                                      172.17.0.109 np0005486761.internalapi.ooo.test np0005486761.internalapi
                                                      192.168.122.109 np0005486761.ctlplane.ooo.test np0005486761.ctlplane
                                                      192.168.122.106 np0005486757.ooo.test np0005486757
                                                      172.17.0.106 np0005486757.internalapi.ooo.test np0005486757.internalapi
                                                      192.168.122.106 np0005486757.ctlplane.ooo.test np0005486757.ctlplane
                                                      192.168.122.107 np0005486759.ooo.test np0005486759
                                                      172.17.0.107 np0005486759.internalapi.ooo.test np0005486759.internalapi
                                                      192.168.122.107 np0005486759.ctlplane.ooo.test np0005486759.ctlplane
                                                      192.168.122.103 np0005486756.ooo.test np0005486756
                                                      172.17.0.103 np0005486756.internalapi.ooo.test np0005486756.internalapi
                                                      192.168.122.103 np0005486756.ctlplane.ooo.test np0005486756.ctlplane
                                                      
                                                       marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: all marker_end=END_HOST_ENTRIES_FOR_STACK: all state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:40:38 np0005486759.ooo.test sudo[87171]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:38 np0005486759.ooo.test sudo[87196]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp99wy3trj/privsep.sock
Oct 14 08:40:38 np0005486759.ooo.test sudo[87196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:38 np0005486759.ooo.test sudo[87273]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeuarstbbpgorlyhbzqzcmdqswbtfwtf ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431238.3096488-138232-205964405330006/AnsiballZ_command.py
Oct 14 08:40:38 np0005486759.ooo.test sudo[87273]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:40:38 np0005486759.ooo.test python3[87275]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.h0cadudqtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:40:38 np0005486759.ooo.test sudo[87273]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:38 np0005486759.ooo.test sudo[87196]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:39 np0005486759.ooo.test sudo[87333]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsyc7n7l8/privsep.sock
Oct 14 08:40:39 np0005486759.ooo.test sudo[87333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:39 np0005486759.ooo.test sudo[87381]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjrkzispfuucsqdarztdnqbjqgpqjyvn ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1760431239.0077689-138268-246241116555158/AnsiballZ_file.py
Oct 14 08:40:39 np0005486759.ooo.test sudo[87381]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1002)
Oct 14 08:40:39 np0005486759.ooo.test python3[87384]: ansible-file Invoked with path=/tmp/ansible.h0cadudqtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 08:40:39 np0005486759.ooo.test sudo[87381]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:39 np0005486759.ooo.test sudo[87333]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:39 np0005486759.ooo.test sudo[87407]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl7gnn5b8/privsep.sock
Oct 14 08:40:39 np0005486759.ooo.test sudo[87407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:40 np0005486759.ooo.test sudo[87407]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:40:40 np0005486759.ooo.test podman[87412]: 2025-10-14 08:40:40.589486337 +0000 UTC m=+0.063738567 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
container_name=nova_migration_target, release=1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37)
Oct 14 08:40:40 np0005486759.ooo.test sudo[87438]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbxw2aa3q/privsep.sock
Oct 14 08:40:40 np0005486759.ooo.test sudo[87438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:40 np0005486759.ooo.test podman[87412]: 2025-10-14 08:40:40.951256122 +0000 UTC m=+0.425508342 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20250721.1, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9)
Oct 14 08:40:40 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:40:41 np0005486759.ooo.test sudo[87438]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:41 np0005486759.ooo.test sudo[87451]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphkys2l_k/privsep.sock
Oct 14 08:40:41 np0005486759.ooo.test sudo[87451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:42 np0005486759.ooo.test sudo[87451]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:42 np0005486759.ooo.test sudo[87462]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphps_gg8q/privsep.sock
Oct 14 08:40:42 np0005486759.ooo.test sudo[87462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:42 np0005486759.ooo.test sudo[87462]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:43 np0005486759.ooo.test sudo[87473]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5wgalzmr/privsep.sock
Oct 14 08:40:43 np0005486759.ooo.test sudo[87473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:43 np0005486759.ooo.test sudo[87473]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:40:43 np0005486759.ooo.test podman[87478]: 2025-10-14 08:40:43.878741326 +0000 UTC m=+0.078480454 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
container_name=iscsid, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible)
Oct 14 08:40:43 np0005486759.ooo.test podman[87478]: 2025-10-14 08:40:43.913241625 +0000 UTC m=+0.112980703 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public)
Oct 14 08:40:43 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:40:44 np0005486759.ooo.test sudo[87502]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwoevncld/privsep.sock
Oct 14 08:40:44 np0005486759.ooo.test sudo[87502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:44 np0005486759.ooo.test sudo[87502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:40:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:40:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:40:44 np0005486759.ooo.test systemd[1]: tmp-crun.mRoFJY.mount: Deactivated successfully.
Oct 14 08:40:44 np0005486759.ooo.test podman[87514]: 2025-10-14 08:40:44.738447148 +0000 UTC m=+0.061248130 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack 
Platform 17.1 collectd, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, vcs-type=git)
Oct 14 08:40:44 np0005486759.ooo.test podman[87514]: 2025-10-14 08:40:44.753200435 +0000 UTC m=+0.076001417 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, build-date=2025-07-21T13:04:03, tcib_managed=true, release=2, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:40:44 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:40:44 np0005486759.ooo.test podman[87511]: 2025-10-14 08:40:44.787783676 +0000 UTC m=+0.111605080 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:40:44 np0005486759.ooo.test podman[87511]: 2025-10-14 08:40:44.812386689 +0000 UTC m=+0.136208103 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:40:44 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:40:44 np0005486759.ooo.test podman[87515]: 2025-10-14 08:40:44.889144919 +0000 UTC m=+0.207557966 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.9, architecture=x86_64)
Oct 14 08:40:44 np0005486759.ooo.test sudo[87591]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm5eqx1re/privsep.sock
Oct 14 08:40:44 np0005486759.ooo.test sudo[87591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:45 np0005486759.ooo.test podman[87515]: 2025-10-14 08:40:45.078234021 +0000 UTC m=+0.396647088 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=metrics_qdr)
Oct 14 08:40:45 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:40:45 np0005486759.ooo.test sudo[87591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:45 np0005486759.ooo.test sudo[87602]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpur8ihs_z/privsep.sock
Oct 14 08:40:45 np0005486759.ooo.test sudo[87602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:46 np0005486759.ooo.test sudo[87602]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:46 np0005486759.ooo.test sudo[87613]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_javv08w/privsep.sock
Oct 14 08:40:46 np0005486759.ooo.test sudo[87613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:47 np0005486759.ooo.test sudo[87613]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:47 np0005486759.ooo.test sudo[87624]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm8twozc9/privsep.sock
Oct 14 08:40:47 np0005486759.ooo.test sudo[87624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:48 np0005486759.ooo.test sudo[87624]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:48 np0005486759.ooo.test sudo[87635]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmf18tidh/privsep.sock
Oct 14 08:40:48 np0005486759.ooo.test sudo[87635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:48 np0005486759.ooo.test sudo[87635]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:49 np0005486759.ooo.test sudo[87646]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6u09838_/privsep.sock
Oct 14 08:40:49 np0005486759.ooo.test sudo[87646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:49 np0005486759.ooo.test sudo[87646]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:50 np0005486759.ooo.test sudo[87661]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp88_beutu/privsep.sock
Oct 14 08:40:50 np0005486759.ooo.test sudo[87661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:50 np0005486759.ooo.test sudo[87661]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:50 np0005486759.ooo.test sudo[87674]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjggsm874/privsep.sock
Oct 14 08:40:50 np0005486759.ooo.test sudo[87674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:51 np0005486759.ooo.test sudo[87674]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:51 np0005486759.ooo.test sudo[87685]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmpkdkfhq/privsep.sock
Oct 14 08:40:51 np0005486759.ooo.test sudo[87685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:52 np0005486759.ooo.test sudo[87685]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:52 np0005486759.ooo.test sudo[87696]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf7rzxt_g/privsep.sock
Oct 14 08:40:52 np0005486759.ooo.test sudo[87696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:53 np0005486759.ooo.test sudo[87696]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:53 np0005486759.ooo.test sudo[87707]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprl8lwue7/privsep.sock
Oct 14 08:40:53 np0005486759.ooo.test sudo[87707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:53 np0005486759.ooo.test sudo[87707]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:54 np0005486759.ooo.test sudo[87718]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppg_yctbw/privsep.sock
Oct 14 08:40:54 np0005486759.ooo.test sudo[87718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:54 np0005486759.ooo.test sudo[87718]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:55 np0005486759.ooo.test sudo[87729]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6dd_r3ek/privsep.sock
Oct 14 08:40:55 np0005486759.ooo.test sudo[87729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:55 np0005486759.ooo.test sudo[87729]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: tmp-crun.pD6XE6.mount: Deactivated successfully.
Oct 14 08:40:55 np0005486759.ooo.test podman[87741]: 2025-10-14 08:40:55.736884766 +0000 UTC m=+0.081672453 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.9, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, release=1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond)
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: tmp-crun.mycCAU.mount: Deactivated successfully.
Oct 14 08:40:55 np0005486759.ooo.test podman[87742]: 2025-10-14 08:40:55.80219495 +0000 UTC m=+0.139136275 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-type=git)
Oct 14 08:40:55 np0005486759.ooo.test podman[87741]: 2025-10-14 08:40:55.826522845 +0000 UTC m=+0.171310532 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, container_name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-cron, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9)
Oct 14 08:40:55 np0005486759.ooo.test podman[87743]: 2025-10-14 08:40:55.781244281 +0000 UTC m=+0.118566857 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, architecture=x86_64)
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:40:55 np0005486759.ooo.test podman[87742]: 2025-10-14 08:40:55.841321473 +0000 UTC m=+0.178262748 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=1, maintainer=OpenStack TripleO Team, version=17.1.9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1)
Oct 14 08:40:55 np0005486759.ooo.test podman[87742]: unhealthy
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:40:55 np0005486759.ooo.test sudo[87802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpguj4kmyy/privsep.sock
Oct 14 08:40:55 np0005486759.ooo.test sudo[87802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:55 np0005486759.ooo.test podman[87743]: 2025-10-14 08:40:55.868341281 +0000 UTC m=+0.205663837 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, version=17.1.9, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Oct 14 08:40:55 np0005486759.ooo.test podman[87743]: unhealthy
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:40:55 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:40:56 np0005486759.ooo.test sudo[87802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:56 np0005486759.ooo.test sudo[87813]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_g9s7l4v/privsep.sock
Oct 14 08:40:56 np0005486759.ooo.test sudo[87813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:57 np0005486759.ooo.test sudo[87813]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:57 np0005486759.ooo.test sudo[87824]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdt15qpbd/privsep.sock
Oct 14 08:40:57 np0005486759.ooo.test sudo[87824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:58 np0005486759.ooo.test sudo[87824]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:58 np0005486759.ooo.test sudo[87835]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpylgut55y/privsep.sock
Oct 14 08:40:58 np0005486759.ooo.test sudo[87835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:58 np0005486759.ooo.test sudo[87835]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:59 np0005486759.ooo.test sudo[87846]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfiknjmkb/privsep.sock
Oct 14 08:40:59 np0005486759.ooo.test sudo[87846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:40:59 np0005486759.ooo.test sudo[87846]: pam_unix(sudo:session): session closed for user root
Oct 14 08:40:59 np0005486759.ooo.test sudo[87857]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpscikr914/privsep.sock
Oct 14 08:40:59 np0005486759.ooo.test sudo[87857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:00 np0005486759.ooo.test sudo[87857]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:00 np0005486759.ooo.test sudo[87874]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb6_3bz0g/privsep.sock
Oct 14 08:41:00 np0005486759.ooo.test sudo[87874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:41:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:41:01 np0005486759.ooo.test systemd[1]: tmp-crun.QFRYwU.mount: Deactivated successfully.
Oct 14 08:41:01 np0005486759.ooo.test podman[87877]: 2025-10-14 08:41:01.461677487 +0000 UTC m=+0.091674832 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-type=git, version=17.1.9, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:41:01 np0005486759.ooo.test sudo[87874]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:01 np0005486759.ooo.test podman[87877]: 2025-10-14 08:41:01.556533248 +0000 UTC m=+0.186530593 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Oct 14 08:41:01 np0005486759.ooo.test podman[87893]: 2025-10-14 08:41:01.565049592 +0000 UTC m=+0.104506120 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:41:01 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:41:01 np0005486759.ooo.test podman[87893]: 2025-10-14 08:41:01.588783868 +0000 UTC m=+0.128240396 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, version=17.1.9, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, vendor=Red Hat, Inc.)
Oct 14 08:41:01 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:41:01 np0005486759.ooo.test sudo[87934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr64k2h9y/privsep.sock
Oct 14 08:41:01 np0005486759.ooo.test sudo[87934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:02 np0005486759.ooo.test sudo[87934]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:02 np0005486759.ooo.test sudo[87945]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpskiv2m3r/privsep.sock
Oct 14 08:41:02 np0005486759.ooo.test sudo[87945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:03 np0005486759.ooo.test sudo[87945]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:03 np0005486759.ooo.test sudo[87956]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy9nvrdta/privsep.sock
Oct 14 08:41:03 np0005486759.ooo.test sudo[87956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:04 np0005486759.ooo.test sudo[87956]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:04 np0005486759.ooo.test sudo[87967]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxxozyb4_/privsep.sock
Oct 14 08:41:04 np0005486759.ooo.test sudo[87967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:04 np0005486759.ooo.test sudo[87967]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:05 np0005486759.ooo.test sudo[87978]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprzhqo_vm/privsep.sock
Oct 14 08:41:05 np0005486759.ooo.test sudo[87978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:05 np0005486759.ooo.test sudo[87978]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:06 np0005486759.ooo.test sudo[87991]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq1xy6swb/privsep.sock
Oct 14 08:41:06 np0005486759.ooo.test sudo[87991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:06 np0005486759.ooo.test sudo[87991]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:06 np0005486759.ooo.test sudo[88006]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp98ou05tw/privsep.sock
Oct 14 08:41:06 np0005486759.ooo.test sudo[88006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:07 np0005486759.ooo.test sudo[88006]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:07 np0005486759.ooo.test sudo[88017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfbmpsjt2/privsep.sock
Oct 14 08:41:07 np0005486759.ooo.test sudo[88017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:08 np0005486759.ooo.test sudo[88017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:08 np0005486759.ooo.test sudo[88028]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp10ks3owy/privsep.sock
Oct 14 08:41:08 np0005486759.ooo.test sudo[88028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:09 np0005486759.ooo.test sudo[88028]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:09 np0005486759.ooo.test sudo[88039]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkdnnujff/privsep.sock
Oct 14 08:41:09 np0005486759.ooo.test sudo[88039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:10 np0005486759.ooo.test sudo[88039]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:10 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:41:10 np0005486759.ooo.test recover_tripleo_nova_virtqemud[88046]: 47951
Oct 14 08:41:10 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:41:10 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:41:10 np0005486759.ooo.test sudo[88052]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphiapnp14/privsep.sock
Oct 14 08:41:10 np0005486759.ooo.test sudo[88052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:10 np0005486759.ooo.test sudo[88052]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:11 np0005486759.ooo.test sudo[88063]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyajp3hu3/privsep.sock
Oct 14 08:41:11 np0005486759.ooo.test sudo[88063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:41:11 np0005486759.ooo.test systemd[1]: tmp-crun.HsYp5k.mount: Deactivated successfully.
Oct 14 08:41:11 np0005486759.ooo.test podman[88065]: 2025-10-14 08:41:11.316774332 +0000 UTC m=+0.098643569 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_migration_target, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vcs-type=git)
Oct 14 08:41:11 np0005486759.ooo.test podman[88065]: 2025-10-14 08:41:11.675711839 +0000 UTC m=+0.457581046 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, version=17.1.9, io.openshift.expose-services=, container_name=nova_migration_target, maintainer=OpenStack TripleO Team)
Oct 14 08:41:11 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:41:11 np0005486759.ooo.test sudo[88063]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:12 np0005486759.ooo.test sudo[88103]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8g1sfs9q/privsep.sock
Oct 14 08:41:12 np0005486759.ooo.test sudo[88103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:12 np0005486759.ooo.test sudo[88103]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:12 np0005486759.ooo.test sudo[88114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzdttde8q/privsep.sock
Oct 14 08:41:12 np0005486759.ooo.test sudo[88114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:13 np0005486759.ooo.test sudo[88114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:13 np0005486759.ooo.test sudo[88125]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9elt9d72/privsep.sock
Oct 14 08:41:13 np0005486759.ooo.test sudo[88125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:14 np0005486759.ooo.test sudo[88125]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:41:14 np0005486759.ooo.test podman[88129]: 2025-10-14 08:41:14.375403791 +0000 UTC m=+0.051970531 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, release=1)
Oct 14 08:41:14 np0005486759.ooo.test podman[88129]: 2025-10-14 08:41:14.406114873 +0000 UTC m=+0.082681623 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, 
io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1)
Oct 14 08:41:14 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:41:14 np0005486759.ooo.test sudo[88155]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpahkwk2km/privsep.sock
Oct 14 08:41:14 np0005486759.ooo.test sudo[88155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:15 np0005486759.ooo.test sudo[88155]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:41:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:41:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:41:15 np0005486759.ooo.test systemd[1]: tmp-crun.Xa3svb.mount: Deactivated successfully.
Oct 14 08:41:15 np0005486759.ooo.test podman[88170]: 2025-10-14 08:41:15.178066175 +0000 UTC m=+0.055377488 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=)
Oct 14 08:41:15 np0005486759.ooo.test podman[88167]: 2025-10-14 08:41:15.227136176 +0000 UTC m=+0.109159186 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, tcib_managed=true, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container)
Oct 14 08:41:15 np0005486759.ooo.test podman[88167]: 2025-10-14 08:41:15.259830069 +0000 UTC m=+0.141853129 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9)
Oct 14 08:41:15 np0005486759.ooo.test podman[88159]: 2025-10-14 08:41:15.161115669 +0000 UTC m=+0.055212662 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:41:15 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:41:15 np0005486759.ooo.test sudo[88237]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe9k1rl5x/privsep.sock
Oct 14 08:41:15 np0005486759.ooo.test sudo[88237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:15 np0005486759.ooo.test podman[88159]: 2025-10-14 08:41:15.339745996 +0000 UTC m=+0.233843009 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12)
Oct 14 08:41:15 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:41:15 np0005486759.ooo.test podman[88170]: 2025-10-14 08:41:15.395872466 +0000 UTC m=+0.273183829 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:41:15 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:41:15 np0005486759.ooo.test sudo[88237]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:16 np0005486759.ooo.test sudo[88248]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyhb_20fw/privsep.sock
Oct 14 08:41:16 np0005486759.ooo.test sudo[88248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:16 np0005486759.ooo.test sudo[88248]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:16 np0005486759.ooo.test sudo[88264]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnylh_088/privsep.sock
Oct 14 08:41:16 np0005486759.ooo.test sudo[88264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:17 np0005486759.ooo.test sudo[88264]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:17 np0005486759.ooo.test sudo[88276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp30sjve7c/privsep.sock
Oct 14 08:41:17 np0005486759.ooo.test sudo[88276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:18 np0005486759.ooo.test sshd[88279]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 08:41:18 np0005486759.ooo.test sshd[88279]: Accepted publickey for zuul from 38.102.83.114 port 51402 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 08:41:18 np0005486759.ooo.test systemd-logind[759]: New session 17 of user zuul.
Oct 14 08:41:18 np0005486759.ooo.test systemd[1]: Started Session 17 of User zuul.
Oct 14 08:41:18 np0005486759.ooo.test sshd[88279]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 08:41:18 np0005486759.ooo.test sudo[88296]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mktsxkkeygarhwegbujrgllejbcicofv ; /usr/bin/python3
Oct 14 08:41:18 np0005486759.ooo.test sudo[88296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:41:18 np0005486759.ooo.test sudo[88276]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:18 np0005486759.ooo.test python3[88299]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:41:18 np0005486759.ooo.test sudo[88308]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn0otf1z6/privsep.sock
Oct 14 08:41:18 np0005486759.ooo.test sudo[88308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:19 np0005486759.ooo.test sudo[88308]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:19 np0005486759.ooo.test sudo[88319]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmsqsh73h/privsep.sock
Oct 14 08:41:19 np0005486759.ooo.test sudo[88319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:19 np0005486759.ooo.test sudo[88319]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:20 np0005486759.ooo.test sudo[88330]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu5j3nqe7/privsep.sock
Oct 14 08:41:20 np0005486759.ooo.test sudo[88330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:20 np0005486759.ooo.test sudo[88330]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:20 np0005486759.ooo.test sudo[88341]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6wlu5le8/privsep.sock
Oct 14 08:41:20 np0005486759.ooo.test sudo[88341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:21 np0005486759.ooo.test sudo[88296]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:21 np0005486759.ooo.test sudo[88341]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:21 np0005486759.ooo.test sudo[88352]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsjbpb_pe/privsep.sock
Oct 14 08:41:21 np0005486759.ooo.test sudo[88352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:22 np0005486759.ooo.test sudo[88352]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:22 np0005486759.ooo.test sudo[88369]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkmftckwh/privsep.sock
Oct 14 08:41:22 np0005486759.ooo.test sudo[88369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:23 np0005486759.ooo.test sudo[88369]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:23 np0005486759.ooo.test sudo[88380]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpga0dua2y/privsep.sock
Oct 14 08:41:23 np0005486759.ooo.test sudo[88380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:23 np0005486759.ooo.test sudo[88380]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:24 np0005486759.ooo.test sudo[88391]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpken5mto0/privsep.sock
Oct 14 08:41:24 np0005486759.ooo.test sudo[88391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:24 np0005486759.ooo.test sudo[88391]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:25 np0005486759.ooo.test sudo[88402]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4wbm0plc/privsep.sock
Oct 14 08:41:25 np0005486759.ooo.test sudo[88402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:25 np0005486759.ooo.test sudo[88402]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:25 np0005486759.ooo.test sudo[88413]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkhmcy4wa/privsep.sock
Oct 14 08:41:25 np0005486759.ooo.test sudo[88413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:41:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:41:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:41:26 np0005486759.ooo.test podman[88417]: 2025-10-14 08:41:26.038762883 +0000 UTC m=+0.082558741 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, distribution-scope=public, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, release=1, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi)
Oct 14 08:41:26 np0005486759.ooo.test podman[88417]: 2025-10-14 08:41:26.08221689 +0000 UTC m=+0.126012748 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., tcib_managed=true)
Oct 14 08:41:26 np0005486759.ooo.test podman[88417]: unhealthy
Oct 14 08:41:26 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:41:26 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:41:26 np0005486759.ooo.test podman[88415]: 2025-10-14 08:41:26.119983991 +0000 UTC m=+0.165634127 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, container_name=logrotate_crond, tcib_managed=true, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:41:26 np0005486759.ooo.test podman[88416]: 2025-10-14 08:41:26.096517813 +0000 UTC m=+0.139478335 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Oct 14 08:41:26 np0005486759.ooo.test podman[88415]: 2025-10-14 08:41:26.152256051 +0000 UTC m=+0.197906207 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true)
Oct 14 08:41:26 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:41:26 np0005486759.ooo.test podman[88416]: 2025-10-14 08:41:26.18029509 +0000 UTC m=+0.223255612 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, version=17.1.9, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:41:26 np0005486759.ooo.test podman[88416]: unhealthy
Oct 14 08:41:26 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:41:26 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:41:26 np0005486759.ooo.test sudo[88413]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:26 np0005486759.ooo.test sudo[88483]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg7sx6in1/privsep.sock
Oct 14 08:41:26 np0005486759.ooo.test sudo[88483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:27 np0005486759.ooo.test systemd[1]: tmp-crun.S79bJx.mount: Deactivated successfully.
Oct 14 08:41:27 np0005486759.ooo.test sudo[88483]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:27 np0005486759.ooo.test sudo[88497]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz6l1t0oc/privsep.sock
Oct 14 08:41:27 np0005486759.ooo.test sudo[88497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:28 np0005486759.ooo.test sudo[88497]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:28 np0005486759.ooo.test sudo[88511]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpruozs_ug/privsep.sock
Oct 14 08:41:28 np0005486759.ooo.test sudo[88511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:29 np0005486759.ooo.test sudo[88511]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:29 np0005486759.ooo.test sudo[88522]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpakwx6jfr/privsep.sock
Oct 14 08:41:29 np0005486759.ooo.test sudo[88522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:29 np0005486759.ooo.test sudo[88522]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:30 np0005486759.ooo.test sudo[88533]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_dmpcxge/privsep.sock
Oct 14 08:41:30 np0005486759.ooo.test sudo[88533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:30 np0005486759.ooo.test sudo[88533]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:30 np0005486759.ooo.test sudo[88544]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3kx4x62z/privsep.sock
Oct 14 08:41:30 np0005486759.ooo.test sudo[88544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:31 np0005486759.ooo.test sudo[88544]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:41:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:41:31 np0005486759.ooo.test podman[88551]: 2025-10-14 08:41:31.674989539 +0000 UTC m=+0.046030058 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, distribution-scope=public, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, release=1, vendor=Red Hat, Inc.)
Oct 14 08:41:31 np0005486759.ooo.test podman[88551]: 2025-10-14 08:41:31.719969534 +0000 UTC m=+0.091010053 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team)
Oct 14 08:41:31 np0005486759.ooo.test systemd[1]: tmp-crun.gfnQOR.mount: Deactivated successfully.
Oct 14 08:41:31 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:41:31 np0005486759.ooo.test podman[88549]: 2025-10-14 08:41:31.725868707 +0000 UTC m=+0.097462333 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=1, batch=17.1_20250721.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:41:31 np0005486759.ooo.test podman[88549]: 2025-10-14 08:41:31.805227367 +0000 UTC m=+0.176820993 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, architecture=x86_64, version=17.1.9)
Oct 14 08:41:31 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:41:31 np0005486759.ooo.test sudo[88604]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb4ax_tby/privsep.sock
Oct 14 08:41:31 np0005486759.ooo.test sudo[88604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:32 np0005486759.ooo.test sudo[88604]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:32 np0005486759.ooo.test sudo[88615]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5q2hx429/privsep.sock
Oct 14 08:41:32 np0005486759.ooo.test sudo[88615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:33 np0005486759.ooo.test sudo[88615]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:33 np0005486759.ooo.test sudo[88632]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp10ehdmk8/privsep.sock
Oct 14 08:41:33 np0005486759.ooo.test sudo[88632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:34 np0005486759.ooo.test sudo[88632]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:34 np0005486759.ooo.test sudo[88643]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfj5cnnyz/privsep.sock
Oct 14 08:41:34 np0005486759.ooo.test sudo[88643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:35 np0005486759.ooo.test sudo[88643]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:35 np0005486759.ooo.test sudo[88654]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa0c1mgi4/privsep.sock
Oct 14 08:41:35 np0005486759.ooo.test sudo[88654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:35 np0005486759.ooo.test sudo[88654]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:36 np0005486759.ooo.test sudo[88665]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwk1bmw_f/privsep.sock
Oct 14 08:41:36 np0005486759.ooo.test sudo[88665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:36 np0005486759.ooo.test sudo[88665]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:36 np0005486759.ooo.test sudo[88676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4zyhyr95/privsep.sock
Oct 14 08:41:36 np0005486759.ooo.test sudo[88676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:37 np0005486759.ooo.test sudo[88676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:37 np0005486759.ooo.test sudo[88687]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpki1i0v35/privsep.sock
Oct 14 08:41:37 np0005486759.ooo.test sudo[88687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:38 np0005486759.ooo.test sudo[88687]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:38 np0005486759.ooo.test sudo[88704]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpstbrmhnx/privsep.sock
Oct 14 08:41:38 np0005486759.ooo.test sudo[88704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:39 np0005486759.ooo.test sudo[88704]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:39 np0005486759.ooo.test sudo[88715]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_bc_ioq9/privsep.sock
Oct 14 08:41:39 np0005486759.ooo.test sudo[88715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:40 np0005486759.ooo.test sudo[88715]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:40 np0005486759.ooo.test sudo[88726]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp14_9thlr/privsep.sock
Oct 14 08:41:40 np0005486759.ooo.test sudo[88726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:40 np0005486759.ooo.test sudo[88726]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:41 np0005486759.ooo.test sudo[88737]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphjnzi_23/privsep.sock
Oct 14 08:41:41 np0005486759.ooo.test sudo[88737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:41 np0005486759.ooo.test sudo[88737]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:41:41 np0005486759.ooo.test podman[88741]: 2025-10-14 08:41:41.899860445 +0000 UTC m=+0.058996399 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:41:42 np0005486759.ooo.test sudo[88769]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8c1setnb/privsep.sock
Oct 14 08:41:42 np0005486759.ooo.test sudo[88769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:42 np0005486759.ooo.test podman[88741]: 2025-10-14 08:41:42.280403423 +0000 UTC m=+0.439539317 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:41:42 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:41:42 np0005486759.ooo.test sudo[88769]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:42 np0005486759.ooo.test sudo[88781]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1hsyostd/privsep.sock
Oct 14 08:41:42 np0005486759.ooo.test sudo[88781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:43 np0005486759.ooo.test sudo[88781]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:43 np0005486759.ooo.test sudo[88795]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqbnzy3i9/privsep.sock
Oct 14 08:41:43 np0005486759.ooo.test sudo[88795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:44 np0005486759.ooo.test sudo[88795]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:44 np0005486759.ooo.test sudo[88809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpek678hi4/privsep.sock
Oct 14 08:41:44 np0005486759.ooo.test sudo[88809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:41:44 np0005486759.ooo.test systemd[1]: tmp-crun.JZTVxx.mount: Deactivated successfully.
Oct 14 08:41:44 np0005486759.ooo.test podman[88811]: 2025-10-14 08:41:44.720662312 +0000 UTC m=+0.073118408 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:27:15, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.9, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Oct 14 08:41:44 np0005486759.ooo.test podman[88811]: 2025-10-14 08:41:44.726165973 +0000 UTC m=+0.078622069 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., release=1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, build-date=2025-07-21T13:27:15, tcib_managed=true, batch=17.1_20250721.1)
Oct 14 08:41:44 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:41:45 np0005486759.ooo.test sudo[88809]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:41:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:41:45 np0005486759.ooo.test sudo[88857]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6t4abqpi/privsep.sock
Oct 14 08:41:45 np0005486759.ooo.test sudo[88857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:41:45 np0005486759.ooo.test podman[88835]: 2025-10-14 08:41:45.441817469 +0000 UTC m=+0.074125260 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, version=17.1.9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, container_name=collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 08:41:45 np0005486759.ooo.test podman[88842]: 2025-10-14 08:41:45.470302482 +0000 UTC m=+0.096255305 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, tcib_managed=true, release=1)
Oct 14 08:41:45 np0005486759.ooo.test podman[88835]: 2025-10-14 08:41:45.472667495 +0000 UTC m=+0.104975296 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, version=17.1.9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, 
build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc.)
Oct 14 08:41:45 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:41:45 np0005486759.ooo.test podman[88842]: 2025-10-14 08:41:45.498211027 +0000 UTC m=+0.124163890 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, name=rhosp17/openstack-nova-compute, release=1, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5)
Oct 14 08:41:45 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:41:45 np0005486759.ooo.test podman[88868]: 2025-10-14 08:41:45.580023663 +0000 UTC m=+0.128293719 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, 
config_id=tripleo_step1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr)
Oct 14 08:41:45 np0005486759.ooo.test podman[88868]: 2025-10-14 08:41:45.76344389 +0000 UTC m=+0.311713986 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, 
version=17.1.9, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:41:45 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:41:45 np0005486759.ooo.test sudo[88857]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:46 np0005486759.ooo.test sudo[88923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpowv6ha_z/privsep.sock
Oct 14 08:41:46 np0005486759.ooo.test sudo[88923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:46 np0005486759.ooo.test sudo[88923]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:47 np0005486759.ooo.test sudo[88934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptppu7vjs/privsep.sock
Oct 14 08:41:47 np0005486759.ooo.test sudo[88934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:47 np0005486759.ooo.test sudo[88934]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:47 np0005486759.ooo.test sudo[88945]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxhkew6kn/privsep.sock
Oct 14 08:41:47 np0005486759.ooo.test sudo[88945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:48 np0005486759.ooo.test sudo[88945]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:48 np0005486759.ooo.test sudo[88956]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj4dftjbp/privsep.sock
Oct 14 08:41:48 np0005486759.ooo.test sudo[88956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:49 np0005486759.ooo.test sudo[88956]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:49 np0005486759.ooo.test sudo[88973]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0_b0pn15/privsep.sock
Oct 14 08:41:49 np0005486759.ooo.test sudo[88973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:50 np0005486759.ooo.test sudo[88973]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:50 np0005486759.ooo.test sudo[88984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu7pd0hje/privsep.sock
Oct 14 08:41:50 np0005486759.ooo.test sudo[88984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:50 np0005486759.ooo.test sudo[88984]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:51 np0005486759.ooo.test sudo[88995]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwt3rdbhj/privsep.sock
Oct 14 08:41:51 np0005486759.ooo.test sudo[88995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:51 np0005486759.ooo.test sudo[88995]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:52 np0005486759.ooo.test sudo[89006]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyt67yke6/privsep.sock
Oct 14 08:41:52 np0005486759.ooo.test sudo[89006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:52 np0005486759.ooo.test sudo[89006]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:52 np0005486759.ooo.test sudo[89017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp10t7vxct/privsep.sock
Oct 14 08:41:52 np0005486759.ooo.test sudo[89017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:53 np0005486759.ooo.test sudo[89017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:53 np0005486759.ooo.test sudo[89028]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpacf2zvzz/privsep.sock
Oct 14 08:41:53 np0005486759.ooo.test sudo[89028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:54 np0005486759.ooo.test sudo[89028]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:54 np0005486759.ooo.test sudo[89041]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbs6otiix/privsep.sock
Oct 14 08:41:54 np0005486759.ooo.test sudo[89041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:55 np0005486759.ooo.test sudo[89041]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:55 np0005486759.ooo.test sudo[89056]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpegd8boyl/privsep.sock
Oct 14 08:41:55 np0005486759.ooo.test sudo[89056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:55 np0005486759.ooo.test sudo[89056]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:56 np0005486759.ooo.test sudo[89067]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb1oh5w9e/privsep.sock
Oct 14 08:41:56 np0005486759.ooo.test sudo[89067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:41:56 np0005486759.ooo.test podman[89069]: 2025-10-14 08:41:56.204286792 +0000 UTC m=+0.090865289 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20250721.1, maintainer=OpenStack TripleO Team)
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:41:56 np0005486759.ooo.test podman[89069]: 2025-10-14 08:41:56.223355592 +0000 UTC m=+0.109934079 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:41:56 np0005486759.ooo.test podman[89069]: unhealthy
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:41:56 np0005486759.ooo.test podman[89088]: 2025-10-14 08:41:56.32261978 +0000 UTC m=+0.110677822 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:41:56 np0005486759.ooo.test podman[89088]: 2025-10-14 08:41:56.330094931 +0000 UTC m=+0.118153003 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:41:56 np0005486759.ooo.test podman[89088]: unhealthy
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: tmp-crun.Fv2q9N.mount: Deactivated successfully.
Oct 14 08:41:56 np0005486759.ooo.test podman[89086]: 2025-10-14 08:41:56.367832222 +0000 UTC m=+0.158379642 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-cron, 
release=1, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:41:56 np0005486759.ooo.test podman[89086]: 2025-10-14 08:41:56.373071884 +0000 UTC m=+0.163619314 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:41:56 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:41:56 np0005486759.ooo.test sudo[89067]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:56 np0005486759.ooo.test sudo[89134]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmzrwv0e6/privsep.sock
Oct 14 08:41:56 np0005486759.ooo.test sudo[89134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:57 np0005486759.ooo.test sudo[89134]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:57 np0005486759.ooo.test sudo[89145]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo_a9g3tw/privsep.sock
Oct 14 08:41:57 np0005486759.ooo.test sudo[89145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:58 np0005486759.ooo.test sudo[89145]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:58 np0005486759.ooo.test sudo[89156]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr_xaty5q/privsep.sock
Oct 14 08:41:58 np0005486759.ooo.test sudo[89156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:59 np0005486759.ooo.test sudo[89173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avbbefbzvgkfdjcemgbonbajuvdaakzb ; /usr/bin/python3
Oct 14 08:41:59 np0005486759.ooo.test sudo[89156]: pam_unix(sudo:session): session closed for user root
Oct 14 08:41:59 np0005486759.ooo.test sudo[89173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:41:59 np0005486759.ooo.test sudo[89183]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqics7710/privsep.sock
Oct 14 08:41:59 np0005486759.ooo.test sudo[89183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:41:59 np0005486759.ooo.test python3[89175]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 14 08:41:59 np0005486759.ooo.test sudo[89183]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:00 np0005486759.ooo.test sudo[89201]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvj12pgp1/privsep.sock
Oct 14 08:42:00 np0005486759.ooo.test sudo[89201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:00 np0005486759.ooo.test sudo[89201]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:01 np0005486759.ooo.test sudo[89212]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp68y3uk2t/privsep.sock
Oct 14 08:42:01 np0005486759.ooo.test sudo[89212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:01 np0005486759.ooo.test sudo[89212]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:01 np0005486759.ooo.test sudo[89223]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb5ib32y0/privsep.sock
Oct 14 08:42:01 np0005486759.ooo.test sudo[89223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:42:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:42:01 np0005486759.ooo.test systemd[1]: tmp-crun.Lo1cnY.mount: Deactivated successfully.
Oct 14 08:42:01 np0005486759.ooo.test podman[89226]: 2025-10-14 08:42:01.950717905 +0000 UTC m=+0.073573802 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Oct 14 08:42:01 np0005486759.ooo.test podman[89226]: 2025-10-14 08:42:01.969885609 +0000 UTC m=+0.092741506 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, version=17.1.9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git)
Oct 14 08:42:01 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:42:02 np0005486759.ooo.test podman[89225]: 2025-10-14 08:42:02.046819973 +0000 UTC m=+0.164736888 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:42:02 np0005486759.ooo.test podman[89225]: 2025-10-14 08:42:02.096915817 +0000 UTC m=+0.214832712 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53)
Oct 14 08:42:02 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:42:02 np0005486759.ooo.test sudo[89223]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:02 np0005486759.ooo.test sudo[89282]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp75xi_n4g/privsep.sock
Oct 14 08:42:02 np0005486759.ooo.test sudo[89282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:42:03 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 08:42:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 08:42:03 np0005486759.ooo.test sudo[89282]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:03 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 08:42:03 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 08:42:03 np0005486759.ooo.test systemd[1]: run-r76c43dc8a40b4045b7776c8c103f22f5.service: Deactivated successfully.
Oct 14 08:42:03 np0005486759.ooo.test systemd[1]: run-r9b4279a4db7b458cbf9366df113f8f0b.service: Deactivated successfully.
Oct 14 08:42:03 np0005486759.ooo.test sudo[89440]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphw8aphv7/privsep.sock
Oct 14 08:42:03 np0005486759.ooo.test sudo[89440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:04 np0005486759.ooo.test sudo[89173]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:04 np0005486759.ooo.test sudo[89440]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:04 np0005486759.ooo.test sudo[89451]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnspmhbib/privsep.sock
Oct 14 08:42:04 np0005486759.ooo.test sudo[89451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:05 np0005486759.ooo.test sudo[89451]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:05 np0005486759.ooo.test sudo[89465]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprpyifh_b/privsep.sock
Oct 14 08:42:05 np0005486759.ooo.test sudo[89465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:05 np0005486759.ooo.test sudo[89465]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:06 np0005486759.ooo.test sudo[89479]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz_btur9e/privsep.sock
Oct 14 08:42:06 np0005486759.ooo.test sudo[89479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:06 np0005486759.ooo.test sudo[89479]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:06 np0005486759.ooo.test sudo[89490]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp660mp_4e/privsep.sock
Oct 14 08:42:06 np0005486759.ooo.test sudo[89490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:07 np0005486759.ooo.test sudo[89490]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:07 np0005486759.ooo.test sudo[89501]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb8bq3ijm/privsep.sock
Oct 14 08:42:07 np0005486759.ooo.test sudo[89501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:08 np0005486759.ooo.test sudo[89501]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:08 np0005486759.ooo.test sudo[89512]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl084qpvw/privsep.sock
Oct 14 08:42:08 np0005486759.ooo.test sudo[89512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:09 np0005486759.ooo.test sudo[89512]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:09 np0005486759.ooo.test sudo[89523]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk4wly6yd/privsep.sock
Oct 14 08:42:09 np0005486759.ooo.test sudo[89523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:10 np0005486759.ooo.test sudo[89523]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:10 np0005486759.ooo.test sudo[89534]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9x70cw59/privsep.sock
Oct 14 08:42:10 np0005486759.ooo.test sudo[89534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:10 np0005486759.ooo.test sudo[89534]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:11 np0005486759.ooo.test sudo[89551]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbnrbgijg/privsep.sock
Oct 14 08:42:11 np0005486759.ooo.test sudo[89551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:11 np0005486759.ooo.test sudo[89551]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:12 np0005486759.ooo.test sudo[89562]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppu31iybr/privsep.sock
Oct 14 08:42:12 np0005486759.ooo.test sudo[89562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:42:12 np0005486759.ooo.test podman[89565]: 2025-10-14 08:42:12.434355724 +0000 UTC m=+0.067781683 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, container_name=nova_migration_target, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:42:12 np0005486759.ooo.test sudo[89562]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:12 np0005486759.ooo.test podman[89565]: 2025-10-14 08:42:12.805358686 +0000 UTC m=+0.438784585 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20250721.1, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64)
Oct 14 08:42:12 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:42:12 np0005486759.ooo.test sudo[89596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoxmf5o52/privsep.sock
Oct 14 08:42:12 np0005486759.ooo.test sudo[89596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:13 np0005486759.ooo.test sudo[89596]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:13 np0005486759.ooo.test sudo[89607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpst87e_fp/privsep.sock
Oct 14 08:42:13 np0005486759.ooo.test sudo[89607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:14 np0005486759.ooo.test sudo[89607]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:14 np0005486759.ooo.test sudo[89618]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8tbyu0s5/privsep.sock
Oct 14 08:42:14 np0005486759.ooo.test sudo[89618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:15 np0005486759.ooo.test sudo[89618]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:42:15 np0005486759.ooo.test systemd[1]: tmp-crun.3anDhF.mount: Deactivated successfully.
Oct 14 08:42:15 np0005486759.ooo.test podman[89624]: 2025-10-14 08:42:15.360129945 +0000 UTC m=+0.093064807 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-type=git, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:42:15 np0005486759.ooo.test podman[89624]: 2025-10-14 08:42:15.402250541 +0000 UTC m=+0.135185353 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, container_name=iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible)
Oct 14 08:42:15 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:42:15 np0005486759.ooo.test sudo[89648]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0yeshiep/privsep.sock
Oct 14 08:42:15 np0005486759.ooo.test sudo[89648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:42:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:42:15 np0005486759.ooo.test podman[89651]: 2025-10-14 08:42:15.64900869 +0000 UTC m=+0.081630891 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, version=17.1.9, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12)
Oct 14 08:42:15 np0005486759.ooo.test podman[89650]: 2025-10-14 08:42:15.625981996 +0000 UTC m=+0.064342196 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, name=rhosp17/openstack-nova-compute, release=1, vendor=Red Hat, Inc.)
Oct 14 08:42:15 np0005486759.ooo.test podman[89651]: 2025-10-14 08:42:15.683338294 +0000 UTC m=+0.115960515 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-type=git)
Oct 14 08:42:15 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:42:15 np0005486759.ooo.test podman[89650]: 2025-10-14 08:42:15.706918055 +0000 UTC m=+0.145278155 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1)
Oct 14 08:42:15 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:42:16 np0005486759.ooo.test sudo[89648]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:42:16 np0005486759.ooo.test podman[89701]: 2025-10-14 08:42:16.190649281 +0000 UTC m=+0.059946659 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, release=1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.9, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr)
Oct 14 08:42:16 np0005486759.ooo.test sudo[89738]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuj_tso20/privsep.sock
Oct 14 08:42:16 np0005486759.ooo.test sudo[89738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:16 np0005486759.ooo.test podman[89701]: 2025-10-14 08:42:16.417236716 +0000 UTC m=+0.286534064 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, container_name=metrics_qdr)
Oct 14 08:42:16 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:42:16 np0005486759.ooo.test sudo[89738]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:17 np0005486759.ooo.test sudo[89749]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgdfis9ia/privsep.sock
Oct 14 08:42:17 np0005486759.ooo.test sudo[89749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:17 np0005486759.ooo.test sudo[89749]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:18 np0005486759.ooo.test sudo[89760]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnrvsa9vm/privsep.sock
Oct 14 08:42:18 np0005486759.ooo.test sudo[89760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:18 np0005486759.ooo.test sudo[89760]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:18 np0005486759.ooo.test sudo[89771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpthp4zatw/privsep.sock
Oct 14 08:42:18 np0005486759.ooo.test sudo[89771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:19 np0005486759.ooo.test sudo[89771]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:19 np0005486759.ooo.test sudo[89790]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llyepgrhkdrihjtsujliovpmeytnkuvr ; /usr/bin/python3
Oct 14 08:42:19 np0005486759.ooo.test sudo[89790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 08:42:19 np0005486759.ooo.test sudo[89798]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplu6cexle/privsep.sock
Oct 14 08:42:19 np0005486759.ooo.test sudo[89798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:19 np0005486759.ooo.test python3[89793]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 08:42:20 np0005486759.ooo.test sudo[89798]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:20 np0005486759.ooo.test sudo[89811]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_c440akd/privsep.sock
Oct 14 08:42:20 np0005486759.ooo.test sudo[89811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:21 np0005486759.ooo.test sudo[89811]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:21 np0005486759.ooo.test sudo[89825]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoc73xtam/privsep.sock
Oct 14 08:42:21 np0005486759.ooo.test sudo[89825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:22 np0005486759.ooo.test sudo[89825]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:22 np0005486759.ooo.test sudo[89840]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx6wdcwhd/privsep.sock
Oct 14 08:42:22 np0005486759.ooo.test sudo[89840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:22 np0005486759.ooo.test sudo[89840]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:23 np0005486759.ooo.test rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 08:42:23 np0005486759.ooo.test sudo[89970]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz7lglfyr/privsep.sock
Oct 14 08:42:23 np0005486759.ooo.test sudo[89970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:23 np0005486759.ooo.test rhsm-service[6469]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Oct 14 08:42:23 np0005486759.ooo.test sudo[89970]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:23 np0005486759.ooo.test sudo[89987]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2k6l2uol/privsep.sock
Oct 14 08:42:23 np0005486759.ooo.test sudo[89987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:24 np0005486759.ooo.test sudo[89987]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:24 np0005486759.ooo.test sudo[90000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6fnr6izs/privsep.sock
Oct 14 08:42:24 np0005486759.ooo.test sudo[90000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:25 np0005486759.ooo.test sudo[90000]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:25 np0005486759.ooo.test sudo[90011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuv4boa2o/privsep.sock
Oct 14 08:42:25 np0005486759.ooo.test sudo[90011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:26 np0005486759.ooo.test sudo[90011]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:42:26 np0005486759.ooo.test podman[90018]: 2025-10-14 08:42:26.433130934 +0000 UTC m=+0.072249701 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, build-date=2025-07-21T14:45:33, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, version=17.1.9, release=1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: tmp-crun.eL05I4.mount: Deactivated successfully.
Oct 14 08:42:26 np0005486759.ooo.test podman[90018]: 2025-10-14 08:42:26.474230179 +0000 UTC m=+0.113348926 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:42:26 np0005486759.ooo.test podman[90018]: unhealthy
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:42:26 np0005486759.ooo.test podman[90017]: 2025-10-14 08:42:26.474507157 +0000 UTC m=+0.116670048 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-type=git, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Oct 14 08:42:26 np0005486759.ooo.test podman[90043]: 2025-10-14 08:42:26.53104024 +0000 UTC m=+0.099845416 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.33.12, version=17.1.9)
Oct 14 08:42:26 np0005486759.ooo.test sudo[90077]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2g999qv3/privsep.sock
Oct 14 08:42:26 np0005486759.ooo.test podman[90017]: 2025-10-14 08:42:26.558192702 +0000 UTC m=+0.200355573 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, release=1, 
vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.openshift.expose-services=)
Oct 14 08:42:26 np0005486759.ooo.test sudo[90077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:26 np0005486759.ooo.test podman[90017]: unhealthy
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:42:26 np0005486759.ooo.test podman[90043]: 2025-10-14 08:42:26.573774854 +0000 UTC m=+0.142580050 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:42:26 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:42:27 np0005486759.ooo.test sudo[90077]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:27 np0005486759.ooo.test sudo[90155]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpstw6oggf/privsep.sock
Oct 14 08:42:27 np0005486759.ooo.test sudo[90155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:27 np0005486759.ooo.test systemd[1]: tmp-crun.b48q1W.mount: Deactivated successfully.
Oct 14 08:42:27 np0005486759.ooo.test sudo[89790]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:27 np0005486759.ooo.test sudo[90155]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:28 np0005486759.ooo.test sudo[90166]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa5pq4hed/privsep.sock
Oct 14 08:42:28 np0005486759.ooo.test sudo[90166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:28 np0005486759.ooo.test sudo[90166]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:29 np0005486759.ooo.test sudo[90177]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwjuswhss/privsep.sock
Oct 14 08:42:29 np0005486759.ooo.test sudo[90177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:29 np0005486759.ooo.test sudo[90177]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:29 np0005486759.ooo.test sudo[90188]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8yf8ojrk/privsep.sock
Oct 14 08:42:29 np0005486759.ooo.test sudo[90188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:30 np0005486759.ooo.test sudo[90188]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:30 np0005486759.ooo.test sudo[90199]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl59627se/privsep.sock
Oct 14 08:42:30 np0005486759.ooo.test sudo[90199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:31 np0005486759.ooo.test sudo[90199]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:31 np0005486759.ooo.test sudo[90210]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvw5j1rjo/privsep.sock
Oct 14 08:42:31 np0005486759.ooo.test sudo[90210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:32 np0005486759.ooo.test sudo[90210]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:42:32 np0005486759.ooo.test recover_tripleo_nova_virtqemud[90231]: 47951
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:42:32 np0005486759.ooo.test podman[90218]: 2025-10-14 08:42:32.249571007 +0000 UTC m=+0.057155012 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Oct 14 08:42:32 np0005486759.ooo.test podman[90218]: 2025-10-14 08:42:32.30641849 +0000 UTC m=+0.114002475 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, vcs-type=git, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: tmp-crun.QOYp3b.mount: Deactivated successfully.
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:42:32 np0005486759.ooo.test podman[90219]: 2025-10-14 08:42:32.316479351 +0000 UTC m=+0.118666959 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, release=1, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:42:32 np0005486759.ooo.test podman[90219]: 2025-10-14 08:42:32.335272574 +0000 UTC m=+0.137460202 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, release=1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9)
Oct 14 08:42:32 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:42:32 np0005486759.ooo.test sudo[90275]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnhh2gz6f/privsep.sock
Oct 14 08:42:32 np0005486759.ooo.test sudo[90275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:32 np0005486759.ooo.test sudo[90275]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:33 np0005486759.ooo.test sudo[90287]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps3e38ko4/privsep.sock
Oct 14 08:42:33 np0005486759.ooo.test sudo[90287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:33 np0005486759.ooo.test sudo[90287]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:34 np0005486759.ooo.test sudo[90298]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp29fk2bbs/privsep.sock
Oct 14 08:42:34 np0005486759.ooo.test sudo[90298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:34 np0005486759.ooo.test sudo[90298]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:34 np0005486759.ooo.test sudo[90309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcy7_mf2a/privsep.sock
Oct 14 08:42:34 np0005486759.ooo.test sudo[90309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:35 np0005486759.ooo.test sudo[90309]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:35 np0005486759.ooo.test sudo[90320]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp562ymjx6/privsep.sock
Oct 14 08:42:35 np0005486759.ooo.test sudo[90320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:36 np0005486759.ooo.test sudo[90320]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:36 np0005486759.ooo.test sudo[90331]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfn73ms3w/privsep.sock
Oct 14 08:42:36 np0005486759.ooo.test sudo[90331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:37 np0005486759.ooo.test sudo[90331]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:37 np0005486759.ooo.test sudo[90344]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprvl720sj/privsep.sock
Oct 14 08:42:37 np0005486759.ooo.test sudo[90344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:38 np0005486759.ooo.test sudo[90344]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:38 np0005486759.ooo.test sudo[90359]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpih2psss_/privsep.sock
Oct 14 08:42:38 np0005486759.ooo.test sudo[90359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:38 np0005486759.ooo.test sudo[90359]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:39 np0005486759.ooo.test sudo[90370]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyshk81dg/privsep.sock
Oct 14 08:42:39 np0005486759.ooo.test sudo[90370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:39 np0005486759.ooo.test sudo[90370]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:40 np0005486759.ooo.test sudo[90381]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxsr4c39z/privsep.sock
Oct 14 08:42:40 np0005486759.ooo.test sudo[90381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:40 np0005486759.ooo.test sudo[90381]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:40 np0005486759.ooo.test sudo[90392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfw65pf7x/privsep.sock
Oct 14 08:42:40 np0005486759.ooo.test sudo[90392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:41 np0005486759.ooo.test sudo[90392]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:41 np0005486759.ooo.test sudo[90403]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc3qpq2bt/privsep.sock
Oct 14 08:42:41 np0005486759.ooo.test sudo[90403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:42 np0005486759.ooo.test sudo[90403]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:42 np0005486759.ooo.test sudo[90414]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp04wx1c21/privsep.sock
Oct 14 08:42:42 np0005486759.ooo.test sudo[90414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:43 np0005486759.ooo.test sudo[90414]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:42:43 np0005486759.ooo.test podman[90425]: 2025-10-14 08:42:43.395875659 +0000 UTC m=+0.069347130 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true)
Oct 14 08:42:43 np0005486759.ooo.test sudo[90451]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg4oms2kq/privsep.sock
Oct 14 08:42:43 np0005486759.ooo.test sudo[90451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:43 np0005486759.ooo.test podman[90425]: 2025-10-14 08:42:43.779293166 +0000 UTC m=+0.452764627 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute)
Oct 14 08:42:43 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:42:44 np0005486759.ooo.test sudo[90451]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:44 np0005486759.ooo.test sudo[90463]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqvzqhx05/privsep.sock
Oct 14 08:42:44 np0005486759.ooo.test sudo[90463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:44 np0005486759.ooo.test sudo[90463]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:45 np0005486759.ooo.test sudo[90474]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprdl3dfg4/privsep.sock
Oct 14 08:42:45 np0005486759.ooo.test sudo[90474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:45 np0005486759.ooo.test sudo[90474]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:42:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:42:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:42:45 np0005486759.ooo.test systemd[1]: tmp-crun.gScQGo.mount: Deactivated successfully.
Oct 14 08:42:45 np0005486759.ooo.test podman[90487]: 2025-10-14 08:42:45.962753775 +0000 UTC m=+0.075086839 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.buildah.version=1.33.12, version=17.1.9, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, architecture=x86_64, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:42:45 np0005486759.ooo.test podman[90481]: 2025-10-14 08:42:45.986302145 +0000 UTC m=+0.106192984 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true)
Oct 14 08:42:45 np0005486759.ooo.test podman[90480]: 2025-10-14 08:42:45.938109881 +0000 UTC m=+0.063291604 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9)
Oct 14 08:42:46 np0005486759.ooo.test podman[90487]: 2025-10-14 08:42:46.018029458 +0000 UTC m=+0.130362552 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T13:04:03, release=2, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, 
distribution-scope=public, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step3, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:42:46 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:42:46 np0005486759.ooo.test podman[90481]: 2025-10-14 08:42:46.033365884 +0000 UTC m=+0.153256803 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:42:46 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:42:46 np0005486759.ooo.test podman[90480]: 2025-10-14 08:42:46.071362871 +0000 UTC m=+0.196544574 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, 
config_id=tripleo_step3, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:42:46 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:42:46 np0005486759.ooo.test sudo[90550]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf1h_umk5/privsep.sock
Oct 14 08:42:46 np0005486759.ooo.test sudo[90550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:46 np0005486759.ooo.test sudo[90550]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:42:46 np0005486759.ooo.test podman[90554]: 2025-10-14 08:42:46.799228516 +0000 UTC m=+0.070941950 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, 
com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1)
Oct 14 08:42:46 np0005486759.ooo.test sudo[90589]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7qzkqj4d/privsep.sock
Oct 14 08:42:46 np0005486759.ooo.test sudo[90589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:46 np0005486759.ooo.test podman[90554]: 2025-10-14 08:42:46.998422361 +0000 UTC m=+0.270135775 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, release=1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container)
Oct 14 08:42:47 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:42:47 np0005486759.ooo.test sudo[90589]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:47 np0005486759.ooo.test sudo[90600]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq9mdl2_t/privsep.sock
Oct 14 08:42:47 np0005486759.ooo.test sudo[90600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:48 np0005486759.ooo.test sudo[90600]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:48 np0005486759.ooo.test sudo[90617]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm4xzvg85/privsep.sock
Oct 14 08:42:48 np0005486759.ooo.test sudo[90617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:49 np0005486759.ooo.test sudo[90617]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:49 np0005486759.ooo.test sudo[90628]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe62y2udc/privsep.sock
Oct 14 08:42:49 np0005486759.ooo.test sudo[90628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:50 np0005486759.ooo.test sudo[90628]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:50 np0005486759.ooo.test sudo[90639]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7hcoisrl/privsep.sock
Oct 14 08:42:50 np0005486759.ooo.test sudo[90639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:50 np0005486759.ooo.test sudo[90639]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:51 np0005486759.ooo.test sudo[90650]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq38ml6o8/privsep.sock
Oct 14 08:42:51 np0005486759.ooo.test sudo[90650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:51 np0005486759.ooo.test sudo[90650]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:52 np0005486759.ooo.test sudo[90661]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcsjn1u6v/privsep.sock
Oct 14 08:42:52 np0005486759.ooo.test sudo[90661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:52 np0005486759.ooo.test sudo[90661]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:52 np0005486759.ooo.test sudo[90672]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpomo97cb3/privsep.sock
Oct 14 08:42:52 np0005486759.ooo.test sudo[90672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:53 np0005486759.ooo.test sudo[90672]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:53 np0005486759.ooo.test sudo[90683]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbibq4w0z/privsep.sock
Oct 14 08:42:53 np0005486759.ooo.test sudo[90683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:54 np0005486759.ooo.test sudo[90683]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:54 np0005486759.ooo.test sudo[90700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp91n0g0ug/privsep.sock
Oct 14 08:42:54 np0005486759.ooo.test sudo[90700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:55 np0005486759.ooo.test sudo[90700]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:55 np0005486759.ooo.test sudo[90711]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvei9ws2j/privsep.sock
Oct 14 08:42:55 np0005486759.ooo.test sudo[90711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:55 np0005486759.ooo.test sudo[90711]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:56 np0005486759.ooo.test sudo[90722]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcaip84bz/privsep.sock
Oct 14 08:42:56 np0005486759.ooo.test sudo[90722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:56 np0005486759.ooo.test sudo[90722]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:42:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:42:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:42:56 np0005486759.ooo.test podman[90728]: 2025-10-14 08:42:56.92238711 +0000 UTC m=+0.058576577 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.33.12, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, version=17.1.9, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20250721.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Oct 14 08:42:56 np0005486759.ooo.test podman[90728]: 2025-10-14 08:42:56.940235303 +0000 UTC m=+0.076424770 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:42:56 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:42:56 np0005486759.ooo.test podman[90729]: 2025-10-14 08:42:56.984618439 +0000 UTC m=+0.120638121 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, release=1)
Oct 14 08:42:56 np0005486759.ooo.test podman[90730]: 2025-10-14 08:42:56.942810223 +0000 UTC m=+0.072186259 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Oct 14 08:42:57 np0005486759.ooo.test podman[90729]: 2025-10-14 08:42:57.018362535 +0000 UTC m=+0.154382137 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, batch=17.1_20250721.1, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, config_id=tripleo_step4, release=1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:42:57 np0005486759.ooo.test podman[90729]: unhealthy
Oct 14 08:42:57 np0005486759.ooo.test podman[90730]: 2025-10-14 08:42:57.026564409 +0000 UTC m=+0.155940425 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:42:57 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:42:57 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:42:57 np0005486759.ooo.test podman[90730]: unhealthy
Oct 14 08:42:57 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:42:57 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:42:57 np0005486759.ooo.test sudo[90792]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcuwe3_in/privsep.sock
Oct 14 08:42:57 np0005486759.ooo.test sudo[90792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:57 np0005486759.ooo.test sudo[90792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:57 np0005486759.ooo.test sudo[90803]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxtdpf_wz/privsep.sock
Oct 14 08:42:57 np0005486759.ooo.test sudo[90803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:58 np0005486759.ooo.test sudo[90803]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:58 np0005486759.ooo.test sudo[90814]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm2tw7bzi/privsep.sock
Oct 14 08:42:58 np0005486759.ooo.test sudo[90814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:42:59 np0005486759.ooo.test sudo[90814]: pam_unix(sudo:session): session closed for user root
Oct 14 08:42:59 np0005486759.ooo.test sudo[90831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd1a6yc9r/privsep.sock
Oct 14 08:42:59 np0005486759.ooo.test sudo[90831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:00 np0005486759.ooo.test sudo[90831]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:00 np0005486759.ooo.test sudo[90842]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaks5zp9o/privsep.sock
Oct 14 08:43:00 np0005486759.ooo.test sudo[90842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:01 np0005486759.ooo.test sudo[90842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:01 np0005486759.ooo.test sudo[90853]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsrrp4bs8/privsep.sock
Oct 14 08:43:01 np0005486759.ooo.test sudo[90853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:01 np0005486759.ooo.test sudo[90853]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:02 np0005486759.ooo.test sudo[90864]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2u4jwqbx/privsep.sock
Oct 14 08:43:02 np0005486759.ooo.test sudo[90864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:43:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:43:02 np0005486759.ooo.test podman[90868]: 2025-10-14 08:43:02.431413293 +0000 UTC m=+0.061479827 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:43:02 np0005486759.ooo.test podman[90866]: 2025-10-14 08:43:02.444172039 +0000 UTC m=+0.073572942 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.9, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Oct 14 08:43:02 np0005486759.ooo.test systemd[1]: tmp-crun.3rsU46.mount: Deactivated successfully.
Oct 14 08:43:02 np0005486759.ooo.test podman[90868]: 2025-10-14 08:43:02.483420805 +0000 UTC m=+0.113487329 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, container_name=ovn_controller, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, version=17.1.9, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, name=rhosp17/openstack-ovn-controller)
Oct 14 08:43:02 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:43:02 np0005486759.ooo.test podman[90866]: 2025-10-14 08:43:02.496138209 +0000 UTC m=+0.125539182 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git)
Oct 14 08:43:02 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:43:02 np0005486759.ooo.test sudo[90864]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:03 np0005486759.ooo.test sudo[90921]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjraxe_gx/privsep.sock
Oct 14 08:43:03 np0005486759.ooo.test sudo[90921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:03 np0005486759.ooo.test sudo[90921]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:04 np0005486759.ooo.test sudo[90932]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyf0qgz_y/privsep.sock
Oct 14 08:43:04 np0005486759.ooo.test sudo[90932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:04 np0005486759.ooo.test sudo[90932]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:04 np0005486759.ooo.test sudo[90948]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv9yvgooo/privsep.sock
Oct 14 08:43:04 np0005486759.ooo.test sudo[90948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:05 np0005486759.ooo.test sudo[90948]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:05 np0005486759.ooo.test sudo[90960]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfr4l0s9b/privsep.sock
Oct 14 08:43:05 np0005486759.ooo.test sudo[90960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:06 np0005486759.ooo.test sudo[90960]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:06 np0005486759.ooo.test sudo[90971]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjnldmbwf/privsep.sock
Oct 14 08:43:06 np0005486759.ooo.test sudo[90971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:07 np0005486759.ooo.test sudo[90971]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:07 np0005486759.ooo.test sudo[90982]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0g5d4cwh/privsep.sock
Oct 14 08:43:07 np0005486759.ooo.test sudo[90982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:07 np0005486759.ooo.test sudo[90982]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:08 np0005486759.ooo.test sudo[90993]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpool6nhp5/privsep.sock
Oct 14 08:43:08 np0005486759.ooo.test sudo[90993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:08 np0005486759.ooo.test sudo[90993]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:09 np0005486759.ooo.test sudo[91004]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc0oez9j9/privsep.sock
Oct 14 08:43:09 np0005486759.ooo.test sudo[91004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:09 np0005486759.ooo.test sudo[91004]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:09 np0005486759.ooo.test sudo[91017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9tz027jt/privsep.sock
Oct 14 08:43:09 np0005486759.ooo.test sudo[91017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:10 np0005486759.ooo.test sudo[91017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:10 np0005486759.ooo.test sudo[91032]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6az2hr4p/privsep.sock
Oct 14 08:43:10 np0005486759.ooo.test sudo[91032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:11 np0005486759.ooo.test sudo[91032]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:11 np0005486759.ooo.test sudo[91043]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbtf9ghyy/privsep.sock
Oct 14 08:43:11 np0005486759.ooo.test sudo[91043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:12 np0005486759.ooo.test sudo[91043]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:12 np0005486759.ooo.test sudo[91054]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe7eqef32/privsep.sock
Oct 14 08:43:12 np0005486759.ooo.test sudo[91054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:13 np0005486759.ooo.test sudo[91054]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:13 np0005486759.ooo.test sudo[91065]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeglnjxmb/privsep.sock
Oct 14 08:43:13 np0005486759.ooo.test sudo[91065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:13 np0005486759.ooo.test sudo[91065]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:43:14 np0005486759.ooo.test podman[91069]: 2025-10-14 08:43:14.014415833 +0000 UTC m=+0.050092054 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:43:14 np0005486759.ooo.test sudo[91099]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf0pwis2f/privsep.sock
Oct 14 08:43:14 np0005486759.ooo.test sudo[91099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:14 np0005486759.ooo.test podman[91069]: 2025-10-14 08:43:14.406300112 +0000 UTC m=+0.441976323 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red 
Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, config_id=tripleo_step4)
Oct 14 08:43:14 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:43:14 np0005486759.ooo.test sudo[91099]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:14 np0005486759.ooo.test python3[91120]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Oct 14 08:43:15 np0005486759.ooo.test sudo[91126]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppmbiemrb/privsep.sock
Oct 14 08:43:15 np0005486759.ooo.test sudo[91126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:15 np0005486759.ooo.test sudo[91126]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:15 np0005486759.ooo.test sudo[91143]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqncc5la5/privsep.sock
Oct 14 08:43:15 np0005486759.ooo.test sudo[91143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:43:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:43:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:43:16 np0005486759.ooo.test systemd[1]: tmp-crun.DFRDIF.mount: Deactivated successfully.
Oct 14 08:43:16 np0005486759.ooo.test podman[91148]: 2025-10-14 08:43:16.45651973 +0000 UTC m=+0.071764106 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, tcib_managed=true, batch=17.1_20250721.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, architecture=x86_64)
Oct 14 08:43:16 np0005486759.ooo.test podman[91148]: 2025-10-14 08:43:16.468220183 +0000 UTC m=+0.083464529 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20250721.1, release=2, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 08:43:16 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:43:16 np0005486759.ooo.test sudo[91143]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:16 np0005486759.ooo.test podman[91146]: 2025-10-14 08:43:16.500024618 +0000 UTC m=+0.117912136 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, architecture=x86_64, release=1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Oct 14 08:43:16 np0005486759.ooo.test podman[91146]: 2025-10-14 08:43:16.512267398 +0000 UTC m=+0.130154926 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:43:16 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:43:16 np0005486759.ooo.test podman[91147]: 2025-10-14 08:43:16.433986601 +0000 UTC m=+0.051948361 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:43:16 np0005486759.ooo.test podman[91147]: 2025-10-14 08:43:16.563109234 +0000 UTC m=+0.181070994 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, version=17.1.9)
Oct 14 08:43:16 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:43:16 np0005486759.ooo.test sudo[91219]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqr_9ajt9/privsep.sock
Oct 14 08:43:16 np0005486759.ooo.test sudo[91219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:17 np0005486759.ooo.test sudo[91219]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:43:17 np0005486759.ooo.test podman[91224]: 2025-10-14 08:43:17.3432943 +0000 UTC m=+0.067951657 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, container_name=metrics_qdr, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Oct 14 08:43:17 np0005486759.ooo.test sudo[91259]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpomj3em7k/privsep.sock
Oct 14 08:43:17 np0005486759.ooo.test sudo[91259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:17 np0005486759.ooo.test podman[91224]: 2025-10-14 08:43:17.50745117 +0000 UTC m=+0.232108517 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:43:17 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:43:18 np0005486759.ooo.test sudo[91259]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:18 np0005486759.ooo.test sudo[91270]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqf8zf9rf/privsep.sock
Oct 14 08:43:18 np0005486759.ooo.test sudo[91270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:18 np0005486759.ooo.test sudo[91270]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:19 np0005486759.ooo.test sudo[91281]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphscvjesh/privsep.sock
Oct 14 08:43:19 np0005486759.ooo.test sudo[91281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:19 np0005486759.ooo.test sudo[91281]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:19 np0005486759.ooo.test sudo[91292]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj_kwxv9d/privsep.sock
Oct 14 08:43:19 np0005486759.ooo.test sudo[91292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:20 np0005486759.ooo.test sudo[91292]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:20 np0005486759.ooo.test sudo[91306]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw11um7y4/privsep.sock
Oct 14 08:43:20 np0005486759.ooo.test sudo[91306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:21 np0005486759.ooo.test sudo[91306]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:21 np0005486759.ooo.test sudo[91320]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0itanu_a/privsep.sock
Oct 14 08:43:21 np0005486759.ooo.test sudo[91320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:22 np0005486759.ooo.test sudo[91320]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:22 np0005486759.ooo.test sudo[91331]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkcgfl0_i/privsep.sock
Oct 14 08:43:22 np0005486759.ooo.test sudo[91331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:23 np0005486759.ooo.test sudo[91331]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:23 np0005486759.ooo.test sudo[91342]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeyppku82/privsep.sock
Oct 14 08:43:23 np0005486759.ooo.test sudo[91342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:23 np0005486759.ooo.test sudo[91342]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:24 np0005486759.ooo.test sudo[91353]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfnenu_4i/privsep.sock
Oct 14 08:43:24 np0005486759.ooo.test sudo[91353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:24 np0005486759.ooo.test sudo[91353]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:25 np0005486759.ooo.test sudo[91364]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9iizsb4a/privsep.sock
Oct 14 08:43:25 np0005486759.ooo.test sudo[91364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:25 np0005486759.ooo.test sudo[91364]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:25 np0005486759.ooo.test sudo[91375]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnzr23ejw/privsep.sock
Oct 14 08:43:25 np0005486759.ooo.test sudo[91375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:26 np0005486759.ooo.test sudo[91375]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:26 np0005486759.ooo.test sudo[91392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu5dokfdk/privsep.sock
Oct 14 08:43:26 np0005486759.ooo.test sudo[91392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:27 np0005486759.ooo.test sudo[91392]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:43:27 np0005486759.ooo.test podman[91405]: 2025-10-14 08:43:27.320471 +0000 UTC m=+0.067485493 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:43:27 np0005486759.ooo.test podman[91405]: 2025-10-14 08:43:27.336709593 +0000 UTC m=+0.083724116 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:43:27 np0005486759.ooo.test podman[91405]: unhealthy
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: tmp-crun.JclBYO.mount: Deactivated successfully.
Oct 14 08:43:27 np0005486759.ooo.test podman[91397]: 2025-10-14 08:43:27.373772832 +0000 UTC m=+0.126620596 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T13:07:52, architecture=x86_64, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:43:27 np0005486759.ooo.test podman[91397]: 2025-10-14 08:43:27.409257692 +0000 UTC m=+0.162105466 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, version=17.1.9, 
container_name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12)
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:43:27 np0005486759.ooo.test podman[91399]: 2025-10-14 08:43:27.42144095 +0000 UTC m=+0.168382361 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4)
Oct 14 08:43:27 np0005486759.ooo.test podman[91399]: 2025-10-14 08:43:27.460288154 +0000 UTC m=+0.207229515 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git)
Oct 14 08:43:27 np0005486759.ooo.test podman[91399]: unhealthy
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:43:27 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:43:27 np0005486759.ooo.test sudo[91456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmtjx7io7/privsep.sock
Oct 14 08:43:27 np0005486759.ooo.test sudo[91456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:28 np0005486759.ooo.test sudo[91456]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:28 np0005486759.ooo.test sudo[91467]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptk2qba0d/privsep.sock
Oct 14 08:43:28 np0005486759.ooo.test sudo[91467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:29 np0005486759.ooo.test sudo[91467]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:29 np0005486759.ooo.test sudo[91478]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4lf84yyt/privsep.sock
Oct 14 08:43:29 np0005486759.ooo.test sudo[91478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:29 np0005486759.ooo.test sudo[91478]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:30 np0005486759.ooo.test sudo[91489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp32k7gw6i/privsep.sock
Oct 14 08:43:30 np0005486759.ooo.test sudo[91489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:30 np0005486759.ooo.test sudo[91489]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:31 np0005486759.ooo.test sudo[91500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiv5zgg6j/privsep.sock
Oct 14 08:43:31 np0005486759.ooo.test sudo[91500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:31 np0005486759.ooo.test sudo[91500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:31 np0005486759.ooo.test sudo[91517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpygd_mc2w/privsep.sock
Oct 14 08:43:31 np0005486759.ooo.test sudo[91517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:32 np0005486759.ooo.test sudo[91517]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:32 np0005486759.ooo.test sudo[91528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmas57l8w/privsep.sock
Oct 14 08:43:32 np0005486759.ooo.test sudo[91528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:43:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:43:32 np0005486759.ooo.test podman[91530]: 2025-10-14 08:43:32.727113669 +0000 UTC m=+0.049326061 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4)
Oct 14 08:43:32 np0005486759.ooo.test podman[91530]: 2025-10-14 08:43:32.780921357 +0000 UTC m=+0.103133689 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:43:32 np0005486759.ooo.test systemd[1]: tmp-crun.qgkyxO.mount: Deactivated successfully.
Oct 14 08:43:32 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:43:32 np0005486759.ooo.test podman[91531]: 2025-10-14 08:43:32.794434176 +0000 UTC m=+0.114717148 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, container_name=ovn_controller, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64)
Oct 14 08:43:32 np0005486759.ooo.test podman[91531]: 2025-10-14 08:43:32.837327065 +0000 UTC m=+0.157610037 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, distribution-scope=public, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, container_name=ovn_controller, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, name=rhosp17/openstack-ovn-controller)
Oct 14 08:43:32 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:43:33 np0005486759.ooo.test sudo[91528]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:33 np0005486759.ooo.test sudo[91587]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnevk1rmo/privsep.sock
Oct 14 08:43:33 np0005486759.ooo.test sudo[91587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:34 np0005486759.ooo.test sudo[91587]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:34 np0005486759.ooo.test sudo[91598]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp445xvlej/privsep.sock
Oct 14 08:43:34 np0005486759.ooo.test sudo[91598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:34 np0005486759.ooo.test sudo[91598]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:35 np0005486759.ooo.test sudo[91609]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkl2i7v3h/privsep.sock
Oct 14 08:43:35 np0005486759.ooo.test sudo[91609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:35 np0005486759.ooo.test sudo[91609]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:36 np0005486759.ooo.test sudo[91620]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpssha2oml/privsep.sock
Oct 14 08:43:36 np0005486759.ooo.test sudo[91620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:36 np0005486759.ooo.test sudo[91620]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:36 np0005486759.ooo.test sudo[91633]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3t3svtyk/privsep.sock
Oct 14 08:43:36 np0005486759.ooo.test sudo[91633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:37 np0005486759.ooo.test sudo[91633]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:37 np0005486759.ooo.test sudo[91648]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq1vi8bwx/privsep.sock
Oct 14 08:43:37 np0005486759.ooo.test sudo[91648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:38 np0005486759.ooo.test sudo[91648]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:38 np0005486759.ooo.test sudo[91659]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8c46x5ay/privsep.sock
Oct 14 08:43:38 np0005486759.ooo.test sudo[91659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:39 np0005486759.ooo.test sudo[91659]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:39 np0005486759.ooo.test sudo[91670]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1wsxivba/privsep.sock
Oct 14 08:43:39 np0005486759.ooo.test sudo[91670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:40 np0005486759.ooo.test sudo[91670]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:40 np0005486759.ooo.test sudo[91681]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps1smpq3i/privsep.sock
Oct 14 08:43:40 np0005486759.ooo.test sudo[91681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:41 np0005486759.ooo.test sudo[91681]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:41 np0005486759.ooo.test sudo[91692]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqj9qls2x/privsep.sock
Oct 14 08:43:41 np0005486759.ooo.test sudo[91692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:41 np0005486759.ooo.test sudo[91692]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:42 np0005486759.ooo.test sudo[91703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa5mewp2v/privsep.sock
Oct 14 08:43:42 np0005486759.ooo.test sudo[91703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:42 np0005486759.ooo.test sudo[91703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:42 np0005486759.ooo.test sudo[91720]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplsvlap2w/privsep.sock
Oct 14 08:43:42 np0005486759.ooo.test sudo[91720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:43 np0005486759.ooo.test sudo[91720]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:43 np0005486759.ooo.test sudo[91731]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuri5l4l6/privsep.sock
Oct 14 08:43:43 np0005486759.ooo.test sudo[91731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:44 np0005486759.ooo.test sudo[91731]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:44 np0005486759.ooo.test sudo[91742]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0wjwgmf7/privsep.sock
Oct 14 08:43:44 np0005486759.ooo.test sudo[91742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:43:44 np0005486759.ooo.test podman[91744]: 2025-10-14 08:43:44.753659909 +0000 UTC m=+0.101537999 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, 
com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:43:45 np0005486759.ooo.test podman[91744]: 2025-10-14 08:43:45.094534186 +0000 UTC m=+0.442412366 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, release=1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc.)
Oct 14 08:43:45 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:43:45 np0005486759.ooo.test sudo[91742]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:45 np0005486759.ooo.test sudo[91777]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5st8fv7h/privsep.sock
Oct 14 08:43:45 np0005486759.ooo.test sudo[91777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:46 np0005486759.ooo.test sudo[91777]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:46 np0005486759.ooo.test sudo[91788]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm4ndrus7/privsep.sock
Oct 14 08:43:46 np0005486759.ooo.test sudo[91788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:46 np0005486759.ooo.test sudo[91788]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:43:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:43:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:43:47 np0005486759.ooo.test podman[91794]: 2025-10-14 08:43:47.039825081 +0000 UTC m=+0.065793741 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, 
maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, config_id=tripleo_step3)
Oct 14 08:43:47 np0005486759.ooo.test podman[91794]: 2025-10-14 08:43:47.047247612 +0000 UTC m=+0.073216252 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid)
Oct 14 08:43:47 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:43:47 np0005486759.ooo.test systemd[1]: tmp-crun.ewYzhM.mount: Deactivated successfully.
Oct 14 08:43:47 np0005486759.ooo.test podman[91796]: 2025-10-14 08:43:47.093758693 +0000 UTC m=+0.115952585 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.33.12, release=2, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd)
Oct 14 08:43:47 np0005486759.ooo.test podman[91796]: 2025-10-14 08:43:47.100714719 +0000 UTC m=+0.122908621 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, 
com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, release=2, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:43:47 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:43:47 np0005486759.ooo.test podman[91795]: 2025-10-14 08:43:47.139029436 +0000 UTC m=+0.161746164 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.9, container_name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:43:47 np0005486759.ooo.test podman[91795]: 2025-10-14 08:43:47.16334194 +0000 UTC m=+0.186058668 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9)
Oct 14 08:43:47 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:43:47 np0005486759.ooo.test sudo[91862]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpntrusrna/privsep.sock
Oct 14 08:43:47 np0005486759.ooo.test sudo[91862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:43:47 np0005486759.ooo.test sudo[91862]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:47 np0005486759.ooo.test podman[91869]: 2025-10-14 08:43:47.94394131 +0000 UTC m=+0.074030717 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, release=1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:43:48 np0005486759.ooo.test sudo[91906]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppb2tcju4/privsep.sock
Oct 14 08:43:48 np0005486759.ooo.test sudo[91906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:48 np0005486759.ooo.test podman[91869]: 2025-10-14 08:43:48.136342034 +0000 UTC m=+0.266431471 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.33.12, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, container_name=metrics_qdr, distribution-scope=public, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:43:48 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:43:48 np0005486759.ooo.test sudo[91906]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:48 np0005486759.ooo.test sudo[91917]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppmahmpbl/privsep.sock
Oct 14 08:43:48 np0005486759.ooo.test sudo[91917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:49 np0005486759.ooo.test sudo[91917]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:49 np0005486759.ooo.test sudo[91928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd7g9p0k_/privsep.sock
Oct 14 08:43:49 np0005486759.ooo.test sudo[91928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:50 np0005486759.ooo.test sudo[91928]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:50 np0005486759.ooo.test sudo[91939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr4gx3lri/privsep.sock
Oct 14 08:43:50 np0005486759.ooo.test sudo[91939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:51 np0005486759.ooo.test sudo[91939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:51 np0005486759.ooo.test sudo[91950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprkjtnf_3/privsep.sock
Oct 14 08:43:51 np0005486759.ooo.test sudo[91950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:52 np0005486759.ooo.test sudo[91950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:52 np0005486759.ooo.test sudo[91961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6jfppo8c/privsep.sock
Oct 14 08:43:52 np0005486759.ooo.test sudo[91961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:52 np0005486759.ooo.test sudo[91961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:53 np0005486759.ooo.test sudo[91975]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcie7t854/privsep.sock
Oct 14 08:43:53 np0005486759.ooo.test sudo[91975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:53 np0005486759.ooo.test sudo[91975]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:53 np0005486759.ooo.test sudo[91989]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf97y3h1j/privsep.sock
Oct 14 08:43:53 np0005486759.ooo.test sudo[91989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:54 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:43:54 np0005486759.ooo.test recover_tripleo_nova_virtqemud[91993]: 47951
Oct 14 08:43:54 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:43:54 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:43:54 np0005486759.ooo.test sudo[91989]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:54 np0005486759.ooo.test sudo[92002]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw7g3tsue/privsep.sock
Oct 14 08:43:54 np0005486759.ooo.test sudo[92002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:55 np0005486759.ooo.test sudo[92002]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:55 np0005486759.ooo.test sudo[92013]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6u_48b2h/privsep.sock
Oct 14 08:43:55 np0005486759.ooo.test sudo[92013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:56 np0005486759.ooo.test sudo[92013]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:56 np0005486759.ooo.test sudo[92024]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1_c0ym8q/privsep.sock
Oct 14 08:43:56 np0005486759.ooo.test sudo[92024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:57 np0005486759.ooo.test sudo[92024]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:43:57 np0005486759.ooo.test podman[92030]: 2025-10-14 08:43:57.416910667 +0000 UTC m=+0.048560936 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, architecture=x86_64, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, 
maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:43:57 np0005486759.ooo.test podman[92030]: 2025-10-14 08:43:57.428491396 +0000 UTC m=+0.060141675 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:43:57 np0005486759.ooo.test podman[92030]: unhealthy
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:43:57 np0005486759.ooo.test podman[92049]: 2025-10-14 08:43:57.535185623 +0000 UTC m=+0.079802304 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, build-date=2025-07-21T13:07:52, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1)
Oct 14 08:43:57 np0005486759.ooo.test podman[92049]: 2025-10-14 08:43:57.541237742 +0000 UTC m=+0.085854443 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, 
name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:43:57 np0005486759.ooo.test sudo[92083]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjnrw_k9a/privsep.sock
Oct 14 08:43:57 np0005486759.ooo.test sudo[92083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: tmp-crun.MKHl78.mount: Deactivated successfully.
Oct 14 08:43:57 np0005486759.ooo.test podman[92067]: 2025-10-14 08:43:57.599142946 +0000 UTC m=+0.058555426 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:43:57 np0005486759.ooo.test podman[92067]: 2025-10-14 08:43:57.615088211 +0000 UTC m=+0.074500681 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1)
Oct 14 08:43:57 np0005486759.ooo.test podman[92067]: unhealthy
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:43:57 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:43:58 np0005486759.ooo.test sudo[92083]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:58 np0005486759.ooo.test sudo[92101]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5tid3vvb/privsep.sock
Oct 14 08:43:58 np0005486759.ooo.test sudo[92101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:58 np0005486759.ooo.test sudo[92101]: pam_unix(sudo:session): session closed for user root
Oct 14 08:43:59 np0005486759.ooo.test sudo[92118]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp790rw4aw/privsep.sock
Oct 14 08:43:59 np0005486759.ooo.test sudo[92118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:43:59 np0005486759.ooo.test sudo[92118]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:00 np0005486759.ooo.test sudo[92129]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuepceivv/privsep.sock
Oct 14 08:44:00 np0005486759.ooo.test sudo[92129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:00 np0005486759.ooo.test sudo[92129]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:00 np0005486759.ooo.test sudo[92140]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg8imtrbu/privsep.sock
Oct 14 08:44:00 np0005486759.ooo.test sudo[92140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:01 np0005486759.ooo.test sudo[92140]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:01 np0005486759.ooo.test sudo[92151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfckvj4i4/privsep.sock
Oct 14 08:44:01 np0005486759.ooo.test sudo[92151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:02 np0005486759.ooo.test sudo[92151]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:02 np0005486759.ooo.test sudo[92162]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbwf7y7ns/privsep.sock
Oct 14 08:44:02 np0005486759.ooo.test sudo[92162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:03 np0005486759.ooo.test sudo[92162]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:44:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:44:03 np0005486759.ooo.test systemd[1]: tmp-crun.JjQ2TD.mount: Deactivated successfully.
Oct 14 08:44:03 np0005486759.ooo.test podman[92167]: 2025-10-14 08:44:03.36011252 +0000 UTC m=+0.192684754 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, container_name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, vcs-type=git, config_id=tripleo_step4, batch=17.1_20250721.1, 
vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1)
Oct 14 08:44:03 np0005486759.ooo.test podman[92167]: 2025-10-14 08:44:03.386316842 +0000 UTC m=+0.218889106 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, release=1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:44:03 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:44:03 np0005486759.ooo.test sudo[92209]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2jt861x7/privsep.sock
Oct 14 08:44:03 np0005486759.ooo.test sudo[92209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:03 np0005486759.ooo.test podman[92169]: 2025-10-14 08:44:03.461239115 +0000 UTC m=+0.286983738 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, release=1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:44:03 np0005486759.ooo.test podman[92169]: 2025-10-14 08:44:03.482226155 +0000 UTC m=+0.307970768 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:44:03 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:44:04 np0005486759.ooo.test sudo[92209]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:04 np0005486759.ooo.test sudo[92238]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp95a7peqz/privsep.sock
Oct 14 08:44:04 np0005486759.ooo.test sudo[92238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:04 np0005486759.ooo.test sudo[92238]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:05 np0005486759.ooo.test sudo[92249]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxuhq8jj_/privsep.sock
Oct 14 08:44:05 np0005486759.ooo.test sudo[92249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:05 np0005486759.ooo.test sudo[92249]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:05 np0005486759.ooo.test sudo[92260]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2qhnntey/privsep.sock
Oct 14 08:44:05 np0005486759.ooo.test sudo[92260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:06 np0005486759.ooo.test sudo[92260]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:06 np0005486759.ooo.test sudo[92271]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjwozbzwj/privsep.sock
Oct 14 08:44:06 np0005486759.ooo.test sudo[92271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:07 np0005486759.ooo.test sudo[92271]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:07 np0005486759.ooo.test sudo[92282]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf9_aiwv0/privsep.sock
Oct 14 08:44:07 np0005486759.ooo.test sudo[92282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:08 np0005486759.ooo.test sudo[92282]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:08 np0005486759.ooo.test sudo[92293]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps5r8r4_r/privsep.sock
Oct 14 08:44:08 np0005486759.ooo.test sudo[92293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:09 np0005486759.ooo.test sudo[92293]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:09 np0005486759.ooo.test sudo[92306]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpylo7v11b/privsep.sock
Oct 14 08:44:09 np0005486759.ooo.test sudo[92306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:09 np0005486759.ooo.test sudo[92306]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:10 np0005486759.ooo.test sudo[92321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0nd3soih/privsep.sock
Oct 14 08:44:10 np0005486759.ooo.test sudo[92321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:10 np0005486759.ooo.test sudo[92321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:10 np0005486759.ooo.test sudo[92332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpstjdzqmp/privsep.sock
Oct 14 08:44:10 np0005486759.ooo.test sudo[92332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:11 np0005486759.ooo.test sudo[92332]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:11 np0005486759.ooo.test sudo[92343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg899nbof/privsep.sock
Oct 14 08:44:11 np0005486759.ooo.test sudo[92343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:12 np0005486759.ooo.test sudo[92343]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:12 np0005486759.ooo.test sudo[92354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8nzp9vl8/privsep.sock
Oct 14 08:44:12 np0005486759.ooo.test sudo[92354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:13 np0005486759.ooo.test sudo[92354]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:13 np0005486759.ooo.test sudo[92365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnow38xpe/privsep.sock
Oct 14 08:44:13 np0005486759.ooo.test sudo[92365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:13 np0005486759.ooo.test sudo[92365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:14 np0005486759.ooo.test sudo[92376]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpscjbejk4/privsep.sock
Oct 14 08:44:14 np0005486759.ooo.test sudo[92376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:14 np0005486759.ooo.test sudo[92376]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:15 np0005486759.ooo.test sshd[88282]: Received disconnect from 38.102.83.114 port 51402:11: disconnected by user
Oct 14 08:44:15 np0005486759.ooo.test sshd[88282]: Disconnected from user zuul 38.102.83.114 port 51402
Oct 14 08:44:15 np0005486759.ooo.test sshd[88279]: pam_unix(sshd:session): session closed for user zuul
Oct 14 08:44:15 np0005486759.ooo.test systemd[1]: session-17.scope: Deactivated successfully.
Oct 14 08:44:15 np0005486759.ooo.test systemd[1]: session-17.scope: Consumed 13.579s CPU time.
Oct 14 08:44:15 np0005486759.ooo.test systemd-logind[759]: Session 17 logged out. Waiting for processes to exit.
Oct 14 08:44:15 np0005486759.ooo.test systemd-logind[759]: Removed session 17.
Oct 14 08:44:15 np0005486759.ooo.test sudo[92394]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr6gca2x3/privsep.sock
Oct 14 08:44:15 np0005486759.ooo.test sudo[92394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:44:15 np0005486759.ooo.test systemd[1]: tmp-crun.aeyZrd.mount: Deactivated successfully.
Oct 14 08:44:15 np0005486759.ooo.test podman[92397]: 2025-10-14 08:44:15.452387818 +0000 UTC m=+0.079440224 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.9, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:44:15 np0005486759.ooo.test sudo[92394]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:15 np0005486759.ooo.test podman[92397]: 2025-10-14 08:44:15.788177347 +0000 UTC m=+0.415229743 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.9, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:44:15 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:44:15 np0005486759.ooo.test sudo[92428]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmput9ycgs_/privsep.sock
Oct 14 08:44:15 np0005486759.ooo.test sudo[92428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:16 np0005486759.ooo.test sudo[92428]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:16 np0005486759.ooo.test sudo[92439]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphdx0hrda/privsep.sock
Oct 14 08:44:16 np0005486759.ooo.test sudo[92439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:44:17 np0005486759.ooo.test sudo[92439]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:44:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:44:17 np0005486759.ooo.test podman[92443]: 2025-10-14 08:44:17.437075765 +0000 UTC m=+0.053742627 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, container_name=iscsid, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 08:44:17 np0005486759.ooo.test podman[92447]: 2025-10-14 08:44:17.496722453 +0000 UTC m=+0.109615799 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-collectd, release=2, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:44:17 np0005486759.ooo.test podman[92447]: 2025-10-14 08:44:17.504474224 +0000 UTC m=+0.117367560 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, build-date=2025-07-21T13:04:03, vcs-type=git, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red 
Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Oct 14 08:44:17 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:44:17 np0005486759.ooo.test podman[92443]: 2025-10-14 08:44:17.520452249 +0000 UTC m=+0.137119081 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:44:17 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:44:17 np0005486759.ooo.test systemd[1]: tmp-crun.4qG2zC.mount: Deactivated successfully.
Oct 14 08:44:17 np0005486759.ooo.test podman[92444]: 2025-10-14 08:44:17.609069056 +0000 UTC m=+0.226267575 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:44:17 np0005486759.ooo.test podman[92444]: 2025-10-14 08:44:17.621805432 +0000 UTC m=+0.239003941 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step5, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, container_name=nova_compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:44:17 np0005486759.ooo.test sudo[92510]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyacks1ae/privsep.sock
Oct 14 08:44:17 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:44:17 np0005486759.ooo.test sudo[92510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:18 np0005486759.ooo.test sudo[92510]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:44:18 np0005486759.ooo.test podman[92518]: 2025-10-14 08:44:18.299878222 +0000 UTC m=+0.062051524 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Oct 14 08:44:18 np0005486759.ooo.test podman[92518]: 2025-10-14 08:44:18.505435594 +0000 UTC m=+0.267608946 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., container_name=metrics_qdr, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:44:18 np0005486759.ooo.test sudo[92553]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9bs8tw3h/privsep.sock
Oct 14 08:44:18 np0005486759.ooo.test sudo[92553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:18 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:44:19 np0005486759.ooo.test sudo[92553]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:19 np0005486759.ooo.test sudo[92565]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_nw0_ay5/privsep.sock
Oct 14 08:44:19 np0005486759.ooo.test sudo[92565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:19 np0005486759.ooo.test sudo[92565]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:20 np0005486759.ooo.test sudo[92579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9qieyn06/privsep.sock
Oct 14 08:44:20 np0005486759.ooo.test sudo[92579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:20 np0005486759.ooo.test sudo[92579]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:21 np0005486759.ooo.test sudo[92593]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzqij72x2/privsep.sock
Oct 14 08:44:21 np0005486759.ooo.test sudo[92593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:21 np0005486759.ooo.test sudo[92593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:21 np0005486759.ooo.test sudo[92604]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4l4zbboa/privsep.sock
Oct 14 08:44:21 np0005486759.ooo.test sudo[92604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:22 np0005486759.ooo.test sudo[92604]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:22 np0005486759.ooo.test sudo[92615]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ijjqc6q/privsep.sock
Oct 14 08:44:22 np0005486759.ooo.test sudo[92615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:23 np0005486759.ooo.test sudo[92615]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:23 np0005486759.ooo.test sudo[92626]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpogwbyqu_/privsep.sock
Oct 14 08:44:23 np0005486759.ooo.test sudo[92626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:24 np0005486759.ooo.test sudo[92626]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:24 np0005486759.ooo.test sudo[92637]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm4uj99_d/privsep.sock
Oct 14 08:44:24 np0005486759.ooo.test sudo[92637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:25 np0005486759.ooo.test sudo[92637]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:25 np0005486759.ooo.test sudo[92648]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_afuboyk/privsep.sock
Oct 14 08:44:25 np0005486759.ooo.test sudo[92648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:25 np0005486759.ooo.test sudo[92648]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:26 np0005486759.ooo.test sudo[92665]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkvj6gu3o/privsep.sock
Oct 14 08:44:26 np0005486759.ooo.test sudo[92665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:26 np0005486759.ooo.test sudo[92665]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:27 np0005486759.ooo.test sudo[92676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt75uca49/privsep.sock
Oct 14 08:44:27 np0005486759.ooo.test sudo[92676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:27 np0005486759.ooo.test sudo[92676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: tmp-crun.TZz7EE.mount: Deactivated successfully.
Oct 14 08:44:27 np0005486759.ooo.test podman[92682]: 2025-10-14 08:44:27.696306097 +0000 UTC m=+0.073402083 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, name=rhosp17/openstack-cron, summary=Red Hat OpenStack 
Platform 17.1 cron, release=1, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52)
Oct 14 08:44:27 np0005486759.ooo.test podman[92682]: 2025-10-14 08:44:27.725459502 +0000 UTC m=+0.102555488 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, config_id=tripleo_step4, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: tmp-crun.R3Ln7t.mount: Deactivated successfully.
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:44:27 np0005486759.ooo.test podman[92683]: 2025-10-14 08:44:27.729132387 +0000 UTC m=+0.105091908 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9)
Oct 14 08:44:27 np0005486759.ooo.test podman[92683]: 2025-10-14 08:44:27.809428222 +0000 UTC m=+0.185387793 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=1, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 08:44:27 np0005486759.ooo.test podman[92683]: unhealthy
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:44:27 np0005486759.ooo.test sudo[92735]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx_0ysact/privsep.sock
Oct 14 08:44:27 np0005486759.ooo.test sudo[92735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:27 np0005486759.ooo.test podman[92706]: 2025-10-14 08:44:27.866392213 +0000 UTC m=+0.166218148 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-07-21T14:45:33, vcs-type=git, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute)
Oct 14 08:44:27 np0005486759.ooo.test podman[92706]: 2025-10-14 08:44:27.876498987 +0000 UTC m=+0.176324862 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1, vcs-type=git)
Oct 14 08:44:27 np0005486759.ooo.test podman[92706]: unhealthy
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:44:27 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:44:28 np0005486759.ooo.test sudo[92735]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:28 np0005486759.ooo.test sudo[92753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7ult6ljw/privsep.sock
Oct 14 08:44:28 np0005486759.ooo.test sudo[92753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:29 np0005486759.ooo.test sudo[92753]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:29 np0005486759.ooo.test sudo[92764]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa6ir0wlx/privsep.sock
Oct 14 08:44:29 np0005486759.ooo.test sudo[92764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:29 np0005486759.ooo.test sudo[92764]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:30 np0005486759.ooo.test sudo[92775]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_42rzmjk/privsep.sock
Oct 14 08:44:30 np0005486759.ooo.test sudo[92775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:30 np0005486759.ooo.test sudo[92775]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:31 np0005486759.ooo.test sudo[92792]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpda086k2k/privsep.sock
Oct 14 08:44:31 np0005486759.ooo.test sudo[92792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:31 np0005486759.ooo.test sudo[92792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:31 np0005486759.ooo.test sudo[92803]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqp9jzq75/privsep.sock
Oct 14 08:44:31 np0005486759.ooo.test sudo[92803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:32 np0005486759.ooo.test sudo[92803]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:32 np0005486759.ooo.test sudo[92814]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvb2sfook/privsep.sock
Oct 14 08:44:32 np0005486759.ooo.test sudo[92814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:33 np0005486759.ooo.test sudo[92814]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:33 np0005486759.ooo.test sudo[92825]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv4vhwb4k/privsep.sock
Oct 14 08:44:33 np0005486759.ooo.test sudo[92825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:44:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:44:33 np0005486759.ooo.test systemd[1]: tmp-crun.MijPOT.mount: Deactivated successfully.
Oct 14 08:44:33 np0005486759.ooo.test podman[92827]: 2025-10-14 08:44:33.564150722 +0000 UTC m=+0.058011264 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, release=1, container_name=ovn_metadata_agent, vcs-type=git, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Oct 14 08:44:33 np0005486759.ooo.test podman[92827]: 2025-10-14 08:44:33.594244867 +0000 UTC m=+0.088105399 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, architecture=x86_64, build-date=2025-07-21T16:28:53)
Oct 14 08:44:33 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:44:33 np0005486759.ooo.test podman[92828]: 2025-10-14 08:44:33.665889954 +0000 UTC m=+0.157026251 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, batch=17.1_20250721.1, version=17.1.9, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container)
Oct 14 08:44:33 np0005486759.ooo.test podman[92828]: 2025-10-14 08:44:33.711386178 +0000 UTC m=+0.202522495 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, version=17.1.9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller)
Oct 14 08:44:33 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:44:34 np0005486759.ooo.test sudo[92825]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:34 np0005486759.ooo.test sudo[92882]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpttw4ebef/privsep.sock
Oct 14 08:44:34 np0005486759.ooo.test sudo[92882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:34 np0005486759.ooo.test systemd[1]: tmp-crun.KxcDdU.mount: Deactivated successfully.
Oct 14 08:44:34 np0005486759.ooo.test sudo[92882]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:35 np0005486759.ooo.test sudo[92893]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3z021fn6/privsep.sock
Oct 14 08:44:35 np0005486759.ooo.test sudo[92893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:35 np0005486759.ooo.test sudo[92893]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:36 np0005486759.ooo.test sudo[92904]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3z9la682/privsep.sock
Oct 14 08:44:36 np0005486759.ooo.test sudo[92904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:36 np0005486759.ooo.test sudo[92904]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:36 np0005486759.ooo.test sudo[92921]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjlwy9ia4/privsep.sock
Oct 14 08:44:36 np0005486759.ooo.test sudo[92921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:37 np0005486759.ooo.test sudo[92921]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:37 np0005486759.ooo.test sudo[92932]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpshexhoor/privsep.sock
Oct 14 08:44:37 np0005486759.ooo.test sudo[92932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:38 np0005486759.ooo.test sudo[92932]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:38 np0005486759.ooo.test sudo[92943]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpilth8i_l/privsep.sock
Oct 14 08:44:38 np0005486759.ooo.test sudo[92943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:39 np0005486759.ooo.test sudo[92943]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:39 np0005486759.ooo.test sudo[92954]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpae0ylwjr/privsep.sock
Oct 14 08:44:39 np0005486759.ooo.test sudo[92954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:40 np0005486759.ooo.test sudo[92954]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:40 np0005486759.ooo.test sudo[92965]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8g7swygl/privsep.sock
Oct 14 08:44:40 np0005486759.ooo.test sudo[92965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:41 np0005486759.ooo.test sudo[92965]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:41 np0005486759.ooo.test sudo[92976]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiscjha4y/privsep.sock
Oct 14 08:44:41 np0005486759.ooo.test sudo[92976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:41 np0005486759.ooo.test sudo[92976]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:42 np0005486759.ooo.test sudo[92993]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkr_ov0kj/privsep.sock
Oct 14 08:44:42 np0005486759.ooo.test sudo[92993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:42 np0005486759.ooo.test sudo[92993]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:42 np0005486759.ooo.test sudo[93004]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpacw_eza7/privsep.sock
Oct 14 08:44:42 np0005486759.ooo.test sudo[93004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:43 np0005486759.ooo.test sudo[93004]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:43 np0005486759.ooo.test sudo[93015]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5h4hw3lt/privsep.sock
Oct 14 08:44:43 np0005486759.ooo.test sudo[93015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:44 np0005486759.ooo.test sudo[93015]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:44 np0005486759.ooo.test sudo[93026]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdk_c4ip9/privsep.sock
Oct 14 08:44:44 np0005486759.ooo.test sudo[93026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:45 np0005486759.ooo.test sudo[93026]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:45 np0005486759.ooo.test sudo[93037]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp139ed2wd/privsep.sock
Oct 14 08:44:45 np0005486759.ooo.test sudo[93037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:45 np0005486759.ooo.test sudo[93037]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:44:46 np0005486759.ooo.test podman[93042]: 2025-10-14 08:44:46.035822567 +0000 UTC m=+0.054751803 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, distribution-scope=public, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:44:46 np0005486759.ooo.test sudo[93072]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxideauq2/privsep.sock
Oct 14 08:44:46 np0005486759.ooo.test sudo[93072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:46 np0005486759.ooo.test podman[93042]: 2025-10-14 08:44:46.38212566 +0000 UTC m=+0.401054886 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:44:46 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:44:46 np0005486759.ooo.test sudo[93072]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:47 np0005486759.ooo.test sudo[93086]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8tw8f8sz/privsep.sock
Oct 14 08:44:47 np0005486759.ooo.test sudo[93086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:47 np0005486759.ooo.test sudo[93086]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:44:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:44:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:44:47 np0005486759.ooo.test podman[93095]: 2025-10-14 08:44:47.758267621 +0000 UTC m=+0.076131857 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, container_name=iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team)
Oct 14 08:44:47 np0005486759.ooo.test podman[93102]: 2025-10-14 08:44:47.811992561 +0000 UTC m=+0.122101776 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64)
Oct 14 08:44:47 np0005486759.ooo.test sudo[93151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbhpnkqhk/privsep.sock
Oct 14 08:44:47 np0005486759.ooo.test sudo[93151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:47 np0005486759.ooo.test podman[93102]: 2025-10-14 08:44:47.892931547 +0000 UTC m=+0.203040752 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 08:44:47 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:44:47 np0005486759.ooo.test podman[93095]: 2025-10-14 08:44:47.944111958 +0000 UTC m=+0.261976214 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible)
Oct 14 08:44:47 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:44:47 np0005486759.ooo.test podman[93096]: 2025-10-14 08:44:47.907467259 +0000 UTC m=+0.219124812 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_id=tripleo_step5, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-nova-compute-container, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:44:47 np0005486759.ooo.test podman[93096]: 2025-10-14 08:44:47.994390791 +0000 UTC m=+0.306048274 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, config_id=tripleo_step5, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:44:48 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:44:48 np0005486759.ooo.test sudo[93151]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:48 np0005486759.ooo.test sudo[93178]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb5a4y5_d/privsep.sock
Oct 14 08:44:48 np0005486759.ooo.test sudo[93178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:44:48 np0005486759.ooo.test podman[93180]: 2025-10-14 08:44:48.822674814 +0000 UTC m=+0.053198765 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, version=17.1.9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, config_id=tripleo_step1, io.buildah.version=1.33.12, container_name=metrics_qdr)
Oct 14 08:44:48 np0005486759.ooo.test podman[93180]: 2025-10-14 08:44:48.976262567 +0000 UTC m=+0.206786538 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, 
config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd)
Oct 14 08:44:48 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:44:49 np0005486759.ooo.test sudo[93178]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:49 np0005486759.ooo.test sudo[93218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwu62hlfj/privsep.sock
Oct 14 08:44:49 np0005486759.ooo.test sudo[93218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:50 np0005486759.ooo.test sudo[93218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:50 np0005486759.ooo.test sudo[93229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjsdb7_tw/privsep.sock
Oct 14 08:44:50 np0005486759.ooo.test sudo[93229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:51 np0005486759.ooo.test sudo[93229]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:51 np0005486759.ooo.test sudo[93240]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb5yzaamw/privsep.sock
Oct 14 08:44:51 np0005486759.ooo.test sudo[93240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:51 np0005486759.ooo.test sudo[93240]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:52 np0005486759.ooo.test sudo[93251]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpef22s4dr/privsep.sock
Oct 14 08:44:52 np0005486759.ooo.test sudo[93251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:52 np0005486759.ooo.test sudo[93251]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:53 np0005486759.ooo.test sudo[93268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4cty80w1/privsep.sock
Oct 14 08:44:53 np0005486759.ooo.test sudo[93268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:53 np0005486759.ooo.test sudo[93268]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:53 np0005486759.ooo.test sudo[93279]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwaye4e6x/privsep.sock
Oct 14 08:44:53 np0005486759.ooo.test sudo[93279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:54 np0005486759.ooo.test sudo[93279]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:54 np0005486759.ooo.test sudo[93290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkh2uihhy/privsep.sock
Oct 14 08:44:54 np0005486759.ooo.test sudo[93290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:55 np0005486759.ooo.test sudo[93290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:55 np0005486759.ooo.test sudo[93301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5dteovz3/privsep.sock
Oct 14 08:44:55 np0005486759.ooo.test sudo[93301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:56 np0005486759.ooo.test sudo[93301]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:56 np0005486759.ooo.test sudo[93312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0do7klzw/privsep.sock
Oct 14 08:44:56 np0005486759.ooo.test sudo[93312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:57 np0005486759.ooo.test sudo[93312]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:57 np0005486759.ooo.test sudo[93323]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw7_51lt1/privsep.sock
Oct 14 08:44:57 np0005486759.ooo.test sudo[93323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:57 np0005486759.ooo.test sudo[93323]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:44:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: tmp-crun.6HiecO.mount: Deactivated successfully.
Oct 14 08:44:58 np0005486759.ooo.test podman[93336]: 2025-10-14 08:44:58.069114478 +0000 UTC m=+0.067136057 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, 
distribution-scope=public, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: tmp-crun.wztLaz.mount: Deactivated successfully.
Oct 14 08:44:58 np0005486759.ooo.test podman[93335]: 2025-10-14 08:44:58.12611862 +0000 UTC m=+0.126391480 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, 
distribution-scope=public, version=17.1.9, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, tcib_managed=true)
Oct 14 08:44:58 np0005486759.ooo.test podman[93335]: 2025-10-14 08:44:58.159282311 +0000 UTC m=+0.159555171 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1)
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:44:58 np0005486759.ooo.test podman[93336]: 2025-10-14 08:44:58.207239831 +0000 UTC m=+0.205261450 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, architecture=x86_64, release=1, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, 
name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:44:58 np0005486759.ooo.test podman[93336]: unhealthy
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:44:58 np0005486759.ooo.test podman[93340]: 2025-10-14 08:44:58.235794308 +0000 UTC m=+0.228502172 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, tcib_managed=true)
Oct 14 08:44:58 np0005486759.ooo.test sudo[93392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuriy7hwy/privsep.sock
Oct 14 08:44:58 np0005486759.ooo.test sudo[93392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:58 np0005486759.ooo.test podman[93340]: 2025-10-14 08:44:58.253281483 +0000 UTC m=+0.245989287 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64)
Oct 14 08:44:58 np0005486759.ooo.test podman[93340]: unhealthy
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:44:58 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:44:58 np0005486759.ooo.test sudo[93392]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:59 np0005486759.ooo.test sudo[93408]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3u8r6o_m/privsep.sock
Oct 14 08:44:59 np0005486759.ooo.test sudo[93408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:44:59 np0005486759.ooo.test sudo[93408]: pam_unix(sudo:session): session closed for user root
Oct 14 08:44:59 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:44:59 np0005486759.ooo.test recover_tripleo_nova_virtqemud[93415]: 47951
Oct 14 08:44:59 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:44:59 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:44:59 np0005486759.ooo.test sudo[93421]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp976_ew5p/privsep.sock
Oct 14 08:44:59 np0005486759.ooo.test sudo[93421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:00 np0005486759.ooo.test sudo[93421]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:00 np0005486759.ooo.test sudo[93432]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjl743orb/privsep.sock
Oct 14 08:45:00 np0005486759.ooo.test sudo[93432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:01 np0005486759.ooo.test sudo[93432]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:01 np0005486759.ooo.test sudo[93447]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7n86_z_l/privsep.sock
Oct 14 08:45:01 np0005486759.ooo.test sudo[93447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:02 np0005486759.ooo.test sudo[93447]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:02 np0005486759.ooo.test sudo[93458]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp48bbz16b/privsep.sock
Oct 14 08:45:02 np0005486759.ooo.test sudo[93458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:03 np0005486759.ooo.test sudo[93458]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:03 np0005486759.ooo.test sudo[93474]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp83u00ww0/privsep.sock
Oct 14 08:45:03 np0005486759.ooo.test sudo[93474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:03 np0005486759.ooo.test sudo[93474]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:45:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:45:03 np0005486759.ooo.test systemd[1]: tmp-crun.SsXzmd.mount: Deactivated successfully.
Oct 14 08:45:03 np0005486759.ooo.test podman[93481]: 2025-10-14 08:45:03.978513555 +0000 UTC m=+0.069406508 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, distribution-scope=public)
Oct 14 08:45:04 np0005486759.ooo.test podman[93481]: 2025-10-14 08:45:04.033419772 +0000 UTC m=+0.124312745 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.9, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:45:04 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:45:04 np0005486759.ooo.test sudo[93519]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0t_hmtq1/privsep.sock
Oct 14 08:45:04 np0005486759.ooo.test sudo[93519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:04 np0005486759.ooo.test podman[93482]: 2025-10-14 08:45:04.208906386 +0000 UTC m=+0.298007124 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:45:04 np0005486759.ooo.test podman[93482]: 2025-10-14 08:45:04.249007092 +0000 UTC m=+0.338107820 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=)
Oct 14 08:45:04 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:45:04 np0005486759.ooo.test sudo[93519]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:04 np0005486759.ooo.test sudo[93541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcxvvyz73/privsep.sock
Oct 14 08:45:04 np0005486759.ooo.test sudo[93541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:05 np0005486759.ooo.test sudo[93541]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:05 np0005486759.ooo.test sudo[93552]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb7qt5n1w/privsep.sock
Oct 14 08:45:05 np0005486759.ooo.test sudo[93552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:06 np0005486759.ooo.test sudo[93552]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:06 np0005486759.ooo.test sudo[93563]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6i60lrmx/privsep.sock
Oct 14 08:45:06 np0005486759.ooo.test sudo[93563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:07 np0005486759.ooo.test sudo[93563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:07 np0005486759.ooo.test sudo[93574]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx77xoe4w/privsep.sock
Oct 14 08:45:07 np0005486759.ooo.test sudo[93574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:08 np0005486759.ooo.test sudo[93574]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:08 np0005486759.ooo.test sudo[93585]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqnessi3o/privsep.sock
Oct 14 08:45:08 np0005486759.ooo.test sudo[93585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:09 np0005486759.ooo.test sudo[93585]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:09 np0005486759.ooo.test sudo[93602]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqrmhcm0k/privsep.sock
Oct 14 08:45:09 np0005486759.ooo.test sudo[93602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:09 np0005486759.ooo.test sudo[93602]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:10 np0005486759.ooo.test sudo[93613]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdccpt4c_/privsep.sock
Oct 14 08:45:10 np0005486759.ooo.test sudo[93613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:10 np0005486759.ooo.test sudo[93613]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:11 np0005486759.ooo.test sudo[93624]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd180jez2/privsep.sock
Oct 14 08:45:11 np0005486759.ooo.test sudo[93624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:11 np0005486759.ooo.test sudo[93624]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:11 np0005486759.ooo.test sudo[93636]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppwuj9af5/privsep.sock
Oct 14 08:45:11 np0005486759.ooo.test sudo[93636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:12 np0005486759.ooo.test sudo[93636]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:12 np0005486759.ooo.test sudo[93647]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzkbehgq7/privsep.sock
Oct 14 08:45:12 np0005486759.ooo.test sudo[93647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:13 np0005486759.ooo.test sudo[93647]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:13 np0005486759.ooo.test sudo[93658]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqakthfxn/privsep.sock
Oct 14 08:45:13 np0005486759.ooo.test sudo[93658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:14 np0005486759.ooo.test sudo[93658]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:14 np0005486759.ooo.test sudo[93675]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz_cqx1ko/privsep.sock
Oct 14 08:45:14 np0005486759.ooo.test sudo[93675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:15 np0005486759.ooo.test sudo[93675]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:15 np0005486759.ooo.test sudo[93686]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5y0k3eqk/privsep.sock
Oct 14 08:45:15 np0005486759.ooo.test sudo[93686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:16 np0005486759.ooo.test sudo[93686]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:16 np0005486759.ooo.test sudo[93697]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8edy1pk9/privsep.sock
Oct 14 08:45:16 np0005486759.ooo.test sudo[93697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:16 np0005486759.ooo.test sudo[93697]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:45:16 np0005486759.ooo.test systemd[1]: tmp-crun.FoN2Sv.mount: Deactivated successfully.
Oct 14 08:45:16 np0005486759.ooo.test podman[93702]: 2025-10-14 08:45:16.962237455 +0000 UTC m=+0.057335792 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-nova-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:45:17 np0005486759.ooo.test sudo[93731]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpefzxn1oj/privsep.sock
Oct 14 08:45:17 np0005486759.ooo.test sudo[93731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:17 np0005486759.ooo.test podman[93702]: 2025-10-14 08:45:17.328611813 +0000 UTC m=+0.423710160 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:45:17 np0005486759.ooo.test sudo[93752]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /usr/share/nova/nova-dist.conf --config-file /etc/nova/nova.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprk3c5j_b/privsep.sock
Oct 14 08:45:17 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 0.
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 0...
Oct 14 08:45:17 np0005486759.ooo.test sudo[93731]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Queued start job for default target Main User Target.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Created slice User Application Slice.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Reached target Paths.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Reached target Timers.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Starting D-Bus User Message Bus Socket...
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Starting Create User's Volatile Files and Directories...
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Listening on D-Bus User Message Bus Socket.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Reached target Sockets.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Finished Create User's Volatile Files and Directories.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Reached target Basic System.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Reached target Main User Target.
Oct 14 08:45:17 np0005486759.ooo.test systemd[93754]: Startup finished in 156ms.
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Started User Manager for UID 0.
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Started Session c11 of User root.
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:45:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:45:17 np0005486759.ooo.test sudo[93752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Oct 14 08:45:17 np0005486759.ooo.test sudo[93790]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkc2amfz1/privsep.sock
Oct 14 08:45:17 np0005486759.ooo.test sudo[93790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:45:18 np0005486759.ooo.test podman[93776]: 2025-10-14 08:45:18.112142735 +0000 UTC m=+0.142328074 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, container_name=iscsid, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, tcib_managed=true, 
distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:45:18 np0005486759.ooo.test systemd[1]: tmp-crun.g3DT0J.mount: Deactivated successfully.
Oct 14 08:45:18 np0005486759.ooo.test podman[93807]: 2025-10-14 08:45:18.159215258 +0000 UTC m=+0.078204842 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=nova_compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:45:18 np0005486759.ooo.test podman[93774]: 2025-10-14 08:45:18.073771382 +0000 UTC m=+0.104418376 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, description=Red Hat 
OpenStack Platform 17.1 collectd, release=2, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, io.buildah.version=1.33.12)
Oct 14 08:45:18 np0005486759.ooo.test podman[93774]: 2025-10-14 08:45:18.209434449 +0000 UTC m=+0.240081393 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, release=2, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:45:18 np0005486759.ooo.test podman[93776]: 2025-10-14 08:45:18.227069257 +0000 UTC m=+0.257254626 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, container_name=iscsid, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1)
Oct 14 08:45:18 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:45:18 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:45:18 np0005486759.ooo.test podman[93807]: 2025-10-14 08:45:18.26640951 +0000 UTC m=+0.185399094 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, container_name=nova_compute, architecture=x86_64, name=rhosp17/openstack-nova-compute)
Oct 14 08:45:18 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:45:18 np0005486759.ooo.test sudo[93790]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:18 np0005486759.ooo.test sudo[93752]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:18 np0005486759.ooo.test sudo[93861]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnfz41n_d/privsep.sock
Oct 14 08:45:18 np0005486759.ooo.test sudo[93861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:45:19 np0005486759.ooo.test sudo[93861]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:19 np0005486759.ooo.test podman[93897]: 2025-10-14 08:45:19.459616305 +0000 UTC m=+0.081277837 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, version=17.1.9, distribution-scope=public, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:45:19 np0005486759.ooo.test sudo[93927]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /usr/share/nova/nova-dist.conf --config-file /etc/nova/nova.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpncrkxkuz/privsep.sock
Oct 14 08:45:19 np0005486759.ooo.test systemd-logind[759]: Existing logind session ID 12 used by new audit session, ignoring.
Oct 14 08:45:19 np0005486759.ooo.test systemd[1]: Started Session c12 of User root.
Oct 14 08:45:19 np0005486759.ooo.test sudo[93927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Oct 14 08:45:19 np0005486759.ooo.test podman[93897]: 2025-10-14 08:45:19.631479507 +0000 UTC m=+0.253141039 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:45:19 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:45:19 np0005486759.ooo.test sudo[93937]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbtkbfcid/privsep.sock
Oct 14 08:45:19 np0005486759.ooo.test sudo[93937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:20 np0005486759.ooo.test sudo[93927]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:20 np0005486759.ooo.test sudo[93937]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:20 np0005486759.ooo.test kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 14 08:45:20 np0005486759.ooo.test kernel: device tapeee08de8-f9 entered promiscuous mode
Oct 14 08:45:20 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760431520.5749] manager: (tapeee08de8-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/13)
Oct 14 08:45:20 np0005486759.ooo.test systemd-udevd[93966]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 08:45:20 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760431520.5912] device (tapeee08de8-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 08:45:20 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760431520.5920] device (tapeee08de8-f9): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Oct 14 08:45:20 np0005486759.ooo.test systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 14 08:45:20 np0005486759.ooo.test systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 14 08:45:20 np0005486759.ooo.test systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 14 08:45:20 np0005486759.ooo.test systemd-machined[93972]: New machine qemu-1-instance-00000001.
Oct 14 08:45:20 np0005486759.ooo.test systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Oct 14 08:45:20 np0005486759.ooo.test sudo[93980]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9sgrgtof/privsep.sock
Oct 14 08:45:20 np0005486759.ooo.test sudo[93980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:20 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760431520.8700] manager: (tap9197abc5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/14)
Oct 14 08:45:20 np0005486759.ooo.test kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9197abc5-01: link becomes ready
Oct 14 08:45:20 np0005486759.ooo.test kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9197abc5-00: link becomes ready
Oct 14 08:45:20 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760431520.9141] device (tap9197abc5-00): carrier: link connected
Oct 14 08:45:21 np0005486759.ooo.test kernel: device tap9197abc5-00 entered promiscuous mode
Oct 14 08:45:21 np0005486759.ooo.test sudo[93980]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:21 np0005486759.ooo.test sudo[94038]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcsav82ht/privsep.sock
Oct 14 08:45:21 np0005486759.ooo.test sudo[94038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:22 np0005486759.ooo.test sudo[94038]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:22 np0005486759.ooo.test systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 14 08:45:22 np0005486759.ooo.test sudo[94055]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf ip netns exec ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49 haproxy -f /var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf
Oct 14 08:45:22 np0005486759.ooo.test sudo[94055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Oct 14 08:45:22 np0005486759.ooo.test sudo[94061]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_c1z5i1p/privsep.sock
Oct 14 08:45:22 np0005486759.ooo.test sudo[94061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:22 np0005486759.ooo.test systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 14 08:45:22 np0005486759.ooo.test podman[94086]: 2025-10-14 08:45:22.663240945 +0000 UTC m=+0.074860898 container create 465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53)
Oct 14 08:45:22 np0005486759.ooo.test systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 14 08:45:22 np0005486759.ooo.test systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 14 08:45:22 np0005486759.ooo.test systemd[1]: Started libpod-conmon-465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777.scope.
Oct 14 08:45:22 np0005486759.ooo.test systemd[1]: tmp-crun.pVvXyS.mount: Deactivated successfully.
Oct 14 08:45:22 np0005486759.ooo.test podman[94086]: 2025-10-14 08:45:22.620776315 +0000 UTC m=+0.032396288 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Oct 14 08:45:22 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 08:45:22 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34874d58ed9ea56da0acf8f63c3cf2d1b1b4035503c3f2994c6596fd2507826e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 08:45:22 np0005486759.ooo.test podman[94086]: 2025-10-14 08:45:22.749687892 +0000 UTC m=+0.161307865 container init 465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, io.openshift.expose-services=, distribution-scope=public, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:45:22 np0005486759.ooo.test podman[94086]: 2025-10-14 08:45:22.755592555 +0000 UTC m=+0.167212528 container start 465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, io.openshift.expose-services=, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., release=1, version=17.1.9, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:45:22 np0005486759.ooo.test sudo[94055]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:23 np0005486759.ooo.test sudo[94061]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 08:45:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 08:45:23 np0005486759.ooo.test sudo[94132]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw3p9fiot/privsep.sock
Oct 14 08:45:23 np0005486759.ooo.test sudo[94132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:23 np0005486759.ooo.test setroubleshoot[94050]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l 395af390-a324-4a43-96e7-e57a24fdcf92
Oct 14 08:45:23 np0005486759.ooo.test setroubleshoot[94050]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.
                                                             
                                                             *****  Plugin qemu_file_image (98.8 confidence) suggests   *******************
                                                             
                                                             If max_map_count is a virtualization target
                                                             Then you need to change the label on max_map_count'
                                                             Do
                                                             # semanage fcontext -a -t virt_image_t 'max_map_count'
                                                             # restorecon -v 'max_map_count'
                                                             
                                                             *****  Plugin catchall (2.13 confidence) suggests   **************************
                                                             
                                                             If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
                                                             Then you should report this as a bug.
                                                             You can generate a local policy module to allow this access.
                                                             Do
                                                             allow this access for now by executing:
                                                             # ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
                                                             # semodule -X 300 -i my-qemukvm.pp
                                                             
Oct 14 08:45:23 np0005486759.ooo.test sudo[94132]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:24 np0005486759.ooo.test sudo[94144]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1b1dko23/privsep.sock
Oct 14 08:45:24 np0005486759.ooo.test sudo[94144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:24 np0005486759.ooo.test sudo[94144]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:25 np0005486759.ooo.test sudo[94161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc0zcdshr/privsep.sock
Oct 14 08:45:25 np0005486759.ooo.test sudo[94161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:25 np0005486759.ooo.test sudo[94161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:25 np0005486759.ooo.test sudo[94172]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwcsh8wyr/privsep.sock
Oct 14 08:45:25 np0005486759.ooo.test sudo[94172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:26 np0005486759.ooo.test sudo[94172]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:26 np0005486759.ooo.test sudo[94183]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjgyrlfc8/privsep.sock
Oct 14 08:45:26 np0005486759.ooo.test sudo[94183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:27 np0005486759.ooo.test sudo[94183]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:27 np0005486759.ooo.test sudo[94194]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplz412eoc/privsep.sock
Oct 14 08:45:27 np0005486759.ooo.test sudo[94194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:28 np0005486759.ooo.test sudo[94194]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: tmp-crun.NU5HPq.mount: Deactivated successfully.
Oct 14 08:45:28 np0005486759.ooo.test podman[94200]: 2025-10-14 08:45:28.350467396 +0000 UTC m=+0.082194096 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, release=1, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, name=rhosp17/openstack-ceilometer-compute)
Oct 14 08:45:28 np0005486759.ooo.test podman[94200]: 2025-10-14 08:45:28.387077253 +0000 UTC m=+0.118803943 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, release=1, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20250721.1, 
container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:45:28 np0005486759.ooo.test podman[94200]: unhealthy
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:45:28 np0005486759.ooo.test podman[94198]: 2025-10-14 08:45:28.401467592 +0000 UTC m=+0.132853351 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20250721.1, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, 
com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:45:28 np0005486759.ooo.test podman[94198]: 2025-10-14 08:45:28.411240005 +0000 UTC m=+0.142625744 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, version=17.1.9, release=1, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-07-21T13:07:52, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container)
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:45:28 np0005486759.ooo.test podman[94224]: 2025-10-14 08:45:28.479407833 +0000 UTC m=+0.128158943 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1)
Oct 14 08:45:28 np0005486759.ooo.test podman[94224]: 2025-10-14 08:45:28.48669004 +0000 UTC m=+0.135441130 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, release=1, io.buildah.version=1.33.12)
Oct 14 08:45:28 np0005486759.ooo.test podman[94224]: unhealthy
Oct 14 08:45:28 np0005486759.ooo.test sudo[94256]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5lpduq6m/privsep.sock
Oct 14 08:45:28 np0005486759.ooo.test sudo[94256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:45:28 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:45:29 np0005486759.ooo.test sudo[94256]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:29 np0005486759.ooo.test sudo[94272]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi5hvws7r/privsep.sock
Oct 14 08:45:29 np0005486759.ooo.test sudo[94272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:29 np0005486759.ooo.test sudo[94272]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:29 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:45:29 np0005486759.ooo.test recover_tripleo_nova_virtqemud[94281]: 47951
Oct 14 08:45:29 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:45:29 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:45:30 np0005486759.ooo.test sudo[94292]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu7uyfjjm/privsep.sock
Oct 14 08:45:30 np0005486759.ooo.test sudo[94292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:30 np0005486759.ooo.test sudo[94292]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:30 np0005486759.ooo.test sudo[94320]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv4z2tum5/privsep.sock
Oct 14 08:45:30 np0005486759.ooo.test sudo[94320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:31 np0005486759.ooo.test sudo[94320]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:31 np0005486759.ooo.test sudo[94331]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf85gc1yc/privsep.sock
Oct 14 08:45:31 np0005486759.ooo.test sudo[94331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:32 np0005486759.ooo.test sudo[94331]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:32 np0005486759.ooo.test sudo[94342]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_wgz0c1f/privsep.sock
Oct 14 08:45:32 np0005486759.ooo.test sudo[94342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:33 np0005486759.ooo.test systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 14 08:45:33 np0005486759.ooo.test sudo[94342]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:33 np0005486759.ooo.test sudo[94354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph3yl3z8u/privsep.sock
Oct 14 08:45:33 np0005486759.ooo.test sudo[94354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:33 np0005486759.ooo.test systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 14 08:45:33 np0005486759.ooo.test sudo[94354]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:34 np0005486759.ooo.test sudo[94365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq8aejqr1/privsep.sock
Oct 14 08:45:34 np0005486759.ooo.test sudo[94365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:45:34 np0005486759.ooo.test systemd[1]: tmp-crun.xIe422.mount: Deactivated successfully.
Oct 14 08:45:34 np0005486759.ooo.test podman[94367]: 2025-10-14 08:45:34.295229172 +0000 UTC m=+0.078708707 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.9, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:45:34 np0005486759.ooo.test podman[94367]: 2025-10-14 08:45:34.336589448 +0000 UTC m=+0.120068963 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Oct 14 08:45:34 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:45:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:45:34 np0005486759.ooo.test podman[94391]: 2025-10-14 08:45:34.422809588 +0000 UTC m=+0.051864034 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:45:34 np0005486759.ooo.test podman[94391]: 2025-10-14 08:45:34.466309129 +0000 UTC m=+0.095363585 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:45:34 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:45:34 np0005486759.ooo.test sudo[94365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:34 np0005486759.ooo.test sudo[94424]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6yenuwkj/privsep.sock
Oct 14 08:45:34 np0005486759.ooo.test sudo[94424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:35 np0005486759.ooo.test systemd[1]: tmp-crun.vM3hPH.mount: Deactivated successfully.
Oct 14 08:45:35 np0005486759.ooo.test sudo[94424]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:35 np0005486759.ooo.test sudo[94441]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbmmvxuc_/privsep.sock
Oct 14 08:45:35 np0005486759.ooo.test sudo[94441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:36 np0005486759.ooo.test sudo[94441]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:36 np0005486759.ooo.test sudo[94452]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpslx2z57n/privsep.sock
Oct 14 08:45:36 np0005486759.ooo.test sudo[94452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:37 np0005486759.ooo.test sudo[94452]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:37 np0005486759.ooo.test sudo[94463]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo1u72vd7/privsep.sock
Oct 14 08:45:37 np0005486759.ooo.test sudo[94463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:38 np0005486759.ooo.test sudo[94463]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:38 np0005486759.ooo.test sudo[94474]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1hapj_7j/privsep.sock
Oct 14 08:45:38 np0005486759.ooo.test sudo[94474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:39 np0005486759.ooo.test sudo[94474]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:39 np0005486759.ooo.test sudo[94485]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy_skofko/privsep.sock
Oct 14 08:45:39 np0005486759.ooo.test sudo[94485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:39 np0005486759.ooo.test sudo[94485]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:40 np0005486759.ooo.test sudo[94496]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpikrkj8tu/privsep.sock
Oct 14 08:45:40 np0005486759.ooo.test sudo[94496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:40 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49670 [14/Oct/2025:08:45:39.558] listener listener/metadata 0/0/0/997/997 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Oct 14 08:45:40 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49680 [14/Oct/2025:08:45:40.646] listener listener/metadata 0/0/0/197/197 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Oct 14 08:45:40 np0005486759.ooo.test sudo[94496]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:41 np0005486759.ooo.test sudo[94513]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqcb441ue/privsep.sock
Oct 14 08:45:41 np0005486759.ooo.test sudo[94513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:41 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49688 [14/Oct/2025:08:45:40.887] listener listener/metadata 0/0/0/248/248 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Oct 14 08:45:41 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49698 [14/Oct/2025:08:45:41.210] listener listener/metadata 0/0/0/195/195 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Oct 14 08:45:41 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49710 [14/Oct/2025:08:45:41.445] listener listener/metadata 0/0/0/209/209 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Oct 14 08:45:41 np0005486759.ooo.test sudo[94513]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:41 np0005486759.ooo.test sudo[94524]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph1t97n9s/privsep.sock
Oct 14 08:45:41 np0005486759.ooo.test sudo[94524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:41 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49714 [14/Oct/2025:08:45:41.727] listener listener/metadata 0/0/0/208/208 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Oct 14 08:45:42 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49716 [14/Oct/2025:08:45:41.987] listener listener/metadata 0/0/0/169/169 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Oct 14 08:45:42 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49724 [14/Oct/2025:08:45:42.199] listener listener/metadata 0/0/0/192/192 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Oct 14 08:45:42 np0005486759.ooo.test sudo[94524]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:42 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49740 [14/Oct/2025:08:45:42.462] listener listener/metadata 0/0/0/200/200 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Oct 14 08:45:42 np0005486759.ooo.test sudo[94535]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd7jnayoy/privsep.sock
Oct 14 08:45:42 np0005486759.ooo.test sudo[94535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:42 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49750 [14/Oct/2025:08:45:42.706] listener listener/metadata 0/0/0/178/178 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Oct 14 08:45:43 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49766 [14/Oct/2025:08:45:42.945] listener listener/metadata 0/0/0/206/206 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Oct 14 08:45:43 np0005486759.ooo.test sudo[94535]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:43 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49778 [14/Oct/2025:08:45:43.208] listener listener/metadata 0/0/0/257/257 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Oct 14 08:45:43 np0005486759.ooo.test sudo[94546]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn8w5cywk/privsep.sock
Oct 14 08:45:43 np0005486759.ooo.test sudo[94546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:43 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49792 [14/Oct/2025:08:45:43.526] listener listener/metadata 0/0/0/197/197 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Oct 14 08:45:43 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49802 [14/Oct/2025:08:45:43.773] listener listener/metadata 0/0/0/175/175 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Oct 14 08:45:44 np0005486759.ooo.test sudo[94546]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:44 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49814 [14/Oct/2025:08:45:44.027] listener listener/metadata 0/0/0/191/191 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Oct 14 08:45:44 np0005486759.ooo.test sudo[94557]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6zrpjc5p/privsep.sock
Oct 14 08:45:44 np0005486759.ooo.test sudo[94557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:44 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[94114]: 192.168.0.173:49828 [14/Oct/2025:08:45:44.288] listener listener/metadata 0/0/0/221/221 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Oct 14 08:45:45 np0005486759.ooo.test sudo[94557]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:45 np0005486759.ooo.test sudo[94568]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5b1v0f77/privsep.sock
Oct 14 08:45:45 np0005486759.ooo.test sudo[94568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:45 np0005486759.ooo.test sudo[94568]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:46 np0005486759.ooo.test sudo[94579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6q50fw7z/privsep.sock
Oct 14 08:45:46 np0005486759.ooo.test sudo[94579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:46 np0005486759.ooo.test sudo[94579]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:46 np0005486759.ooo.test sudo[94596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3kb1_s_n/privsep.sock
Oct 14 08:45:46 np0005486759.ooo.test sudo[94596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:45:47 np0005486759.ooo.test systemd[1]: tmp-crun.YdzkyG.mount: Deactivated successfully.
Oct 14 08:45:47 np0005486759.ooo.test podman[94599]: 2025-10-14 08:45:47.471036243 +0000 UTC m=+0.098226604 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, release=1, io.buildah.version=1.33.12, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=nova_migration_target, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:45:47 np0005486759.ooo.test sudo[94596]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:47 np0005486759.ooo.test sudo[94628]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz5ka2ftt/privsep.sock
Oct 14 08:45:47 np0005486759.ooo.test sudo[94628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:47 np0005486759.ooo.test podman[94599]: 2025-10-14 08:45:47.855372048 +0000 UTC m=+0.482562429 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Oct 14 08:45:47 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:45:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:45:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:45:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:45:48 np0005486759.ooo.test sudo[94628]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:48 np0005486759.ooo.test podman[94632]: 2025-10-14 08:45:48.471189508 +0000 UTC m=+0.094540839 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, version=17.1.9, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, tcib_managed=true, distribution-scope=public, container_name=iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:45:48 np0005486759.ooo.test podman[94633]: 2025-10-14 08:45:48.503820212 +0000 UTC m=+0.126398140 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, vcs-type=git, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:45:48 np0005486759.ooo.test podman[94632]: 2025-10-14 08:45:48.510277943 +0000 UTC m=+0.133629284 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-type=git)
Oct 14 08:45:48 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:45:48 np0005486759.ooo.test podman[94633]: 2025-10-14 08:45:48.549036217 +0000 UTC m=+0.171614165 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, config_id=tripleo_step5, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1)
Oct 14 08:45:48 np0005486759.ooo.test podman[94634]: 2025-10-14 08:45:48.453333653 +0000 UTC m=+0.072953099 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, config_id=tripleo_step3, version=17.1.9, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, tcib_managed=true, vcs-type=git)
Oct 14 08:45:48 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:45:48 np0005486759.ooo.test podman[94634]: 2025-10-14 08:45:48.584933783 +0000 UTC m=+0.204553219 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step3, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=2, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:45:48 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:45:48 np0005486759.ooo.test sudo[94698]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdxj6mdj3/privsep.sock
Oct 14 08:45:48 np0005486759.ooo.test sudo[94698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:49 np0005486759.ooo.test sudo[94698]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:49 np0005486759.ooo.test sudo[94709]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgmmwe04n/privsep.sock
Oct 14 08:45:49 np0005486759.ooo.test sudo[94709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:50 np0005486759.ooo.test sudo[94709]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:45:50 np0005486759.ooo.test podman[94715]: 2025-10-14 08:45:50.162598927 +0000 UTC m=+0.053326138 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, tcib_managed=true, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr)
Oct 14 08:45:50 np0005486759.ooo.test podman[94715]: 2025-10-14 08:45:50.327378029 +0000 UTC m=+0.218105300 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Oct 14 08:45:50 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:45:50 np0005486759.ooo.test sudo[94748]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz6mtbtqj/privsep.sock
Oct 14 08:45:50 np0005486759.ooo.test sudo[94748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:50 np0005486759.ooo.test sudo[94748]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:51 np0005486759.ooo.test sudo[94759]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt_bypcm_/privsep.sock
Oct 14 08:45:51 np0005486759.ooo.test sudo[94759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:51 np0005486759.ooo.test sudo[94759]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:52 np0005486759.ooo.test sudo[94776]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq1z91jqq/privsep.sock
Oct 14 08:45:52 np0005486759.ooo.test sudo[94776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:52 np0005486759.ooo.test sudo[94776]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:52 np0005486759.ooo.test sudo[94787]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptva1he0w/privsep.sock
Oct 14 08:45:52 np0005486759.ooo.test sudo[94787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:53 np0005486759.ooo.test sudo[94787]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:53 np0005486759.ooo.test sudo[94798]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ozqv4qv/privsep.sock
Oct 14 08:45:53 np0005486759.ooo.test sudo[94798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:54 np0005486759.ooo.test sudo[94798]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:54 np0005486759.ooo.test sudo[94809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptu6aqao7/privsep.sock
Oct 14 08:45:54 np0005486759.ooo.test sudo[94809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:55 np0005486759.ooo.test sudo[94809]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:55 np0005486759.ooo.test sudo[94820]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjbn32b27/privsep.sock
Oct 14 08:45:55 np0005486759.ooo.test sudo[94820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:56 np0005486759.ooo.test sudo[94820]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:56 np0005486759.ooo.test sudo[94831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpifx7jlm1/privsep.sock
Oct 14 08:45:56 np0005486759.ooo.test sudo[94831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:57 np0005486759.ooo.test sudo[94831]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:57 np0005486759.ooo.test sudo[94848]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1gf6qxuj/privsep.sock
Oct 14 08:45:57 np0005486759.ooo.test sudo[94848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:57 np0005486759.ooo.test sudo[94848]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:58 np0005486759.ooo.test sudo[94859]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxfp9upx6/privsep.sock
Oct 14 08:45:58 np0005486759.ooo.test sudo[94859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:58 np0005486759.ooo.test sudo[94859]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: tmp-crun.Gxafys.mount: Deactivated successfully.
Oct 14 08:45:58 np0005486759.ooo.test podman[94866]: 2025-10-14 08:45:58.901168877 +0000 UTC m=+0.120392573 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T14:45:33, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 08:45:58 np0005486759.ooo.test podman[94867]: 2025-10-14 08:45:58.93537333 +0000 UTC m=+0.150937153 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, vcs-type=git, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team)
Oct 14 08:45:58 np0005486759.ooo.test podman[94867]: 2025-10-14 08:45:58.947512637 +0000 UTC m=+0.163076450 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, 
com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, release=1)
Oct 14 08:45:58 np0005486759.ooo.test podman[94867]: unhealthy
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:45:58 np0005486759.ooo.test podman[94866]: 2025-10-14 08:45:58.961512242 +0000 UTC m=+0.180735988 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:45:58 np0005486759.ooo.test podman[94866]: unhealthy
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:45:58 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:45:58 np0005486759.ooo.test sudo[94919]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxtxm2zl0/privsep.sock
Oct 14 08:45:58 np0005486759.ooo.test sudo[94919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:45:59 np0005486759.ooo.test podman[94865]: 2025-10-14 08:45:59.03511563 +0000 UTC m=+0.256185814 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true, managed_by=tripleo_ansible)
Oct 14 08:45:59 np0005486759.ooo.test podman[94865]: 2025-10-14 08:45:59.04314699 +0000 UTC m=+0.264217194 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Oct 14 08:45:59 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:45:59 np0005486759.ooo.test sudo[94919]: pam_unix(sudo:session): session closed for user root
Oct 14 08:45:59 np0005486759.ooo.test sudo[94937]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc7nziyxk/privsep.sock
Oct 14 08:45:59 np0005486759.ooo.test sudo[94937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:00 np0005486759.ooo.test sudo[94937]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:00 np0005486759.ooo.test sudo[94948]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplxtn1frp/privsep.sock
Oct 14 08:46:00 np0005486759.ooo.test sudo[94948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:01 np0005486759.ooo.test sudo[94948]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:01 np0005486759.ooo.test sudo[94959]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph3jdg48d/privsep.sock
Oct 14 08:46:01 np0005486759.ooo.test sudo[94959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:02 np0005486759.ooo.test sudo[94959]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:02 np0005486759.ooo.test sudo[94972]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc96t68tu/privsep.sock
Oct 14 08:46:02 np0005486759.ooo.test sudo[94972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:02 np0005486759.ooo.test sudo[94972]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:03 np0005486759.ooo.test sudo[94987]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm8n3unxk/privsep.sock
Oct 14 08:46:03 np0005486759.ooo.test sudo[94987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:03 np0005486759.ooo.test sudo[94987]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:04 np0005486759.ooo.test sudo[95010]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpouk7qxnp/privsep.sock
Oct 14 08:46:04 np0005486759.ooo.test sudo[95010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:46:04 np0005486759.ooo.test systemd[1]: tmp-crun.wEDB4k.mount: Deactivated successfully.
Oct 14 08:46:04 np0005486759.ooo.test podman[95013]: 2025-10-14 08:46:04.464196349 +0000 UTC m=+0.091820646 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, build-date=2025-07-21T16:28:53, distribution-scope=public, tcib_managed=true, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:46:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:46:04 np0005486759.ooo.test podman[95013]: 2025-10-14 08:46:04.528429655 +0000 UTC m=+0.156053952 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, distribution-scope=public, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:46:04 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:46:04 np0005486759.ooo.test systemd[1]: tmp-crun.vVOTSM.mount: Deactivated successfully.
Oct 14 08:46:04 np0005486759.ooo.test podman[95037]: 2025-10-14 08:46:04.620515456 +0000 UTC m=+0.090721190 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:46:04 np0005486759.ooo.test podman[95037]: 2025-10-14 08:46:04.669376505 +0000 UTC m=+0.139582259 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, build-date=2025-07-21T13:28:44, distribution-scope=public, container_name=ovn_controller, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, vcs-type=git)
Oct 14 08:46:04 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:46:04 np0005486759.ooo.test sudo[95010]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:05 np0005486759.ooo.test sudo[95069]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnm0hcacs/privsep.sock
Oct 14 08:46:05 np0005486759.ooo.test sudo[95069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:05 np0005486759.ooo.test sudo[95069]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:05 np0005486759.ooo.test sudo[95080]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiswwl_ui/privsep.sock
Oct 14 08:46:05 np0005486759.ooo.test sudo[95080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:06 np0005486759.ooo.test sudo[95080]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:06 np0005486759.ooo.test sudo[95091]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprl6xqyav/privsep.sock
Oct 14 08:46:06 np0005486759.ooo.test sudo[95091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:07 np0005486759.ooo.test sudo[95091]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:07 np0005486759.ooo.test sudo[95103]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1hl7ich7/privsep.sock
Oct 14 08:46:07 np0005486759.ooo.test sudo[95103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:08 np0005486759.ooo.test sudo[95103]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:08 np0005486759.ooo.test sudo[95119]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx0roejco/privsep.sock
Oct 14 08:46:08 np0005486759.ooo.test sudo[95119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:09 np0005486759.ooo.test sudo[95119]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:09 np0005486759.ooo.test sudo[95130]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9eczavlv/privsep.sock
Oct 14 08:46:09 np0005486759.ooo.test sudo[95130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:10 np0005486759.ooo.test sudo[95130]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:10 np0005486759.ooo.test sudo[95141]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6c5o5tx_/privsep.sock
Oct 14 08:46:10 np0005486759.ooo.test sudo[95141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:10 np0005486759.ooo.test sudo[95141]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:11 np0005486759.ooo.test sudo[95153]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjvqvgeh8/privsep.sock
Oct 14 08:46:11 np0005486759.ooo.test sudo[95153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:11 np0005486759.ooo.test sudo[95153]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:12 np0005486759.ooo.test sudo[95164]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd3wpqy7b/privsep.sock
Oct 14 08:46:12 np0005486759.ooo.test sudo[95164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:12 np0005486759.ooo.test sudo[95164]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:12 np0005486759.ooo.test sudo[95175]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_xdjq5gs/privsep.sock
Oct 14 08:46:12 np0005486759.ooo.test sudo[95175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:13 np0005486759.ooo.test sudo[95175]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:13 np0005486759.ooo.test sudo[95192]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3a4ode3i/privsep.sock
Oct 14 08:46:13 np0005486759.ooo.test sudo[95192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:14 np0005486759.ooo.test sudo[95192]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:14 np0005486759.ooo.test sudo[95203]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp46g1m3cu/privsep.sock
Oct 14 08:46:14 np0005486759.ooo.test sudo[95203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:15 np0005486759.ooo.test sudo[95203]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:15 np0005486759.ooo.test sudo[95214]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp78mko6fh/privsep.sock
Oct 14 08:46:15 np0005486759.ooo.test sudo[95214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:16 np0005486759.ooo.test sudo[95214]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:16 np0005486759.ooo.test sudo[95225]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfuy_g3fc/privsep.sock
Oct 14 08:46:16 np0005486759.ooo.test sudo[95225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:17 np0005486759.ooo.test sudo[95225]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:17 np0005486759.ooo.test sudo[95236]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9y5f3m88/privsep.sock
Oct 14 08:46:17 np0005486759.ooo.test sudo[95236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:18 np0005486759.ooo.test sudo[95236]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:46:18 np0005486759.ooo.test systemd[1]: tmp-crun.kRnH59.mount: Deactivated successfully.
Oct 14 08:46:18 np0005486759.ooo.test podman[95241]: 2025-10-14 08:46:18.166509193 +0000 UTC m=+0.086744837 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Oct 14 08:46:18 np0005486759.ooo.test sudo[95269]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnbpaijt2/privsep.sock
Oct 14 08:46:18 np0005486759.ooo.test sudo[95269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:18 np0005486759.ooo.test podman[95241]: 2025-10-14 08:46:18.505342734 +0000 UTC m=+0.425578358 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Oct 14 08:46:18 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:46:18 np0005486759.ooo.test sudo[95269]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:46:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:46:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:46:19 np0005486759.ooo.test podman[95280]: 2025-10-14 08:46:19.014085816 +0000 UTC m=+0.065960841 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64)
Oct 14 08:46:19 np0005486759.ooo.test podman[95280]: 2025-10-14 08:46:19.047156414 +0000 UTC m=+0.099031439 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, version=17.1.9, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:46:19 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:46:19 np0005486759.ooo.test podman[95284]: 2025-10-14 08:46:19.093650819 +0000 UTC m=+0.140810047 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, summary=Red Hat 
OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, container_name=collectd, vendor=Red Hat, Inc., tcib_managed=true)
Oct 14 08:46:19 np0005486759.ooo.test podman[95284]: 2025-10-14 08:46:19.104163866 +0000 UTC m=+0.151323114 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd, io.buildah.version=1.33.12, config_id=tripleo_step3, distribution-scope=public, release=2, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b)
Oct 14 08:46:19 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:46:19 np0005486759.ooo.test podman[95283]: 2025-10-14 08:46:19.147686528 +0000 UTC m=+0.196538219 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vcs-type=git, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:46:19 np0005486759.ooo.test sudo[95351]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5_t2kqrk/privsep.sock
Oct 14 08:46:19 np0005486759.ooo.test sudo[95351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:19 np0005486759.ooo.test podman[95283]: 2025-10-14 08:46:19.199399106 +0000 UTC m=+0.248250857 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, distribution-scope=public, config_id=tripleo_step5, managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.9)
Oct 14 08:46:19 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:46:19 np0005486759.ooo.test sudo[95351]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:20 np0005486759.ooo.test sudo[95362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_vwrxujp/privsep.sock
Oct 14 08:46:20 np0005486759.ooo.test sudo[95362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:46:20 np0005486759.ooo.test podman[95365]: 2025-10-14 08:46:20.449918342 +0000 UTC m=+0.078377997 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9)
Oct 14 08:46:20 np0005486759.ooo.test podman[95365]: 2025-10-14 08:46:20.630270717 +0000 UTC m=+0.258730382 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59)
Oct 14 08:46:20 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:46:20 np0005486759.ooo.test sudo[95362]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:21 np0005486759.ooo.test sudo[95402]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphz9fs28q/privsep.sock
Oct 14 08:46:21 np0005486759.ooo.test sudo[95402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:21 np0005486759.ooo.test sudo[95402]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:22 np0005486759.ooo.test sudo[95413]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0s7p_4gz/privsep.sock
Oct 14 08:46:22 np0005486759.ooo.test sudo[95413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:22 np0005486759.ooo.test sudo[95413]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:22 np0005486759.ooo.test sudo[95424]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpimhkdfn_/privsep.sock
Oct 14 08:46:22 np0005486759.ooo.test sudo[95424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:23 np0005486759.ooo.test sudo[95424]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:23 np0005486759.ooo.test sudo[95435]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptx3i6mu7/privsep.sock
Oct 14 08:46:23 np0005486759.ooo.test sudo[95435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:24 np0005486759.ooo.test sudo[95435]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:24 np0005486759.ooo.test sudo[95452]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8jkzxkfe/privsep.sock
Oct 14 08:46:24 np0005486759.ooo.test sudo[95452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:25 np0005486759.ooo.test sudo[95452]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:25 np0005486759.ooo.test sudo[95463]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbkyozhuq/privsep.sock
Oct 14 08:46:25 np0005486759.ooo.test sudo[95463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:26 np0005486759.ooo.test sudo[95463]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:26 np0005486759.ooo.test sudo[95474]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp98tds3l_/privsep.sock
Oct 14 08:46:26 np0005486759.ooo.test sudo[95474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:27 np0005486759.ooo.test sudo[95474]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:27 np0005486759.ooo.test sudo[95485]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv1ga765t/privsep.sock
Oct 14 08:46:27 np0005486759.ooo.test sudo[95485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:27 np0005486759.ooo.test sudo[95485]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:28 np0005486759.ooo.test sudo[95496]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8on5e2fg/privsep.sock
Oct 14 08:46:28 np0005486759.ooo.test sudo[95496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:28 np0005486759.ooo.test sudo[95496]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:29 np0005486759.ooo.test sudo[95507]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpinb8vniy/privsep.sock
Oct 14 08:46:29 np0005486759.ooo.test sudo[95507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:46:29 np0005486759.ooo.test podman[95509]: 2025-10-14 08:46:29.150844911 +0000 UTC m=+0.078128249 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, distribution-scope=public, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:46:29 np0005486759.ooo.test podman[95509]: 2025-10-14 08:46:29.191809245 +0000 UTC m=+0.119092643 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1)
Oct 14 08:46:29 np0005486759.ooo.test podman[95509]: unhealthy
Oct 14 08:46:29 np0005486759.ooo.test podman[95510]: 2025-10-14 08:46:29.203266781 +0000 UTC m=+0.128174985 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12)
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:46:29 np0005486759.ooo.test podman[95510]: 2025-10-14 08:46:29.246944928 +0000 UTC m=+0.171853112 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 08:46:29 np0005486759.ooo.test podman[95510]: unhealthy
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:46:29 np0005486759.ooo.test podman[95511]: 2025-10-14 08:46:29.25631543 +0000 UTC m=+0.174231947 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52)
Oct 14 08:46:29 np0005486759.ooo.test podman[95511]: 2025-10-14 08:46:29.339306678 +0000 UTC m=+0.257223125 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1)
Oct 14 08:46:29 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:46:29 np0005486759.ooo.test sudo[95507]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:29 np0005486759.ooo.test sudo[95581]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3gsl75yh/privsep.sock
Oct 14 08:46:29 np0005486759.ooo.test sudo[95581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:30 np0005486759.ooo.test sudo[95581]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:30 np0005486759.ooo.test sudo[95592]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpik3duvf5/privsep.sock
Oct 14 08:46:30 np0005486759.ooo.test sudo[95592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:31 np0005486759.ooo.test sudo[95592]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:31 np0005486759.ooo.test sudo[95603]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps_2lgi6g/privsep.sock
Oct 14 08:46:31 np0005486759.ooo.test sudo[95603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:32 np0005486759.ooo.test sudo[95603]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:32 np0005486759.ooo.test sudo[95614]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpejo8ntbz/privsep.sock
Oct 14 08:46:32 np0005486759.ooo.test sudo[95614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:33 np0005486759.ooo.test sudo[95614]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:33 np0005486759.ooo.test sudo[95625]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxdyiadry/privsep.sock
Oct 14 08:46:33 np0005486759.ooo.test sudo[95625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:33 np0005486759.ooo.test sudo[95625]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:34 np0005486759.ooo.test sudo[95636]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphv9jhw_7/privsep.sock
Oct 14 08:46:34 np0005486759.ooo.test sudo[95636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:34 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:46:34 np0005486759.ooo.test recover_tripleo_nova_virtqemud[95639]: 47951
Oct 14 08:46:34 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:46:34 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:46:34 np0005486759.ooo.test sudo[95636]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:46:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:46:34 np0005486759.ooo.test podman[95644]: 2025-10-14 08:46:34.804687764 +0000 UTC m=+0.064787754 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Oct 14 08:46:34 np0005486759.ooo.test podman[95645]: 2025-10-14 08:46:34.821104725 +0000 UTC m=+0.074650312 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible)
Oct 14 08:46:34 np0005486759.ooo.test podman[95645]: 2025-10-14 08:46:34.861343115 +0000 UTC m=+0.114888762 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:28:44, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team)
Oct 14 08:46:34 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:46:34 np0005486759.ooo.test podman[95644]: 2025-10-14 08:46:34.887662174 +0000 UTC m=+0.147762224 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64)
Oct 14 08:46:34 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:46:34 np0005486759.ooo.test sudo[95700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx9yo7_hk/privsep.sock
Oct 14 08:46:34 np0005486759.ooo.test sudo[95700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:35 np0005486759.ooo.test sudo[95700]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:35 np0005486759.ooo.test sudo[95715]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppizm6433/privsep.sock
Oct 14 08:46:35 np0005486759.ooo.test sudo[95715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:36 np0005486759.ooo.test sudo[95715]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:36 np0005486759.ooo.test sudo[95726]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu8129trh/privsep.sock
Oct 14 08:46:36 np0005486759.ooo.test sudo[95726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:37 np0005486759.ooo.test sudo[95726]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:37 np0005486759.ooo.test sudo[95737]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphsrmgm3k/privsep.sock
Oct 14 08:46:37 np0005486759.ooo.test sudo[95737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:38 np0005486759.ooo.test sudo[95737]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:38 np0005486759.ooo.test sudo[95748]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn6_rpoeg/privsep.sock
Oct 14 08:46:38 np0005486759.ooo.test sudo[95748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:39 np0005486759.ooo.test sudo[95748]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:39 np0005486759.ooo.test sudo[95759]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkdchj5nv/privsep.sock
Oct 14 08:46:39 np0005486759.ooo.test sudo[95759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:39 np0005486759.ooo.test sudo[95759]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:40 np0005486759.ooo.test sudo[95770]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd3mp4kvt/privsep.sock
Oct 14 08:46:40 np0005486759.ooo.test sudo[95770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:40 np0005486759.ooo.test sudo[95770]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:41 np0005486759.ooo.test sudo[95787]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsarvcn8w/privsep.sock
Oct 14 08:46:41 np0005486759.ooo.test sudo[95787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:41 np0005486759.ooo.test sudo[95787]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:42 np0005486759.ooo.test sudo[95798]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvo8i7tcq/privsep.sock
Oct 14 08:46:42 np0005486759.ooo.test sudo[95798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:42 np0005486759.ooo.test sudo[95798]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:42 np0005486759.ooo.test sudo[95809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqdr9ksvr/privsep.sock
Oct 14 08:46:42 np0005486759.ooo.test sudo[95809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:43 np0005486759.ooo.test sudo[95809]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:43 np0005486759.ooo.test sudo[95820]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjgi3a0pl/privsep.sock
Oct 14 08:46:43 np0005486759.ooo.test sudo[95820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:44 np0005486759.ooo.test sudo[95820]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:44 np0005486759.ooo.test sudo[95831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzhrqm30c/privsep.sock
Oct 14 08:46:44 np0005486759.ooo.test sudo[95831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:45 np0005486759.ooo.test sudo[95831]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:45 np0005486759.ooo.test sudo[95842]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpznwi1r7c/privsep.sock
Oct 14 08:46:45 np0005486759.ooo.test sudo[95842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:46 np0005486759.ooo.test sudo[95842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:46 np0005486759.ooo.test sudo[95859]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2i4d0v7j/privsep.sock
Oct 14 08:46:46 np0005486759.ooo.test sudo[95859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:47 np0005486759.ooo.test sudo[95859]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:47 np0005486759.ooo.test sudo[95870]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpms8k_j81/privsep.sock
Oct 14 08:46:47 np0005486759.ooo.test sudo[95870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:47 np0005486759.ooo.test sudo[95870]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:48 np0005486759.ooo.test sudo[95881]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr2gprylk/privsep.sock
Oct 14 08:46:48 np0005486759.ooo.test sudo[95881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:48 np0005486759.ooo.test sudo[95881]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:46:48 np0005486759.ooo.test systemd[1]: tmp-crun.0tgRCn.mount: Deactivated successfully.
Oct 14 08:46:48 np0005486759.ooo.test podman[95886]: 2025-10-14 08:46:48.889846608 +0000 UTC m=+0.092503316 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:46:49 np0005486759.ooo.test sudo[95913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb9xfghd_/privsep.sock
Oct 14 08:46:49 np0005486759.ooo.test sudo[95913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:49 np0005486759.ooo.test podman[95886]: 2025-10-14 08:46:49.251340373 +0000 UTC m=+0.453997371 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20250721.1, release=1, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:46:49 np0005486759.ooo.test podman[95918]: 2025-10-14 08:46:49.354242922 +0000 UTC m=+0.077665126 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, container_name=nova_compute)
Oct 14 08:46:49 np0005486759.ooo.test podman[95918]: 2025-10-14 08:46:49.400316783 +0000 UTC m=+0.123738997 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, tcib_managed=true, distribution-scope=public, architecture=x86_64, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc.)
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:46:49 np0005486759.ooo.test podman[95917]: 2025-10-14 08:46:49.420289454 +0000 UTC m=+0.144370768 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-type=git)
Oct 14 08:46:49 np0005486759.ooo.test podman[95917]: 2025-10-14 08:46:49.45845501 +0000 UTC m=+0.182536314 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:46:49 np0005486759.ooo.test podman[95919]: 2025-10-14 08:46:49.466618394 +0000 UTC m=+0.181509213 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 
collectd, distribution-scope=public, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, container_name=collectd, io.openshift.expose-services=)
Oct 14 08:46:49 np0005486759.ooo.test podman[95919]: 2025-10-14 08:46:49.564563688 +0000 UTC m=+0.279454577 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, vcs-type=git, build-date=2025-07-21T13:04:03, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, batch=17.1_20250721.1, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:46:49 np0005486759.ooo.test sudo[95913]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:49 np0005486759.ooo.test systemd[1]: tmp-crun.PpU3X7.mount: Deactivated successfully.
Oct 14 08:46:49 np0005486759.ooo.test sudo[96007]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyyzy_gvd/privsep.sock
Oct 14 08:46:49 np0005486759.ooo.test sudo[96007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:50 np0005486759.ooo.test sudo[96007]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:50 np0005486759.ooo.test sudo[96018]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6knfh_if/privsep.sock
Oct 14 08:46:50 np0005486759.ooo.test sudo[96018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:46:50 np0005486759.ooo.test systemd[1]: tmp-crun.cQ1ZHg.mount: Deactivated successfully.
Oct 14 08:46:50 np0005486759.ooo.test podman[96020]: 2025-10-14 08:46:50.866064759 +0000 UTC m=+0.082533296 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:46:51 np0005486759.ooo.test podman[96020]: 2025-10-14 08:46:51.06686869 +0000 UTC m=+0.283337287 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, container_name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1, 
name=rhosp17/openstack-qdrouterd, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.expose-services=)
Oct 14 08:46:51 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:46:51 np0005486759.ooo.test sudo[96018]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:51 np0005486759.ooo.test sudo[96063]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf64nnxjf/privsep.sock
Oct 14 08:46:51 np0005486759.ooo.test sudo[96063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:52 np0005486759.ooo.test sudo[96063]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:52 np0005486759.ooo.test sudo[96074]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps4sce0j4/privsep.sock
Oct 14 08:46:52 np0005486759.ooo.test sudo[96074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:53 np0005486759.ooo.test sudo[96074]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:53 np0005486759.ooo.test sudo[96085]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuhlhkemp/privsep.sock
Oct 14 08:46:53 np0005486759.ooo.test sudo[96085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:54 np0005486759.ooo.test sudo[96085]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:54 np0005486759.ooo.test sudo[96096]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9h92ekmv/privsep.sock
Oct 14 08:46:54 np0005486759.ooo.test sudo[96096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:54 np0005486759.ooo.test sudo[96096]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:55 np0005486759.ooo.test sudo[96107]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxbto0dj5/privsep.sock
Oct 14 08:46:55 np0005486759.ooo.test sudo[96107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:55 np0005486759.ooo.test sudo[96107]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:56 np0005486759.ooo.test sudo[96118]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppzjua2z3/privsep.sock
Oct 14 08:46:56 np0005486759.ooo.test sudo[96118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:56 np0005486759.ooo.test sudo[96118]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:57 np0005486759.ooo.test sudo[96135]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2r_2lxsn/privsep.sock
Oct 14 08:46:57 np0005486759.ooo.test sudo[96135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:57 np0005486759.ooo.test sudo[96135]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:58 np0005486759.ooo.test sudo[96146]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppocz_1ma/privsep.sock
Oct 14 08:46:58 np0005486759.ooo.test sudo[96146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:58 np0005486759.ooo.test sudo[96146]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:58 np0005486759.ooo.test sudo[96157]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaipzd0op/privsep.sock
Oct 14 08:46:58 np0005486759.ooo.test sudo[96157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:46:59 np0005486759.ooo.test podman[96160]: 2025-10-14 08:46:59.456185285 +0000 UTC m=+0.081288898 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.buildah.version=1.33.12, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., container_name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1)
Oct 14 08:46:59 np0005486759.ooo.test sudo[96157]: pam_unix(sudo:session): session closed for user root
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: tmp-crun.JGapdV.mount: Deactivated successfully.
Oct 14 08:46:59 np0005486759.ooo.test podman[96162]: 2025-10-14 08:46:59.52103356 +0000 UTC m=+0.137656159 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1)
Oct 14 08:46:59 np0005486759.ooo.test podman[96162]: 2025-10-14 08:46:59.560416744 +0000 UTC m=+0.177039293 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: tmp-crun.8FeYmg.mount: Deactivated successfully.
Oct 14 08:46:59 np0005486759.ooo.test podman[96162]: unhealthy
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:46:59 np0005486759.ooo.test podman[96160]: 2025-10-14 08:46:59.593372578 +0000 UTC m=+0.218476171 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, config_id=tripleo_step4)
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:46:59 np0005486759.ooo.test podman[96161]: 2025-10-14 08:46:59.575928156 +0000 UTC m=+0.195231579 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, 
container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:46:59 np0005486759.ooo.test podman[96161]: 2025-10-14 08:46:59.657361497 +0000 UTC m=+0.276664910 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:46:59 np0005486759.ooo.test podman[96161]: unhealthy
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:46:59 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:46:59 np0005486759.ooo.test sudo[96222]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq5fbhxm3/privsep.sock
Oct 14 08:46:59 np0005486759.ooo.test sudo[96222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:00 np0005486759.ooo.test sudo[96222]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:00 np0005486759.ooo.test sudo[96233]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_a3empbx/privsep.sock
Oct 14 08:47:00 np0005486759.ooo.test sudo[96233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:01 np0005486759.ooo.test sudo[96233]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:01 np0005486759.ooo.test sudo[96244]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplkfempx9/privsep.sock
Oct 14 08:47:01 np0005486759.ooo.test sudo[96244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:02 np0005486759.ooo.test sudo[96244]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:02 np0005486759.ooo.test sudo[96261]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgwj2vfeu/privsep.sock
Oct 14 08:47:02 np0005486759.ooo.test sudo[96261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:03 np0005486759.ooo.test sudo[96261]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:03 np0005486759.ooo.test sudo[96272]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfuxwlvxl/privsep.sock
Oct 14 08:47:03 np0005486759.ooo.test sudo[96272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:04 np0005486759.ooo.test sudo[96272]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:04 np0005486759.ooo.test sudo[96286]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa12zuhyc/privsep.sock
Oct 14 08:47:04 np0005486759.ooo.test sudo[96286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:05 np0005486759.ooo.test sudo[96286]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:47:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:47:05 np0005486759.ooo.test systemd[1]: tmp-crun.i7PJBE.mount: Deactivated successfully.
Oct 14 08:47:05 np0005486759.ooo.test podman[96301]: 2025-10-14 08:47:05.185985479 +0000 UTC m=+0.088966536 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-type=git, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:47:05 np0005486759.ooo.test podman[96301]: 2025-10-14 08:47:05.227464508 +0000 UTC m=+0.130445545 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T16:28:53, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, container_name=ovn_metadata_agent)
Oct 14 08:47:05 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:47:05 np0005486759.ooo.test podman[96302]: 2025-10-14 08:47:05.255646544 +0000 UTC m=+0.155637398 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, tcib_managed=true)
Oct 14 08:47:05 np0005486759.ooo.test podman[96302]: 2025-10-14 08:47:05.281301981 +0000 UTC m=+0.181292835 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Oct 14 08:47:05 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:47:05 np0005486759.ooo.test sudo[96355]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3rc3mld8/privsep.sock
Oct 14 08:47:05 np0005486759.ooo.test sudo[96355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:05 np0005486759.ooo.test sudo[96355]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:06 np0005486759.ooo.test systemd[1]: tmp-crun.izCzug.mount: Deactivated successfully.
Oct 14 08:47:06 np0005486759.ooo.test sudo[96366]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprzkzqa6t/privsep.sock
Oct 14 08:47:06 np0005486759.ooo.test sudo[96366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:06 np0005486759.ooo.test sudo[96366]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:07 np0005486759.ooo.test sudo[96377]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_n_30wuo/privsep.sock
Oct 14 08:47:07 np0005486759.ooo.test sudo[96377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:07 np0005486759.ooo.test sudo[96377]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:08 np0005486759.ooo.test sudo[96394]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph306y3q7/privsep.sock
Oct 14 08:47:08 np0005486759.ooo.test sudo[96394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:08 np0005486759.ooo.test sudo[96394]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:09 np0005486759.ooo.test sudo[96405]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpobvib7uo/privsep.sock
Oct 14 08:47:09 np0005486759.ooo.test sudo[96405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:09 np0005486759.ooo.test sudo[96405]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:09 np0005486759.ooo.test sudo[96416]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps45c7gvq/privsep.sock
Oct 14 08:47:09 np0005486759.ooo.test sudo[96416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:10 np0005486759.ooo.test sudo[96416]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:10 np0005486759.ooo.test sudo[96427]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpisurh16c/privsep.sock
Oct 14 08:47:10 np0005486759.ooo.test sudo[96427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:11 np0005486759.ooo.test sudo[96427]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:11 np0005486759.ooo.test sudo[96439]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnzq26x5y/privsep.sock
Oct 14 08:47:11 np0005486759.ooo.test sudo[96439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:12 np0005486759.ooo.test sudo[96439]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:12 np0005486759.ooo.test sudo[96450]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpes_i0pkh/privsep.sock
Oct 14 08:47:12 np0005486759.ooo.test sudo[96450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:13 np0005486759.ooo.test sudo[96450]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:13 np0005486759.ooo.test sudo[96467]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb6o481ht/privsep.sock
Oct 14 08:47:13 np0005486759.ooo.test sudo[96467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:14 np0005486759.ooo.test sudo[96467]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:14 np0005486759.ooo.test sudo[96478]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9dws8p6c/privsep.sock
Oct 14 08:47:14 np0005486759.ooo.test sudo[96478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:15 np0005486759.ooo.test sudo[96478]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:15 np0005486759.ooo.test sudo[96489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkxd749le/privsep.sock
Oct 14 08:47:15 np0005486759.ooo.test sudo[96489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:15 np0005486759.ooo.test sudo[96489]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:16 np0005486759.ooo.test sudo[96500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_e4wofuk/privsep.sock
Oct 14 08:47:16 np0005486759.ooo.test sudo[96500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:16 np0005486759.ooo.test sudo[96500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:16 np0005486759.ooo.test sudo[96511]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpobs0ecm6/privsep.sock
Oct 14 08:47:16 np0005486759.ooo.test sudo[96511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:17 np0005486759.ooo.test sudo[96511]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:17 np0005486759.ooo.test sudo[96522]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj8ysyf2e/privsep.sock
Oct 14 08:47:17 np0005486759.ooo.test sudo[96522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:18 np0005486759.ooo.test sudo[96522]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:18 np0005486759.ooo.test sudo[96539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplok3x91r/privsep.sock
Oct 14 08:47:18 np0005486759.ooo.test sudo[96539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:19 np0005486759.ooo.test sudo[96539]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:47:19 np0005486759.ooo.test podman[96545]: 2025-10-14 08:47:19.401069159 +0000 UTC m=+0.086627452 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, distribution-scope=public, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_step4)
Oct 14 08:47:19 np0005486759.ooo.test sudo[96572]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpagn6mdjq/privsep.sock
Oct 14 08:47:19 np0005486759.ooo.test sudo[96572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: tmp-crun.gA9W5H.mount: Deactivated successfully.
Oct 14 08:47:19 np0005486759.ooo.test podman[96578]: 2025-10-14 08:47:19.656669494 +0000 UTC m=+0.059249832 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, managed_by=tripleo_ansible, release=2, architecture=x86_64, config_id=tripleo_step3, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:47:19 np0005486759.ooo.test podman[96578]: 2025-10-14 08:47:19.662149374 +0000 UTC m=+0.064729702 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, container_name=collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=2, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:47:19 np0005486759.ooo.test podman[96575]: 2025-10-14 08:47:19.692116065 +0000 UTC m=+0.100023279 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:27:15, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:47:19 np0005486759.ooo.test podman[96545]: 2025-10-14 08:47:19.76532114 +0000 UTC m=+0.450879453 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git)
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:47:19 np0005486759.ooo.test podman[96575]: 2025-10-14 08:47:19.7817095 +0000 UTC m=+0.189616744 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.openshift.expose-services=, release=1, version=17.1.9, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:47:19 np0005486759.ooo.test podman[96576]: 2025-10-14 08:47:19.768220161 +0000 UTC m=+0.173241495 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=nova_compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37)
Oct 14 08:47:19 np0005486759.ooo.test podman[96576]: 2025-10-14 08:47:19.848037811 +0000 UTC m=+0.253059235 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:47:19 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:47:20 np0005486759.ooo.test sudo[96572]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:20 np0005486759.ooo.test sudo[96646]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprkhnrnpq/privsep.sock
Oct 14 08:47:20 np0005486759.ooo.test sudo[96646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:21 np0005486759.ooo.test sudo[96646]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:47:21 np0005486759.ooo.test podman[96650]: 2025-10-14 08:47:21.284236049 +0000 UTC m=+0.054538056 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, container_name=metrics_qdr, tcib_managed=true, version=17.1.9, config_id=tripleo_step1, release=1)
Oct 14 08:47:21 np0005486759.ooo.test podman[96650]: 2025-10-14 08:47:21.442712984 +0000 UTC m=+0.213014991 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, distribution-scope=public, tcib_managed=true, vcs-type=git, version=17.1.9, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible)
Oct 14 08:47:21 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:47:21 np0005486759.ooo.test sudo[96686]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9vazpbaa/privsep.sock
Oct 14 08:47:21 np0005486759.ooo.test sudo[96686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:22 np0005486759.ooo.test sudo[96686]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:22 np0005486759.ooo.test sudo[96697]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnw_cyqic/privsep.sock
Oct 14 08:47:22 np0005486759.ooo.test sudo[96697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:23 np0005486759.ooo.test sudo[96697]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:23 np0005486759.ooo.test sudo[96708]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps4qegty1/privsep.sock
Oct 14 08:47:23 np0005486759.ooo.test sudo[96708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:23 np0005486759.ooo.test sudo[96708]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:24 np0005486759.ooo.test sudo[96725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx84g929t/privsep.sock
Oct 14 08:47:24 np0005486759.ooo.test sudo[96725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:24 np0005486759.ooo.test sudo[96725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:24 np0005486759.ooo.test sudo[96736]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3sx99l6e/privsep.sock
Oct 14 08:47:24 np0005486759.ooo.test sudo[96736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:25 np0005486759.ooo.test sudo[96736]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:25 np0005486759.ooo.test sudo[96747]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxruor7sm/privsep.sock
Oct 14 08:47:25 np0005486759.ooo.test sudo[96747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:26 np0005486759.ooo.test sudo[96747]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:26 np0005486759.ooo.test sudo[96758]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp28bb5k37/privsep.sock
Oct 14 08:47:26 np0005486759.ooo.test sudo[96758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:27 np0005486759.ooo.test sudo[96758]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:27 np0005486759.ooo.test sudo[96769]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjerunuhw/privsep.sock
Oct 14 08:47:27 np0005486759.ooo.test sudo[96769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:28 np0005486759.ooo.test sudo[96769]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:28 np0005486759.ooo.test sudo[96780]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb1iqpvq7/privsep.sock
Oct 14 08:47:28 np0005486759.ooo.test sudo[96780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:29 np0005486759.ooo.test sudo[96780]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:29 np0005486759.ooo.test sudo[96796]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7a11x77p/privsep.sock
Oct 14 08:47:29 np0005486759.ooo.test sudo[96796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:29 np0005486759.ooo.test sudo[96796]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:47:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:47:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:47:29 np0005486759.ooo.test podman[96804]: 2025-10-14 08:47:29.938020203 +0000 UTC m=+0.048521330 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:47:29 np0005486759.ooo.test podman[96804]: 2025-10-14 08:47:29.975411335 +0000 UTC m=+0.085912492 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, release=1, config_id=tripleo_step4, vendor=Red Hat, Inc.)
Oct 14 08:47:29 np0005486759.ooo.test podman[96804]: unhealthy
Oct 14 08:47:29 np0005486759.ooo.test systemd[1]: tmp-crun.8RgI6f.mount: Deactivated successfully.
Oct 14 08:47:29 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:47:29 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:47:29 np0005486759.ooo.test podman[96805]: 2025-10-14 08:47:29.992008841 +0000 UTC m=+0.099138302 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, version=17.1.9, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, release=1)
Oct 14 08:47:30 np0005486759.ooo.test podman[96805]: 2025-10-14 08:47:30.002483477 +0000 UTC m=+0.109612938 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-type=git, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Oct 14 08:47:30 np0005486759.ooo.test podman[96805]: unhealthy
Oct 14 08:47:30 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:47:30 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:47:30 np0005486759.ooo.test podman[96803]: 2025-10-14 08:47:30.049221629 +0000 UTC m=+0.161094838 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-cron, release=1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4)
Oct 14 08:47:30 np0005486759.ooo.test podman[96803]: 2025-10-14 08:47:30.052761529 +0000 UTC m=+0.164634718 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, build-date=2025-07-21T13:07:52, distribution-scope=public, batch=17.1_20250721.1, version=17.1.9, release=1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 
cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible)
Oct 14 08:47:30 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:47:30 np0005486759.ooo.test sudo[96867]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz2o_s1hp/privsep.sock
Oct 14 08:47:30 np0005486759.ooo.test sudo[96867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:30 np0005486759.ooo.test sudo[96867]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:30 np0005486759.ooo.test systemd[1]: tmp-crun.LrG8tK.mount: Deactivated successfully.
Oct 14 08:47:30 np0005486759.ooo.test sudo[96878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptx1veija/privsep.sock
Oct 14 08:47:30 np0005486759.ooo.test sudo[96878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:31 np0005486759.ooo.test sudo[96878]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:31 np0005486759.ooo.test sudo[96889]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5l4zhtxa/privsep.sock
Oct 14 08:47:31 np0005486759.ooo.test sudo[96889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:32 np0005486759.ooo.test sudo[96889]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:32 np0005486759.ooo.test sudo[96900]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1lgbfle9/privsep.sock
Oct 14 08:47:32 np0005486759.ooo.test sudo[96900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:33 np0005486759.ooo.test sudo[96900]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:33 np0005486759.ooo.test sudo[96911]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp26srxdbr/privsep.sock
Oct 14 08:47:33 np0005486759.ooo.test sudo[96911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:34 np0005486759.ooo.test sudo[96911]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:34 np0005486759.ooo.test sudo[96922]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvib67pq3/privsep.sock
Oct 14 08:47:34 np0005486759.ooo.test sudo[96922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:35 np0005486759.ooo.test sudo[96922]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:47:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:47:35 np0005486759.ooo.test sudo[96941]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuur65lx9/privsep.sock
Oct 14 08:47:35 np0005486759.ooo.test sudo[96941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:35 np0005486759.ooo.test systemd[1]: tmp-crun.mvazB9.mount: Deactivated successfully.
Oct 14 08:47:35 np0005486759.ooo.test podman[96940]: 2025-10-14 08:47:35.476140151 +0000 UTC m=+0.090025259 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:47:35 np0005486759.ooo.test podman[96938]: 2025-10-14 08:47:35.511640533 +0000 UTC m=+0.127142412 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:47:35 np0005486759.ooo.test podman[96940]: 2025-10-14 08:47:35.547817288 +0000 UTC m=+0.161702346 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-07-21T13:28:44, container_name=ovn_controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:47:35 np0005486759.ooo.test podman[96938]: 2025-10-14 08:47:35.549657805 +0000 UTC m=+0.165159684 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true)
Oct 14 08:47:35 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:47:35 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:47:36 np0005486759.ooo.test sudo[96941]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:36 np0005486759.ooo.test sudo[96998]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr3kbq_62/privsep.sock
Oct 14 08:47:36 np0005486759.ooo.test sudo[96998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:36 np0005486759.ooo.test sudo[96998]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:37 np0005486759.ooo.test sudo[97009]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqocfmbkf/privsep.sock
Oct 14 08:47:37 np0005486759.ooo.test sudo[97009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:37 np0005486759.ooo.test sudo[97009]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:38 np0005486759.ooo.test sudo[97020]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnn_b3xvq/privsep.sock
Oct 14 08:47:38 np0005486759.ooo.test sudo[97020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:38 np0005486759.ooo.test sudo[97020]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:38 np0005486759.ooo.test sudo[97031]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqreq4b5o/privsep.sock
Oct 14 08:47:38 np0005486759.ooo.test sudo[97031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:39 np0005486759.ooo.test sudo[97031]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:39 np0005486759.ooo.test sudo[97042]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1f7muzzb/privsep.sock
Oct 14 08:47:39 np0005486759.ooo.test sudo[97042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:40 np0005486759.ooo.test sudo[97042]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:40 np0005486759.ooo.test sudo[97059]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcrgboi6w/privsep.sock
Oct 14 08:47:40 np0005486759.ooo.test sudo[97059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:41 np0005486759.ooo.test sudo[97059]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:41 np0005486759.ooo.test sudo[97070]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfjc83cve/privsep.sock
Oct 14 08:47:41 np0005486759.ooo.test sudo[97070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:42 np0005486759.ooo.test sudo[97070]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:42 np0005486759.ooo.test sudo[97081]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpua6rk14x/privsep.sock
Oct 14 08:47:42 np0005486759.ooo.test sudo[97081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:43 np0005486759.ooo.test sudo[97081]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:43 np0005486759.ooo.test sudo[97092]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2msoc9gt/privsep.sock
Oct 14 08:47:43 np0005486759.ooo.test sudo[97092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:43 np0005486759.ooo.test sudo[97092]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:44 np0005486759.ooo.test sudo[97103]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbn_s07lk/privsep.sock
Oct 14 08:47:44 np0005486759.ooo.test sudo[97103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:44 np0005486759.ooo.test sudo[97103]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:45 np0005486759.ooo.test sudo[97114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp87ejiqt0/privsep.sock
Oct 14 08:47:45 np0005486759.ooo.test sudo[97114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:45 np0005486759.ooo.test sudo[97114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:45 np0005486759.ooo.test sudo[97131]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb_csucg5/privsep.sock
Oct 14 08:47:45 np0005486759.ooo.test sudo[97131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:46 np0005486759.ooo.test sudo[97131]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:46 np0005486759.ooo.test sudo[97142]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdsakyfyr/privsep.sock
Oct 14 08:47:46 np0005486759.ooo.test sudo[97142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:47 np0005486759.ooo.test sudo[97142]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:47 np0005486759.ooo.test sudo[97153]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpszfpeka3/privsep.sock
Oct 14 08:47:47 np0005486759.ooo.test sudo[97153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:48 np0005486759.ooo.test sudo[97153]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:48 np0005486759.ooo.test sudo[97164]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_op989xn/privsep.sock
Oct 14 08:47:48 np0005486759.ooo.test sudo[97164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:49 np0005486759.ooo.test sudo[97164]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:49 np0005486759.ooo.test sudo[97175]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplw_z2fhb/privsep.sock
Oct 14 08:47:49 np0005486759.ooo.test sudo[97175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:49 np0005486759.ooo.test sudo[97175]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:47:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:47:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:47:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:47:50 np0005486759.ooo.test systemd[1]: tmp-crun.9FEh0t.mount: Deactivated successfully.
Oct 14 08:47:50 np0005486759.ooo.test podman[97181]: 2025-10-14 08:47:50.056137583 +0000 UTC m=+0.066315982 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, version=17.1.9, 
config_id=tripleo_step3, container_name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container)
Oct 14 08:47:50 np0005486759.ooo.test podman[97181]: 2025-10-14 08:47:50.065225945 +0000 UTC m=+0.075404354 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, release=1, 
com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container)
Oct 14 08:47:50 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:47:50 np0005486759.ooo.test podman[97184]: 2025-10-14 08:47:50.128017897 +0000 UTC m=+0.133616784 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, build-date=2025-07-21T13:04:03, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-type=git, container_name=collectd)
Oct 14 08:47:50 np0005486759.ooo.test podman[97184]: 2025-10-14 08:47:50.166266296 +0000 UTC m=+0.171865183 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, tcib_managed=true, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, com.redhat.component=openstack-collectd-container, vcs-type=git, architecture=x86_64, container_name=collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Oct 14 08:47:50 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:47:50 np0005486759.ooo.test podman[97179]: 2025-10-14 08:47:50.141911989 +0000 UTC m=+0.151887831 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, release=1, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, 
build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64, tcib_managed=true)
Oct 14 08:47:50 np0005486759.ooo.test podman[97183]: 2025-10-14 08:47:50.166595306 +0000 UTC m=+0.171427859 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, distribution-scope=public, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vcs-type=git)
Oct 14 08:47:50 np0005486759.ooo.test sudo[97268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprscu5s3o/privsep.sock
Oct 14 08:47:50 np0005486759.ooo.test sudo[97268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:50 np0005486759.ooo.test podman[97183]: 2025-10-14 08:47:50.247856312 +0000 UTC m=+0.252688925 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:47:50 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:47:50 np0005486759.ooo.test podman[97179]: 2025-10-14 08:47:50.442273784 +0000 UTC m=+0.452249686 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red 
Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target)
Oct 14 08:47:50 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:47:50 np0005486759.ooo.test sudo[97268]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:51 np0005486759.ooo.test systemd[1]: tmp-crun.gAjT97.mount: Deactivated successfully.
Oct 14 08:47:51 np0005486759.ooo.test sudo[97286]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf5h42nng/privsep.sock
Oct 14 08:47:51 np0005486759.ooo.test sudo[97286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:51 np0005486759.ooo.test sudo[97286]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:47:51 np0005486759.ooo.test systemd[1]: tmp-crun.QpIovl.mount: Deactivated successfully.
Oct 14 08:47:51 np0005486759.ooo.test podman[97291]: 2025-10-14 08:47:51.890254138 +0000 UTC m=+0.096615504 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1, version=17.1.9, tcib_managed=true, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Oct 14 08:47:52 np0005486759.ooo.test sudo[97326]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo62ji729/privsep.sock
Oct 14 08:47:52 np0005486759.ooo.test sudo[97326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:52 np0005486759.ooo.test podman[97291]: 2025-10-14 08:47:52.092411611 +0000 UTC m=+0.298772997 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO 
Team, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd)
Oct 14 08:47:52 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:47:52 np0005486759.ooo.test sudo[97326]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:52 np0005486759.ooo.test sudo[97337]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqh480q7x/privsep.sock
Oct 14 08:47:52 np0005486759.ooo.test sudo[97337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:53 np0005486759.ooo.test sudo[97337]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:53 np0005486759.ooo.test sudo[97348]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph3rpoax5/privsep.sock
Oct 14 08:47:53 np0005486759.ooo.test sudo[97348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:54 np0005486759.ooo.test sudo[97348]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:54 np0005486759.ooo.test sudo[97359]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1klz9nw7/privsep.sock
Oct 14 08:47:54 np0005486759.ooo.test sudo[97359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:55 np0005486759.ooo.test sudo[97359]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:55 np0005486759.ooo.test sudo[97370]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph72ebpj7/privsep.sock
Oct 14 08:47:55 np0005486759.ooo.test sudo[97370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:56 np0005486759.ooo.test sudo[97370]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:56 np0005486759.ooo.test sudo[97387]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5uyhdrwh/privsep.sock
Oct 14 08:47:56 np0005486759.ooo.test sudo[97387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:57 np0005486759.ooo.test sudo[97387]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:57 np0005486759.ooo.test sudo[97398]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3xi08iyj/privsep.sock
Oct 14 08:47:57 np0005486759.ooo.test sudo[97398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:58 np0005486759.ooo.test sudo[97398]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:58 np0005486759.ooo.test sudo[97409]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_vzr2t4j/privsep.sock
Oct 14 08:47:58 np0005486759.ooo.test sudo[97409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:58 np0005486759.ooo.test sudo[97409]: pam_unix(sudo:session): session closed for user root
Oct 14 08:47:59 np0005486759.ooo.test sudo[97420]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy4ul096i/privsep.sock
Oct 14 08:47:59 np0005486759.ooo.test sudo[97420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:47:59 np0005486759.ooo.test sudo[97420]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:00 np0005486759.ooo.test sudo[97431]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7cvc1458/privsep.sock
Oct 14 08:48:00 np0005486759.ooo.test sudo[97431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: tmp-crun.FOnmlA.mount: Deactivated successfully.
Oct 14 08:48:00 np0005486759.ooo.test podman[97433]: 2025-10-14 08:48:00.147010942 +0000 UTC m=+0.058881241 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible)
Oct 14 08:48:00 np0005486759.ooo.test podman[97433]: 2025-10-14 08:48:00.180190063 +0000 UTC m=+0.092060342 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, release=1)
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:48:00 np0005486759.ooo.test podman[97434]: 2025-10-14 08:48:00.258754165 +0000 UTC m=+0.167063073 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, release=1, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20250721.1)
Oct 14 08:48:00 np0005486759.ooo.test podman[97435]: 2025-10-14 08:48:00.181110483 +0000 UTC m=+0.084984363 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, vcs-type=git)
Oct 14 08:48:00 np0005486759.ooo.test podman[97434]: 2025-10-14 08:48:00.298341055 +0000 UTC m=+0.206649943 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20250721.1, 
distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:48:00 np0005486759.ooo.test podman[97434]: unhealthy
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:48:00 np0005486759.ooo.test podman[97435]: 2025-10-14 08:48:00.313352513 +0000 UTC m=+0.217226383 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, 
io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:48:00 np0005486759.ooo.test podman[97435]: unhealthy
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:48:00 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:48:00 np0005486759.ooo.test sudo[97431]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:00 np0005486759.ooo.test sudo[97500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt5pka5o7/privsep.sock
Oct 14 08:48:00 np0005486759.ooo.test sudo[97500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:01 np0005486759.ooo.test sudo[97500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:01 np0005486759.ooo.test sudo[97516]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxuzzjbxo/privsep.sock
Oct 14 08:48:01 np0005486759.ooo.test sudo[97516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:02 np0005486759.ooo.test sudo[97516]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:02 np0005486759.ooo.test sudo[97528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvwmivr9l/privsep.sock
Oct 14 08:48:02 np0005486759.ooo.test sudo[97528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:03 np0005486759.ooo.test sudo[97528]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:03 np0005486759.ooo.test sudo[97539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpujc_pxzs/privsep.sock
Oct 14 08:48:03 np0005486759.ooo.test sudo[97539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:04 np0005486759.ooo.test sudo[97539]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:04 np0005486759.ooo.test sudo[97557]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy51a59z6/privsep.sock
Oct 14 08:48:04 np0005486759.ooo.test sudo[97557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:05 np0005486759.ooo.test sudo[97557]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:05 np0005486759.ooo.test sudo[97573]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9tlnsr6w/privsep.sock
Oct 14 08:48:05 np0005486759.ooo.test sudo[97573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:05 np0005486759.ooo.test sudo[97573]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:48:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:48:06 np0005486759.ooo.test systemd[1]: tmp-crun.fLr9HF.mount: Deactivated successfully.
Oct 14 08:48:06 np0005486759.ooo.test podman[97579]: 2025-10-14 08:48:06.134751845 +0000 UTC m=+0.099189335 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack 
osp-17.1, release=1, version=17.1.9, architecture=x86_64, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_id=tripleo_step4)
Oct 14 08:48:06 np0005486759.ooo.test podman[97579]: 2025-10-14 08:48:06.187418332 +0000 UTC m=+0.151855882 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Oct 14 08:48:06 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:48:06 np0005486759.ooo.test podman[97580]: 2025-10-14 08:48:06.283772026 +0000 UTC m=+0.246367508 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, release=1, build-date=2025-07-21T13:28:44, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Oct 14 08:48:06 np0005486759.ooo.test sudo[97623]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkezy8r1i/privsep.sock
Oct 14 08:48:06 np0005486759.ooo.test sudo[97623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:06 np0005486759.ooo.test podman[97580]: 2025-10-14 08:48:06.337558587 +0000 UTC m=+0.300154069 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git)
Oct 14 08:48:06 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:48:06 np0005486759.ooo.test sudo[97623]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:07 np0005486759.ooo.test sudo[97647]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb0csyl9s/privsep.sock
Oct 14 08:48:07 np0005486759.ooo.test sudo[97647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:07 np0005486759.ooo.test sudo[97647]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:08 np0005486759.ooo.test sudo[97659]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp__gwo0s_/privsep.sock
Oct 14 08:48:08 np0005486759.ooo.test sudo[97659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:08 np0005486759.ooo.test sudo[97659]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:09 np0005486759.ooo.test sudo[97670]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_gjeb9m4/privsep.sock
Oct 14 08:48:09 np0005486759.ooo.test sudo[97670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:09 np0005486759.ooo.test sudo[97670]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:10 np0005486759.ooo.test sudo[97681]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgaemfcxw/privsep.sock
Oct 14 08:48:10 np0005486759.ooo.test sudo[97681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:10 np0005486759.ooo.test sudo[97681]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:10 np0005486759.ooo.test sudo[97692]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptxqnupuf/privsep.sock
Oct 14 08:48:10 np0005486759.ooo.test sudo[97692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:11 np0005486759.ooo.test sudo[97692]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:11 np0005486759.ooo.test sudo[97703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsu5t7bdd/privsep.sock
Oct 14 08:48:11 np0005486759.ooo.test sudo[97703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:12 np0005486759.ooo.test sudo[97703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:12 np0005486759.ooo.test sudo[97720]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa27n1cfv/privsep.sock
Oct 14 08:48:12 np0005486759.ooo.test sudo[97720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:13 np0005486759.ooo.test sudo[97720]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:13 np0005486759.ooo.test sudo[97731]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpriyrwre8/privsep.sock
Oct 14 08:48:13 np0005486759.ooo.test sudo[97731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:14 np0005486759.ooo.test sudo[97731]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:14 np0005486759.ooo.test sudo[97742]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_40wunr5/privsep.sock
Oct 14 08:48:14 np0005486759.ooo.test sudo[97742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:14 np0005486759.ooo.test auditd[725]: Audit daemon rotating log files
Oct 14 08:48:15 np0005486759.ooo.test sudo[97742]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:15 np0005486759.ooo.test sudo[97753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkdtrk4gq/privsep.sock
Oct 14 08:48:15 np0005486759.ooo.test sudo[97753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:16 np0005486759.ooo.test sudo[97753]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:16 np0005486759.ooo.test sudo[97764]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsw7qd5xv/privsep.sock
Oct 14 08:48:16 np0005486759.ooo.test sudo[97764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:17 np0005486759.ooo.test sudo[97764]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:17 np0005486759.ooo.test sudo[97775]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeamj6akd/privsep.sock
Oct 14 08:48:17 np0005486759.ooo.test sudo[97775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:18 np0005486759.ooo.test sudo[97775]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:18 np0005486759.ooo.test sudo[97792]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppmjy2x4i/privsep.sock
Oct 14 08:48:18 np0005486759.ooo.test sudo[97792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:18 np0005486759.ooo.test sudo[97792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:19 np0005486759.ooo.test sudo[97803]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxmim3kic/privsep.sock
Oct 14 08:48:19 np0005486759.ooo.test sudo[97803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:19 np0005486759.ooo.test sudo[97803]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:20 np0005486759.ooo.test sudo[97814]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvjx3yq94/privsep.sock
Oct 14 08:48:20 np0005486759.ooo.test sudo[97814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:48:20 np0005486759.ooo.test recover_tripleo_nova_virtqemud[97829]: 47951
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: tmp-crun.VbIL0j.mount: Deactivated successfully.
Oct 14 08:48:20 np0005486759.ooo.test podman[97816]: 2025-10-14 08:48:20.328402078 +0000 UTC m=+0.106951815 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, tcib_managed=true, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, 
com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:48:20 np0005486759.ooo.test podman[97817]: 2025-10-14 08:48:20.363454268 +0000 UTC m=+0.138865307 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Oct 14 08:48:20 np0005486759.ooo.test podman[97817]: 2025-10-14 08:48:20.376289357 +0000 UTC m=+0.151700416 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, architecture=x86_64, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, container_name=collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, 
release=2, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:48:20 np0005486759.ooo.test podman[97843]: 2025-10-14 08:48:20.441522435 +0000 UTC m=+0.106401658 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:48:20 np0005486759.ooo.test podman[97816]: 2025-10-14 08:48:20.464473788 +0000 UTC m=+0.243023505 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, 
description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git)
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:48:20 np0005486759.ooo.test podman[97843]: 2025-10-14 08:48:20.497431082 +0000 UTC m=+0.162310325 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, batch=17.1_20250721.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, container_name=nova_compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute)
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:48:20 np0005486759.ooo.test podman[97880]: 2025-10-14 08:48:20.573102144 +0000 UTC m=+0.082236227 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.33.12, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1)
Oct 14 08:48:20 np0005486759.ooo.test sudo[97814]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:20 np0005486759.ooo.test podman[97880]: 2025-10-14 08:48:20.904425552 +0000 UTC m=+0.413559635 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_migration_target)
Oct 14 08:48:20 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:48:21 np0005486759.ooo.test sudo[97911]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps_3fc7xq/privsep.sock
Oct 14 08:48:21 np0005486759.ooo.test sudo[97911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:21 np0005486759.ooo.test systemd[1]: tmp-crun.6WcYeK.mount: Deactivated successfully.
Oct 14 08:48:21 np0005486759.ooo.test sudo[97911]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:22 np0005486759.ooo.test sudo[97922]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3m6ddqwb/privsep.sock
Oct 14 08:48:22 np0005486759.ooo.test sudo[97922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:48:22 np0005486759.ooo.test podman[97924]: 2025-10-14 08:48:22.214137879 +0000 UTC m=+0.075565840 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, build-date=2025-07-21T13:07:59, tcib_managed=true, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, 
managed_by=tripleo_ansible, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, batch=17.1_20250721.1)
Oct 14 08:48:22 np0005486759.ooo.test podman[97924]: 2025-10-14 08:48:22.436892381 +0000 UTC m=+0.298320312 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, managed_by=tripleo_ansible, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., release=1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:48:22 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:48:22 np0005486759.ooo.test sudo[97922]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:23 np0005486759.ooo.test sudo[97962]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpasal7cgo/privsep.sock
Oct 14 08:48:23 np0005486759.ooo.test sudo[97962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:23 np0005486759.ooo.test sudo[97962]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:23 np0005486759.ooo.test sudo[97979]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk_9bqwvd/privsep.sock
Oct 14 08:48:23 np0005486759.ooo.test sudo[97979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:24 np0005486759.ooo.test sudo[97979]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:24 np0005486759.ooo.test sudo[97990]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfq8aix0e/privsep.sock
Oct 14 08:48:24 np0005486759.ooo.test sudo[97990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:25 np0005486759.ooo.test sudo[97990]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:25 np0005486759.ooo.test sudo[98001]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpicwsyh3h/privsep.sock
Oct 14 08:48:25 np0005486759.ooo.test sudo[98001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:26 np0005486759.ooo.test sudo[98001]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:26 np0005486759.ooo.test sudo[98012]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ui_58pz/privsep.sock
Oct 14 08:48:26 np0005486759.ooo.test sudo[98012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:27 np0005486759.ooo.test sudo[98012]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:27 np0005486759.ooo.test sudo[98023]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf2em61fd/privsep.sock
Oct 14 08:48:27 np0005486759.ooo.test sudo[98023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:28 np0005486759.ooo.test sudo[98023]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:28 np0005486759.ooo.test sudo[98034]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1n46f4pq/privsep.sock
Oct 14 08:48:28 np0005486759.ooo.test sudo[98034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:29 np0005486759.ooo.test sudo[98034]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:29 np0005486759.ooo.test sudo[98051]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzem6dibk/privsep.sock
Oct 14 08:48:29 np0005486759.ooo.test sudo[98051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:30 np0005486759.ooo.test sudo[98051]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:48:30 np0005486759.ooo.test podman[98057]: 2025-10-14 08:48:30.274805367 +0000 UTC m=+0.050076237 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-cron-container)
Oct 14 08:48:30 np0005486759.ooo.test podman[98057]: 2025-10-14 08:48:30.284440896 +0000 UTC m=+0.059711786 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, batch=17.1_20250721.1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:48:30 np0005486759.ooo.test podman[98077]: 2025-10-14 08:48:30.433346844 +0000 UTC m=+0.055913297 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:48:30 np0005486759.ooo.test podman[98077]: 2025-10-14 08:48:30.472583404 +0000 UTC m=+0.095149877 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, io.openshift.expose-services=)
Oct 14 08:48:30 np0005486759.ooo.test sudo[98109]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6z25bgiz/privsep.sock
Oct 14 08:48:30 np0005486759.ooo.test sudo[98109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:30 np0005486759.ooo.test podman[98077]: unhealthy
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:48:30 np0005486759.ooo.test podman[98076]: 2025-10-14 08:48:30.479799289 +0000 UTC m=+0.100156125 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, architecture=x86_64, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Oct 14 08:48:30 np0005486759.ooo.test podman[98076]: 2025-10-14 08:48:30.563337695 +0000 UTC m=+0.183694501 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.9, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 08:48:30 np0005486759.ooo.test podman[98076]: unhealthy
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:48:30 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:48:31 np0005486759.ooo.test sudo[98109]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:31 np0005486759.ooo.test sudo[98130]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphx90q02j/privsep.sock
Oct 14 08:48:31 np0005486759.ooo.test sudo[98130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:31 np0005486759.ooo.test sudo[98130]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:32 np0005486759.ooo.test sudo[98141]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpisisz2m9/privsep.sock
Oct 14 08:48:32 np0005486759.ooo.test sudo[98141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:32 np0005486759.ooo.test sudo[98141]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:33 np0005486759.ooo.test sudo[98152]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxu3sgu_8/privsep.sock
Oct 14 08:48:33 np0005486759.ooo.test sudo[98152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:33 np0005486759.ooo.test sudo[98152]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:34 np0005486759.ooo.test sudo[98163]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnlzkbwua/privsep.sock
Oct 14 08:48:34 np0005486759.ooo.test sudo[98163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:34 np0005486759.ooo.test sudo[98163]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:35 np0005486759.ooo.test sudo[98180]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxc3k3qdk/privsep.sock
Oct 14 08:48:35 np0005486759.ooo.test sudo[98180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:35 np0005486759.ooo.test sudo[98180]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:35 np0005486759.ooo.test sudo[98191]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbixjuc53/privsep.sock
Oct 14 08:48:35 np0005486759.ooo.test sudo[98191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:48:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:48:36 np0005486759.ooo.test podman[98195]: 2025-10-14 08:48:36.461802932 +0000 UTC m=+0.082550207 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:48:36 np0005486759.ooo.test podman[98194]: 2025-10-14 08:48:36.518824774 +0000 UTC m=+0.143824561 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, container_name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:48:36 np0005486759.ooo.test podman[98195]: 2025-10-14 08:48:36.53317559 +0000 UTC m=+0.153922865 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.9)
Oct 14 08:48:36 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:48:36 np0005486759.ooo.test podman[98194]: 2025-10-14 08:48:36.543901154 +0000 UTC m=+0.168900921 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, 
distribution-scope=public, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=1, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 08:48:36 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:48:36 np0005486759.ooo.test sudo[98191]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:36 np0005486759.ooo.test sudo[98249]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ruhsofx/privsep.sock
Oct 14 08:48:36 np0005486759.ooo.test sudo[98249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:37 np0005486759.ooo.test sudo[98249]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:37 np0005486759.ooo.test sudo[98260]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmlcnpryi/privsep.sock
Oct 14 08:48:37 np0005486759.ooo.test sudo[98260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:38 np0005486759.ooo.test sudo[98260]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:38 np0005486759.ooo.test sudo[98271]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzzoo30wg/privsep.sock
Oct 14 08:48:38 np0005486759.ooo.test sudo[98271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:39 np0005486759.ooo.test sudo[98271]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:39 np0005486759.ooo.test sudo[98284]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ec6rrs6/privsep.sock
Oct 14 08:48:39 np0005486759.ooo.test sudo[98284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:40 np0005486759.ooo.test sudo[98284]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:40 np0005486759.ooo.test sudo[98299]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc7_9f8fv/privsep.sock
Oct 14 08:48:40 np0005486759.ooo.test sudo[98299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:41 np0005486759.ooo.test sudo[98299]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:41 np0005486759.ooo.test sudo[98310]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbubgnrz9/privsep.sock
Oct 14 08:48:41 np0005486759.ooo.test sudo[98310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:42 np0005486759.ooo.test sudo[98310]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:42 np0005486759.ooo.test sudo[98321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8vzi3uj7/privsep.sock
Oct 14 08:48:42 np0005486759.ooo.test sudo[98321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:43 np0005486759.ooo.test sudo[98321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:43 np0005486759.ooo.test sudo[98332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppeoj9k_g/privsep.sock
Oct 14 08:48:43 np0005486759.ooo.test sudo[98332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:43 np0005486759.ooo.test sudo[98332]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:44 np0005486759.ooo.test sudo[98343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp42p2y_o9/privsep.sock
Oct 14 08:48:44 np0005486759.ooo.test sudo[98343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:44 np0005486759.ooo.test sudo[98343]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:45 np0005486759.ooo.test sudo[98359]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk7v2hsty/privsep.sock
Oct 14 08:48:45 np0005486759.ooo.test sudo[98359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:45 np0005486759.ooo.test sudo[98359]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:46 np0005486759.ooo.test sudo[98371]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbsqavrae/privsep.sock
Oct 14 08:48:46 np0005486759.ooo.test sudo[98371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:46 np0005486759.ooo.test sudo[98371]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:47 np0005486759.ooo.test sudo[98382]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9km4son1/privsep.sock
Oct 14 08:48:47 np0005486759.ooo.test sudo[98382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:47 np0005486759.ooo.test sudo[98382]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:48 np0005486759.ooo.test sudo[98393]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiu1kkoxp/privsep.sock
Oct 14 08:48:48 np0005486759.ooo.test sudo[98393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:48 np0005486759.ooo.test sudo[98393]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:49 np0005486759.ooo.test sudo[98404]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp10_mvqyy/privsep.sock
Oct 14 08:48:49 np0005486759.ooo.test sudo[98404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:49 np0005486759.ooo.test sudo[98404]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:50 np0005486759.ooo.test sudo[98415]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpimgm1562/privsep.sock
Oct 14 08:48:50 np0005486759.ooo.test sudo[98415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:50 np0005486759.ooo.test sudo[98415]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:48:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:48:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:48:50 np0005486759.ooo.test podman[98428]: 2025-10-14 08:48:50.828846995 +0000 UTC m=+0.084436186 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, container_name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step5, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:48:50 np0005486759.ooo.test systemd[1]: tmp-crun.9Q9JJ4.mount: Deactivated successfully.
Oct 14 08:48:50 np0005486759.ooo.test podman[98428]: 2025-10-14 08:48:50.887553797 +0000 UTC m=+0.143142978 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, version=17.1.9, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, build-date=2025-07-21T14:48:37, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:48:50 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:48:50 np0005486759.ooo.test podman[98429]: 2025-10-14 08:48:50.893227174 +0000 UTC m=+0.147244916 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, container_name=collectd, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 08:48:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:48:50 np0005486759.ooo.test podman[98427]: 2025-10-14 08:48:50.954056792 +0000 UTC m=+0.210498429 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, container_name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Oct 14 08:48:50 np0005486759.ooo.test podman[98429]: 2025-10-14 08:48:50.976686518 +0000 UTC m=+0.230704210 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20250721.1, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public)
Oct 14 08:48:51 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:48:51 np0005486759.ooo.test sudo[98505]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprl00oruh/privsep.sock
Oct 14 08:48:51 np0005486759.ooo.test sudo[98505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:51 np0005486759.ooo.test podman[98427]: 2025-10-14 08:48:51.041577902 +0000 UTC m=+0.298019589 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, version=17.1.9, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 08:48:51 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:48:51 np0005486759.ooo.test podman[98489]: 2025-10-14 08:48:51.137425853 +0000 UTC m=+0.178026675 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=nova_migration_target, config_id=tripleo_step4, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:48:51 np0005486759.ooo.test podman[98489]: 2025-10-14 08:48:51.516376587 +0000 UTC m=+0.556977419 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, container_name=nova_migration_target, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, maintainer=OpenStack TripleO Team, vcs-type=git)
Oct 14 08:48:51 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:48:51 np0005486759.ooo.test sudo[98505]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:52 np0005486759.ooo.test sudo[98530]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq9_jggzd/privsep.sock
Oct 14 08:48:52 np0005486759.ooo.test sudo[98530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:52 np0005486759.ooo.test sudo[98530]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:48:52 np0005486759.ooo.test podman[98534]: 2025-10-14 08:48:52.763004074 +0000 UTC m=+0.084978972 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.33.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git)
Oct 14 08:48:52 np0005486759.ooo.test sudo[98568]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqwnsdb87/privsep.sock
Oct 14 08:48:52 np0005486759.ooo.test sudo[98568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:52 np0005486759.ooo.test podman[98534]: 2025-10-14 08:48:52.966484254 +0000 UTC m=+0.288459132 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible)
Oct 14 08:48:52 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:48:53 np0005486759.ooo.test sudo[98568]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:53 np0005486759.ooo.test sudo[98580]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa7zzig9j/privsep.sock
Oct 14 08:48:53 np0005486759.ooo.test sudo[98580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:54 np0005486759.ooo.test sudo[98580]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:54 np0005486759.ooo.test sudo[98591]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpial82nfo/privsep.sock
Oct 14 08:48:54 np0005486759.ooo.test sudo[98591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:55 np0005486759.ooo.test sudo[98591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:55 np0005486759.ooo.test sudo[98602]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5b7e4pob/privsep.sock
Oct 14 08:48:55 np0005486759.ooo.test sudo[98602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:56 np0005486759.ooo.test sudo[98602]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:56 np0005486759.ooo.test sudo[98619]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpigy1f7j_/privsep.sock
Oct 14 08:48:56 np0005486759.ooo.test sudo[98619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:57 np0005486759.ooo.test sudo[98619]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:57 np0005486759.ooo.test sudo[98630]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmug_1o9s/privsep.sock
Oct 14 08:48:57 np0005486759.ooo.test sudo[98630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:58 np0005486759.ooo.test sudo[98630]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:58 np0005486759.ooo.test sudo[98641]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn5yseu84/privsep.sock
Oct 14 08:48:58 np0005486759.ooo.test sudo[98641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:48:59 np0005486759.ooo.test sudo[98641]: pam_unix(sudo:session): session closed for user root
Oct 14 08:48:59 np0005486759.ooo.test sudo[98652]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbcx_tu__/privsep.sock
Oct 14 08:48:59 np0005486759.ooo.test sudo[98652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:00 np0005486759.ooo.test sudo[98652]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:49:00 np0005486759.ooo.test podman[98658]: 2025-10-14 08:49:00.455762006 +0000 UTC m=+0.082722622 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T13:07:52, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:49:00 np0005486759.ooo.test podman[98658]: 2025-10-14 08:49:00.464564411 +0000 UTC m=+0.091525047 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, release=1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc.)
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:49:00 np0005486759.ooo.test sudo[98682]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjoq39a6u/privsep.sock
Oct 14 08:49:00 np0005486759.ooo.test sudo[98682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: tmp-crun.jCixJh.mount: Deactivated successfully.
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:49:00 np0005486759.ooo.test podman[98684]: 2025-10-14 08:49:00.649225042 +0000 UTC m=+0.084038863 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, vcs-type=git, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi)
Oct 14 08:49:00 np0005486759.ooo.test podman[98684]: 2025-10-14 08:49:00.663471948 +0000 UTC m=+0.098285759 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, release=1)
Oct 14 08:49:00 np0005486759.ooo.test podman[98684]: unhealthy
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:49:00 np0005486759.ooo.test podman[98698]: 2025-10-14 08:49:00.71419792 +0000 UTC m=+0.062191782 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, release=1, architecture=x86_64, container_name=ceilometer_agent_compute)
Oct 14 08:49:00 np0005486759.ooo.test podman[98698]: 2025-10-14 08:49:00.724111739 +0000 UTC m=+0.072105621 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, architecture=x86_64, 
name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:49:00 np0005486759.ooo.test podman[98698]: unhealthy
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:49:00 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:49:01 np0005486759.ooo.test sudo[98682]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:01 np0005486759.ooo.test systemd[1]: tmp-crun.9QqNWp.mount: Deactivated successfully.
Oct 14 08:49:01 np0005486759.ooo.test sudo[98736]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzgx8njks/privsep.sock
Oct 14 08:49:01 np0005486759.ooo.test sudo[98736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:02 np0005486759.ooo.test sudo[98736]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:02 np0005486759.ooo.test sudo[98748]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk78cl7m9/privsep.sock
Oct 14 08:49:02 np0005486759.ooo.test sudo[98748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:02 np0005486759.ooo.test sudo[98748]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:03 np0005486759.ooo.test sudo[98759]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpogrev9tu/privsep.sock
Oct 14 08:49:03 np0005486759.ooo.test sudo[98759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:03 np0005486759.ooo.test sudo[98759]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:04 np0005486759.ooo.test sudo[98770]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6s_ev74t/privsep.sock
Oct 14 08:49:04 np0005486759.ooo.test sudo[98770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:04 np0005486759.ooo.test sudo[98770]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:04 np0005486759.ooo.test sudo[98781]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgsm__v48/privsep.sock
Oct 14 08:49:04 np0005486759.ooo.test sudo[98781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:05 np0005486759.ooo.test sudo[98781]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:05 np0005486759.ooo.test sudo[98804]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu10sjq7c/privsep.sock
Oct 14 08:49:05 np0005486759.ooo.test sudo[98804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:06 np0005486759.ooo.test sudo[98804]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:06 np0005486759.ooo.test sudo[98817]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfy5ad4my/privsep.sock
Oct 14 08:49:06 np0005486759.ooo.test sudo[98817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:49:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:49:06 np0005486759.ooo.test podman[98820]: 2025-10-14 08:49:06.802518889 +0000 UTC m=+0.069014135 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, config_id=tripleo_step4, version=17.1.9, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.33.12, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team)
Oct 14 08:49:06 np0005486759.ooo.test podman[98820]: 2025-10-14 08:49:06.848084121 +0000 UTC m=+0.114579397 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., batch=17.1_20250721.1)
Oct 14 08:49:06 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:49:06 np0005486759.ooo.test podman[98819]: 2025-10-14 08:49:06.862652205 +0000 UTC m=+0.126128156 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, version=17.1.9, vendor=Red Hat, 
Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:49:06 np0005486759.ooo.test podman[98819]: 2025-10-14 08:49:06.924414553 +0000 UTC m=+0.187890524 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1, build-date=2025-07-21T16:28:53, vendor=Red Hat, 
Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:49:06 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:49:07 np0005486759.ooo.test sudo[98817]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:07 np0005486759.ooo.test sudo[98878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7w060kjn/privsep.sock
Oct 14 08:49:07 np0005486759.ooo.test sudo[98878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:08 np0005486759.ooo.test sudo[98878]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:08 np0005486759.ooo.test sudo[98889]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpixid75su/privsep.sock
Oct 14 08:49:08 np0005486759.ooo.test sudo[98889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:09 np0005486759.ooo.test sudo[98889]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:09 np0005486759.ooo.test sudo[98900]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpylok6svr/privsep.sock
Oct 14 08:49:09 np0005486759.ooo.test sudo[98900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:10 np0005486759.ooo.test sudo[98900]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:10 np0005486759.ooo.test sudo[98911]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2u4vdrq8/privsep.sock
Oct 14 08:49:10 np0005486759.ooo.test sudo[98911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:10 np0005486759.ooo.test sudo[98911]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:11 np0005486759.ooo.test sudo[98922]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvh4pudqc/privsep.sock
Oct 14 08:49:11 np0005486759.ooo.test sudo[98922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:11 np0005486759.ooo.test sudo[98922]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:11 np0005486759.ooo.test sudo[98933]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5zm9lsgs/privsep.sock
Oct 14 08:49:11 np0005486759.ooo.test sudo[98933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:12 np0005486759.ooo.test sudo[98933]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:12 np0005486759.ooo.test sudo[98950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkmcsghqd/privsep.sock
Oct 14 08:49:12 np0005486759.ooo.test sudo[98950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:13 np0005486759.ooo.test sudo[98950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:13 np0005486759.ooo.test sudo[98961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpofw9p1gk/privsep.sock
Oct 14 08:49:13 np0005486759.ooo.test sudo[98961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:14 np0005486759.ooo.test sudo[98961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:14 np0005486759.ooo.test sudo[98972]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5cue_51b/privsep.sock
Oct 14 08:49:14 np0005486759.ooo.test sudo[98972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:15 np0005486759.ooo.test sudo[98972]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:15 np0005486759.ooo.test sudo[98983]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa9ry0hs7/privsep.sock
Oct 14 08:49:15 np0005486759.ooo.test sudo[98983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:16 np0005486759.ooo.test sudo[98983]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:16 np0005486759.ooo.test sudo[98994]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8pwq3ec9/privsep.sock
Oct 14 08:49:16 np0005486759.ooo.test sudo[98994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:16 np0005486759.ooo.test sudo[98994]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:17 np0005486759.ooo.test sudo[99005]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeymjaluh/privsep.sock
Oct 14 08:49:17 np0005486759.ooo.test sudo[99005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:17 np0005486759.ooo.test sudo[99005]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:18 np0005486759.ooo.test sudo[99022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqka_ui2a/privsep.sock
Oct 14 08:49:18 np0005486759.ooo.test sudo[99022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:18 np0005486759.ooo.test sudo[99022]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:18 np0005486759.ooo.test sudo[99033]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj8f5p2zf/privsep.sock
Oct 14 08:49:18 np0005486759.ooo.test sudo[99033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:19 np0005486759.ooo.test sudo[99033]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:19 np0005486759.ooo.test sudo[99044]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpafzyy7al/privsep.sock
Oct 14 08:49:19 np0005486759.ooo.test sudo[99044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:20 np0005486759.ooo.test sudo[99044]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:20 np0005486759.ooo.test sudo[99055]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1i35_ioa/privsep.sock
Oct 14 08:49:20 np0005486759.ooo.test sudo[99055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:21 np0005486759.ooo.test sudo[99055]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:49:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:49:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:49:21 np0005486759.ooo.test podman[99063]: 2025-10-14 08:49:21.232074273 +0000 UTC m=+0.068186038 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, architecture=x86_64, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, release=2, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 08:49:21 np0005486759.ooo.test systemd[1]: tmp-crun.Fj5h3I.mount: Deactivated successfully.
Oct 14 08:49:21 np0005486759.ooo.test podman[99062]: 2025-10-14 08:49:21.256727483 +0000 UTC m=+0.092319462 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, release=1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:49:21 np0005486759.ooo.test podman[99062]: 2025-10-14 08:49:21.28423305 +0000 UTC m=+0.119825039 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37)
Oct 14 08:49:21 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:49:21 np0005486759.ooo.test podman[99059]: 2025-10-14 08:49:21.30793926 +0000 UTC m=+0.144723817 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, 
vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid)
Oct 14 08:49:21 np0005486759.ooo.test podman[99063]: 2025-10-14 08:49:21.322117593 +0000 UTC m=+0.158229378 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, release=2, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, distribution-scope=public)
Oct 14 08:49:21 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:49:21 np0005486759.ooo.test podman[99059]: 2025-10-14 08:49:21.341809277 +0000 UTC m=+0.178593894 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-iscsid, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:49:21 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:49:21 np0005486759.ooo.test sudo[99125]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7m206m1e/privsep.sock
Oct 14 08:49:21 np0005486759.ooo.test sudo[99125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:21 np0005486759.ooo.test sudo[99125]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:49:22 np0005486759.ooo.test podman[99131]: 2025-10-14 08:49:22.095678579 +0000 UTC m=+0.069349905 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, container_name=nova_migration_target, io.buildah.version=1.33.12, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, vcs-type=git)
Oct 14 08:49:22 np0005486759.ooo.test systemd[1]: tmp-crun.wlOFEn.mount: Deactivated successfully.
Oct 14 08:49:22 np0005486759.ooo.test sudo[99156]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsqk_s5o5/privsep.sock
Oct 14 08:49:22 np0005486759.ooo.test sudo[99156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:22 np0005486759.ooo.test podman[99131]: 2025-10-14 08:49:22.462915648 +0000 UTC m=+0.436586984 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, architecture=x86_64, distribution-scope=public)
Oct 14 08:49:22 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:49:22 np0005486759.ooo.test sudo[99156]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:23 np0005486759.ooo.test sudo[99176]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjxue8bzd/privsep.sock
Oct 14 08:49:23 np0005486759.ooo.test sudo[99176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:49:23 np0005486759.ooo.test podman[99178]: 2025-10-14 08:49:23.199606244 +0000 UTC m=+0.069607733 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container)
Oct 14 08:49:23 np0005486759.ooo.test podman[99178]: 2025-10-14 08:49:23.365028666 +0000 UTC m=+0.235030225 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1)
Oct 14 08:49:23 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:49:23 np0005486759.ooo.test sudo[99176]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:23 np0005486759.ooo.test sudo[99218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqu_dsd_q/privsep.sock
Oct 14 08:49:23 np0005486759.ooo.test sudo[99218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:24 np0005486759.ooo.test sudo[99218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:24 np0005486759.ooo.test sudo[99229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuk5kzl31/privsep.sock
Oct 14 08:49:24 np0005486759.ooo.test sudo[99229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:25 np0005486759.ooo.test sudo[99229]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:25 np0005486759.ooo.test sudo[99240]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp70md1xpg/privsep.sock
Oct 14 08:49:25 np0005486759.ooo.test sudo[99240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:26 np0005486759.ooo.test sudo[99240]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:26 np0005486759.ooo.test sudo[99251]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv0wd0_b5/privsep.sock
Oct 14 08:49:26 np0005486759.ooo.test sudo[99251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:27 np0005486759.ooo.test sudo[99251]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:27 np0005486759.ooo.test sudo[99262]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp58dlbvf5/privsep.sock
Oct 14 08:49:27 np0005486759.ooo.test sudo[99262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:28 np0005486759.ooo.test sudo[99262]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:28 np0005486759.ooo.test sudo[99275]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj3gztwy7/privsep.sock
Oct 14 08:49:28 np0005486759.ooo.test sudo[99275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:28 np0005486759.ooo.test sudo[99275]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:29 np0005486759.ooo.test sudo[99290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp87ggyc_d/privsep.sock
Oct 14 08:49:29 np0005486759.ooo.test sudo[99290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:29 np0005486759.ooo.test sudo[99290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:30 np0005486759.ooo.test sudo[99301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8x9g25vu/privsep.sock
Oct 14 08:49:30 np0005486759.ooo.test sudo[99301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:30 np0005486759.ooo.test sudo[99301]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: tmp-crun.0dju0x.mount: Deactivated successfully.
Oct 14 08:49:30 np0005486759.ooo.test podman[99307]: 2025-10-14 08:49:30.771346469 +0000 UTC m=+0.086265462 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.9, release=1, container_name=logrotate_crond, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 08:49:30 np0005486759.ooo.test podman[99307]: 2025-10-14 08:49:30.806704363 +0000 UTC m=+0.121623366 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.33.12, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:49:30 np0005486759.ooo.test podman[99308]: 2025-10-14 08:49:30.822169305 +0000 UTC m=+0.132644599 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T15:29:47, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4)
Oct 14 08:49:30 np0005486759.ooo.test podman[99308]: 2025-10-14 08:49:30.854274357 +0000 UTC m=+0.164749651 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:49:30 np0005486759.ooo.test podman[99308]: unhealthy
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:49:30 np0005486759.ooo.test podman[99329]: 2025-10-14 08:49:30.858481068 +0000 UTC m=+0.076371894 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:49:30 np0005486759.ooo.test sudo[99366]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpirlerw1k/privsep.sock
Oct 14 08:49:30 np0005486759.ooo.test podman[99329]: 2025-10-14 08:49:30.942144399 +0000 UTC m=+0.160035235 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:49:30 np0005486759.ooo.test sudo[99366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:30 np0005486759.ooo.test podman[99329]: unhealthy
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:49:30 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:49:31 np0005486759.ooo.test sudo[99366]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:31 np0005486759.ooo.test systemd[1]: tmp-crun.dAuLby.mount: Deactivated successfully.
Oct 14 08:49:31 np0005486759.ooo.test sudo[99377]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4zw8mc5j/privsep.sock
Oct 14 08:49:31 np0005486759.ooo.test sudo[99377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:32 np0005486759.ooo.test sudo[99377]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:32 np0005486759.ooo.test sudo[99388]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf5kkf9um/privsep.sock
Oct 14 08:49:32 np0005486759.ooo.test sudo[99388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:33 np0005486759.ooo.test sudo[99388]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:33 np0005486759.ooo.test sudo[99399]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbirkfolw/privsep.sock
Oct 14 08:49:33 np0005486759.ooo.test sudo[99399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:34 np0005486759.ooo.test sudo[99399]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:34 np0005486759.ooo.test sudo[99416]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfm909bpo/privsep.sock
Oct 14 08:49:34 np0005486759.ooo.test sudo[99416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:35 np0005486759.ooo.test sudo[99416]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:35 np0005486759.ooo.test sudo[99427]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6uyx8w4w/privsep.sock
Oct 14 08:49:35 np0005486759.ooo.test sudo[99427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:35 np0005486759.ooo.test sudo[99427]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:36 np0005486759.ooo.test sudo[99438]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_gshwe5z/privsep.sock
Oct 14 08:49:36 np0005486759.ooo.test sudo[99438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:36 np0005486759.ooo.test sudo[99438]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:49:36 np0005486759.ooo.test podman[99443]: 2025-10-14 08:49:36.973185371 +0000 UTC m=+0.082606729 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.33.12, release=1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:49:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:49:37 np0005486759.ooo.test podman[99443]: 2025-10-14 08:49:37.011515397 +0000 UTC m=+0.120936735 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, release=1, batch=17.1_20250721.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:49:37 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:49:37 np0005486759.ooo.test podman[99463]: 2025-10-14 08:49:37.079455307 +0000 UTC m=+0.076824729 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:49:37 np0005486759.ooo.test podman[99463]: 2025-10-14 08:49:37.114236642 +0000 UTC m=+0.111606044 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 08:49:37 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:49:37 np0005486759.ooo.test sudo[99497]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptl_obnz5/privsep.sock
Oct 14 08:49:37 np0005486759.ooo.test sudo[99497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:37 np0005486759.ooo.test sudo[99497]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:37 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:49:37 np0005486759.ooo.test recover_tripleo_nova_virtqemud[99504]: 47951
Oct 14 08:49:37 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:49:37 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:49:38 np0005486759.ooo.test sudo[99510]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdwncfmc6/privsep.sock
Oct 14 08:49:38 np0005486759.ooo.test sudo[99510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:38 np0005486759.ooo.test sudo[99510]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:38 np0005486759.ooo.test sudo[99521]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzs461pyu/privsep.sock
Oct 14 08:49:38 np0005486759.ooo.test sudo[99521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:39 np0005486759.ooo.test sudo[99521]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:39 np0005486759.ooo.test sudo[99538]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc7tltqgx/privsep.sock
Oct 14 08:49:39 np0005486759.ooo.test sudo[99538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:40 np0005486759.ooo.test sudo[99538]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:40 np0005486759.ooo.test sudo[99549]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt399ki9c/privsep.sock
Oct 14 08:49:40 np0005486759.ooo.test sudo[99549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:41 np0005486759.ooo.test sudo[99549]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:41 np0005486759.ooo.test sudo[99560]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz6a9n0wl/privsep.sock
Oct 14 08:49:41 np0005486759.ooo.test sudo[99560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:42 np0005486759.ooo.test sudo[99560]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:42 np0005486759.ooo.test sudo[99571]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaxqrchwz/privsep.sock
Oct 14 08:49:42 np0005486759.ooo.test sudo[99571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:43 np0005486759.ooo.test sudo[99571]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:43 np0005486759.ooo.test sudo[99582]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmri8mc9z/privsep.sock
Oct 14 08:49:43 np0005486759.ooo.test sudo[99582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:43 np0005486759.ooo.test sudo[99582]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:44 np0005486759.ooo.test sudo[99593]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxbm0nhnp/privsep.sock
Oct 14 08:49:44 np0005486759.ooo.test sudo[99593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:44 np0005486759.ooo.test sudo[99593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:45 np0005486759.ooo.test sudo[99610]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp29trt5t3/privsep.sock
Oct 14 08:49:45 np0005486759.ooo.test sudo[99610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:45 np0005486759.ooo.test sudo[99610]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:45 np0005486759.ooo.test sudo[99621]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm2bub3dq/privsep.sock
Oct 14 08:49:45 np0005486759.ooo.test sudo[99621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:46 np0005486759.ooo.test sudo[99621]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:46 np0005486759.ooo.test sudo[99632]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1a8dqinq/privsep.sock
Oct 14 08:49:46 np0005486759.ooo.test sudo[99632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:47 np0005486759.ooo.test sudo[99632]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:47 np0005486759.ooo.test sudo[99643]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd6ui87sv/privsep.sock
Oct 14 08:49:47 np0005486759.ooo.test sudo[99643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:48 np0005486759.ooo.test sudo[99643]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:48 np0005486759.ooo.test sudo[99654]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6tlc0dc0/privsep.sock
Oct 14 08:49:48 np0005486759.ooo.test sudo[99654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:49 np0005486759.ooo.test sudo[99654]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:49 np0005486759.ooo.test sudo[99665]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu9bx7fpo/privsep.sock
Oct 14 08:49:49 np0005486759.ooo.test sudo[99665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:50 np0005486759.ooo.test sudo[99665]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:50 np0005486759.ooo.test sudo[99682]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3hvmlv_0/privsep.sock
Oct 14 08:49:50 np0005486759.ooo.test sudo[99682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:50 np0005486759.ooo.test sudo[99682]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:51 np0005486759.ooo.test sudo[99693]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp94ep8i3o/privsep.sock
Oct 14 08:49:51 np0005486759.ooo.test sudo[99693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:49:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:49:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:49:51 np0005486759.ooo.test systemd[1]: tmp-crun.ZtdCzJ.mount: Deactivated successfully.
Oct 14 08:49:51 np0005486759.ooo.test podman[99695]: 2025-10-14 08:49:51.452099684 +0000 UTC m=+0.082817574 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, release=1, architecture=x86_64, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, tcib_managed=true)
Oct 14 08:49:51 np0005486759.ooo.test podman[99696]: 2025-10-14 08:49:51.492091753 +0000 UTC m=+0.122510824 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red 
Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=2, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:49:51 np0005486759.ooo.test podman[99695]: 2025-10-14 08:49:51.502241459 +0000 UTC m=+0.132959349 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.buildah.version=1.33.12, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, version=17.1.9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64)
Oct 14 08:49:51 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:49:51 np0005486759.ooo.test podman[99696]: 2025-10-14 08:49:51.53527247 +0000 UTC m=+0.165691571 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, release=2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 08:49:51 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:49:51 np0005486759.ooo.test podman[99697]: 2025-10-14 08:49:51.553406615 +0000 UTC m=+0.177307723 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.9, tcib_managed=true, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, 
config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12)
Oct 14 08:49:51 np0005486759.ooo.test podman[99697]: 2025-10-14 08:49:51.59134892 +0000 UTC m=+0.215249998 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, 
config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid)
Oct 14 08:49:51 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:49:51 np0005486759.ooo.test sudo[99693]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:52 np0005486759.ooo.test sudo[99767]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwie5sem4/privsep.sock
Oct 14 08:49:52 np0005486759.ooo.test sudo[99767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:52 np0005486759.ooo.test sudo[99767]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:49:52 np0005486759.ooo.test systemd[1]: tmp-crun.Wm4Acq.mount: Deactivated successfully.
Oct 14 08:49:52 np0005486759.ooo.test podman[99772]: 2025-10-14 08:49:52.853404869 +0000 UTC m=+0.054493232 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, config_id=tripleo_step4, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Oct 14 08:49:53 np0005486759.ooo.test sudo[99801]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxtmgva7j/privsep.sock
Oct 14 08:49:53 np0005486759.ooo.test sudo[99801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:53 np0005486759.ooo.test podman[99772]: 2025-10-14 08:49:53.216192978 +0000 UTC m=+0.417281321 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, version=17.1.9, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:49:53 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:49:53 np0005486759.ooo.test sudo[99801]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:49:53 np0005486759.ooo.test podman[99808]: 2025-10-14 08:49:53.738055581 +0000 UTC m=+0.067606450 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container)
Oct 14 08:49:53 np0005486759.ooo.test systemd[1]: tmp-crun.kA7Pca.mount: Deactivated successfully.
Oct 14 08:49:53 np0005486759.ooo.test sudo[99842]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpenodzrpy/privsep.sock
Oct 14 08:49:53 np0005486759.ooo.test sudo[99842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:53 np0005486759.ooo.test podman[99808]: 2025-10-14 08:49:53.983256563 +0000 UTC m=+0.312807342 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, config_id=tripleo_step1, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git)
Oct 14 08:49:53 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:49:54 np0005486759.ooo.test sudo[99842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:54 np0005486759.ooo.test sudo[99854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptarmm19y/privsep.sock
Oct 14 08:49:54 np0005486759.ooo.test sudo[99854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:55 np0005486759.ooo.test sudo[99854]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:55 np0005486759.ooo.test sudo[99871]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdgqhkvxz/privsep.sock
Oct 14 08:49:55 np0005486759.ooo.test sudo[99871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:56 np0005486759.ooo.test sudo[99871]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:56 np0005486759.ooo.test sudo[99882]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo39kb75y/privsep.sock
Oct 14 08:49:56 np0005486759.ooo.test sudo[99882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:57 np0005486759.ooo.test sudo[99882]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:57 np0005486759.ooo.test sudo[99893]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpix2kmcgo/privsep.sock
Oct 14 08:49:57 np0005486759.ooo.test sudo[99893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:58 np0005486759.ooo.test sudo[99893]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:58 np0005486759.ooo.test sudo[99904]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9qlvmnrf/privsep.sock
Oct 14 08:49:58 np0005486759.ooo.test sudo[99904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:58 np0005486759.ooo.test sudo[99904]: pam_unix(sudo:session): session closed for user root
Oct 14 08:49:59 np0005486759.ooo.test sudo[99915]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp00cjv628/privsep.sock
Oct 14 08:49:59 np0005486759.ooo.test sudo[99915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:49:59 np0005486759.ooo.test sudo[99915]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:00 np0005486759.ooo.test sudo[99926]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1gcbqh87/privsep.sock
Oct 14 08:50:00 np0005486759.ooo.test sudo[99926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:00 np0005486759.ooo.test sudo[99926]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:50:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:50:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:50:01 np0005486759.ooo.test systemd[1]: tmp-crun.LmMLDR.mount: Deactivated successfully.
Oct 14 08:50:01 np0005486759.ooo.test podman[99938]: 2025-10-14 08:50:01.032452203 +0000 UTC m=+0.076563910 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, architecture=x86_64, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vendor=Red Hat, Inc.)
Oct 14 08:50:01 np0005486759.ooo.test podman[99937]: 2025-10-14 08:50:01.078845801 +0000 UTC m=+0.123656949 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, container_name=logrotate_crond, distribution-scope=public, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, summary=Red Hat 
OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=)
Oct 14 08:50:01 np0005486759.ooo.test podman[99938]: 2025-10-14 08:50:01.08330923 +0000 UTC m=+0.127420967 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, 
release=1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:50:01 np0005486759.ooo.test podman[99938]: unhealthy
Oct 14 08:50:01 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:50:01 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:50:01 np0005486759.ooo.test podman[99937]: 2025-10-14 08:50:01.115447103 +0000 UTC m=+0.160258261 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, com.redhat.component=openstack-cron-container, architecture=x86_64, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Oct 14 08:50:01 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:50:01 np0005486759.ooo.test podman[99963]: 2025-10-14 08:50:01.160933712 +0000 UTC m=+0.125083434 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1)
Oct 14 08:50:01 np0005486759.ooo.test sudo[100003]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvfq5h2yl/privsep.sock
Oct 14 08:50:01 np0005486759.ooo.test sudo[100003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:01 np0005486759.ooo.test podman[99963]: 2025-10-14 08:50:01.177926152 +0000 UTC m=+0.142075884 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, 
Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true)
Oct 14 08:50:01 np0005486759.ooo.test podman[99963]: unhealthy
Oct 14 08:50:01 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:50:01 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:50:01 np0005486759.ooo.test sudo[100003]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:02 np0005486759.ooo.test sudo[100017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_t8y5msy/privsep.sock
Oct 14 08:50:02 np0005486759.ooo.test sudo[100017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:02 np0005486759.ooo.test sudo[100017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:02 np0005486759.ooo.test sudo[100028]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpotiag2i3/privsep.sock
Oct 14 08:50:02 np0005486759.ooo.test sudo[100028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:03 np0005486759.ooo.test sudo[100028]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:03 np0005486759.ooo.test sudo[100039]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpii9j6_gl/privsep.sock
Oct 14 08:50:03 np0005486759.ooo.test sudo[100039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:04 np0005486759.ooo.test sudo[100039]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:04 np0005486759.ooo.test sudo[100050]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpupgko9cm/privsep.sock
Oct 14 08:50:04 np0005486759.ooo.test sudo[100050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:05 np0005486759.ooo.test sudo[100050]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:05 np0005486759.ooo.test sudo[100061]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppjcvazlz/privsep.sock
Oct 14 08:50:05 np0005486759.ooo.test sudo[100061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:06 np0005486759.ooo.test sudo[100061]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:06 np0005486759.ooo.test sudo[100085]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdqxhi7qn/privsep.sock
Oct 14 08:50:06 np0005486759.ooo.test sudo[100085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:07 np0005486759.ooo.test sudo[100085]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:50:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:50:07 np0005486759.ooo.test systemd[1]: tmp-crun.mb2pge.mount: Deactivated successfully.
Oct 14 08:50:07 np0005486759.ooo.test podman[100096]: 2025-10-14 08:50:07.188170826 +0000 UTC m=+0.099760494 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4)
Oct 14 08:50:07 np0005486759.ooo.test podman[100096]: 2025-10-14 08:50:07.215436386 +0000 UTC m=+0.127026064 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T13:28:44, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, vendor=Red Hat, Inc., container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:50:07 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:50:07 np0005486759.ooo.test podman[100111]: 2025-10-14 08:50:07.275679116 +0000 UTC m=+0.083124915 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.9, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12)
Oct 14 08:50:07 np0005486759.ooo.test podman[100111]: 2025-10-14 08:50:07.313451754 +0000 UTC m=+0.120897593 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:50:07 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:50:07 np0005486759.ooo.test sudo[100147]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprjf_xj0s/privsep.sock
Oct 14 08:50:07 np0005486759.ooo.test sudo[100147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:07 np0005486759.ooo.test sudo[100147]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:08 np0005486759.ooo.test sudo[100158]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm03uaigm/privsep.sock
Oct 14 08:50:08 np0005486759.ooo.test sudo[100158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:08 np0005486759.ooo.test sudo[100158]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:09 np0005486759.ooo.test sudo[100169]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr9lhptg9/privsep.sock
Oct 14 08:50:09 np0005486759.ooo.test sudo[100169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:09 np0005486759.ooo.test sudo[100169]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:10 np0005486759.ooo.test sudo[100180]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0i357jlr/privsep.sock
Oct 14 08:50:10 np0005486759.ooo.test sudo[100180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:10 np0005486759.ooo.test sudo[100180]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:10 np0005486759.ooo.test sudo[100191]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphaigbw6_/privsep.sock
Oct 14 08:50:10 np0005486759.ooo.test sudo[100191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:11 np0005486759.ooo.test sudo[100191]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:11 np0005486759.ooo.test sudo[100207]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp224b25_4/privsep.sock
Oct 14 08:50:11 np0005486759.ooo.test sudo[100207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:12 np0005486759.ooo.test sudo[100207]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:12 np0005486759.ooo.test sudo[100219]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb6w19k85/privsep.sock
Oct 14 08:50:12 np0005486759.ooo.test sudo[100219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:13 np0005486759.ooo.test sudo[100219]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:13 np0005486759.ooo.test sudo[100230]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0hpz8yeu/privsep.sock
Oct 14 08:50:13 np0005486759.ooo.test sudo[100230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:14 np0005486759.ooo.test sudo[100230]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:14 np0005486759.ooo.test sudo[100241]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp04c9mp5b/privsep.sock
Oct 14 08:50:14 np0005486759.ooo.test sudo[100241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:15 np0005486759.ooo.test sudo[100241]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:15 np0005486759.ooo.test sudo[100252]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprkonpwc2/privsep.sock
Oct 14 08:50:15 np0005486759.ooo.test sudo[100252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:16 np0005486759.ooo.test sudo[100252]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:16 np0005486759.ooo.test sudo[100263]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuav6_7q5/privsep.sock
Oct 14 08:50:16 np0005486759.ooo.test sudo[100263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:16 np0005486759.ooo.test sudo[100263]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:17 np0005486759.ooo.test sudo[100277]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8r87w06e/privsep.sock
Oct 14 08:50:17 np0005486759.ooo.test sudo[100277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:17 np0005486759.ooo.test sudo[100277]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:17 np0005486759.ooo.test sudo[100291]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuxkpejj9/privsep.sock
Oct 14 08:50:17 np0005486759.ooo.test sudo[100291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:18 np0005486759.ooo.test sudo[100291]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:18 np0005486759.ooo.test sudo[100302]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppcm_8up4/privsep.sock
Oct 14 08:50:18 np0005486759.ooo.test sudo[100302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:19 np0005486759.ooo.test sudo[100302]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:19 np0005486759.ooo.test sudo[100313]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxfknkvnr/privsep.sock
Oct 14 08:50:19 np0005486759.ooo.test sudo[100313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:20 np0005486759.ooo.test sudo[100313]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:20 np0005486759.ooo.test sudo[100324]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxtlur4y8/privsep.sock
Oct 14 08:50:20 np0005486759.ooo.test sudo[100324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:21 np0005486759.ooo.test sudo[100324]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:21 np0005486759.ooo.test sudo[100335]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwlr0i97u/privsep.sock
Oct 14 08:50:21 np0005486759.ooo.test sudo[100335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:50:21 np0005486759.ooo.test systemd[93754]: Created slice User Background Tasks Slice.
Oct 14 08:50:21 np0005486759.ooo.test systemd[93754]: Starting Cleanup of User's Temporary Files and Directories...
Oct 14 08:50:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:50:21 np0005486759.ooo.test systemd[93754]: Finished Cleanup of User's Temporary Files and Directories.
Oct 14 08:50:21 np0005486759.ooo.test systemd[1]: tmp-crun.rKYNFi.mount: Deactivated successfully.
Oct 14 08:50:21 np0005486759.ooo.test podman[100337]: 2025-10-14 08:50:21.617742019 +0000 UTC m=+0.076821228 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64)
Oct 14 08:50:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:50:21 np0005486759.ooo.test podman[100337]: 2025-10-14 08:50:21.647326753 +0000 UTC m=+0.106405962 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, tcib_managed=true, vcs-type=git, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Oct 14 08:50:21 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:50:21 np0005486759.ooo.test podman[100365]: 2025-10-14 08:50:21.691437639 +0000 UTC m=+0.059807218 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, distribution-scope=public, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3)
Oct 14 08:50:21 np0005486759.ooo.test podman[100365]: 2025-10-14 08:50:21.723146298 +0000 UTC m=+0.091515767 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, io.buildah.version=1.33.12, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, version=17.1.9)
Oct 14 08:50:21 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:50:21 np0005486759.ooo.test podman[100355]: 2025-10-14 08:50:21.725901594 +0000 UTC m=+0.105395370 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.9, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, managed_by=tripleo_ansible, release=2, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=)
Oct 14 08:50:21 np0005486759.ooo.test podman[100355]: 2025-10-14 08:50:21.804395703 +0000 UTC m=+0.183889469 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, version=17.1.9, managed_by=tripleo_ansible, io.buildah.version=1.33.12, release=2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:50:21 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:50:22 np0005486759.ooo.test sudo[100335]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:22 np0005486759.ooo.test sudo[100413]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsmdwyeua/privsep.sock
Oct 14 08:50:22 np0005486759.ooo.test sudo[100413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:23 np0005486759.ooo.test sudo[100413]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:23 np0005486759.ooo.test sudo[100427]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvs6_eivr/privsep.sock
Oct 14 08:50:23 np0005486759.ooo.test sudo[100427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:50:23 np0005486759.ooo.test podman[100429]: 2025-10-14 08:50:23.42050412 +0000 UTC m=+0.061222682 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, release=1, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:50:23 np0005486759.ooo.test podman[100429]: 2025-10-14 08:50:23.821266024 +0000 UTC m=+0.461984576 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, container_name=nova_migration_target, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:50:23 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:50:23 np0005486759.ooo.test sudo[100427]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:24 np0005486759.ooo.test sudo[100460]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe08xcuy5/privsep.sock
Oct 14 08:50:24 np0005486759.ooo.test sudo[100460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:50:24 np0005486759.ooo.test systemd[1]: tmp-crun.ko7ifE.mount: Deactivated successfully.
Oct 14 08:50:24 np0005486759.ooo.test podman[100462]: 2025-10-14 08:50:24.199168736 +0000 UTC m=+0.054210662 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:50:24 np0005486759.ooo.test podman[100462]: 2025-10-14 08:50:24.377385767 +0000 UTC m=+0.232427753 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.9, config_id=tripleo_step1, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd)
Oct 14 08:50:24 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:50:24 np0005486759.ooo.test sudo[100460]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:24 np0005486759.ooo.test sudo[100500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiedpsy2x/privsep.sock
Oct 14 08:50:24 np0005486759.ooo.test sudo[100500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:25 np0005486759.ooo.test sudo[100500]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:25 np0005486759.ooo.test sudo[100511]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpon6j7g7q/privsep.sock
Oct 14 08:50:25 np0005486759.ooo.test sudo[100511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:26 np0005486759.ooo.test sudo[100511]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:26 np0005486759.ooo.test sudo[100522]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5tmt0y_y/privsep.sock
Oct 14 08:50:26 np0005486759.ooo.test sudo[100522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:27 np0005486759.ooo.test sudo[100522]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:27 np0005486759.ooo.test sudo[100533]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1bra4e0g/privsep.sock
Oct 14 08:50:27 np0005486759.ooo.test sudo[100533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:28 np0005486759.ooo.test sudo[100533]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:28 np0005486759.ooo.test sudo[100550]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0j6qruci/privsep.sock
Oct 14 08:50:28 np0005486759.ooo.test sudo[100550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:29 np0005486759.ooo.test sudo[100550]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:29 np0005486759.ooo.test sudo[100561]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7tp6tm9d/privsep.sock
Oct 14 08:50:29 np0005486759.ooo.test sudo[100561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:29 np0005486759.ooo.test sudo[100561]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:30 np0005486759.ooo.test sudo[100572]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd8oy7exe/privsep.sock
Oct 14 08:50:30 np0005486759.ooo.test sudo[100572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:30 np0005486759.ooo.test sudo[100572]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:31 np0005486759.ooo.test sudo[100583]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2fjpip5v/privsep.sock
Oct 14 08:50:31 np0005486759.ooo.test sudo[100583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:50:31 np0005486759.ooo.test podman[100585]: 2025-10-14 08:50:31.197331244 +0000 UTC m=+0.062027036 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, version=17.1.9, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64)
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:50:31 np0005486759.ooo.test podman[100585]: 2025-10-14 08:50:31.219470195 +0000 UTC m=+0.084165987 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step4, tcib_managed=true, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:50:31 np0005486759.ooo.test podman[100585]: unhealthy
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: tmp-crun.D0nBI1.mount: Deactivated successfully.
Oct 14 08:50:31 np0005486759.ooo.test podman[100616]: 2025-10-14 08:50:31.270777635 +0000 UTC m=+0.058234158 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, release=1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.)
Oct 14 08:50:31 np0005486759.ooo.test podman[100616]: 2025-10-14 08:50:31.308557165 +0000 UTC m=+0.096013708 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33)
Oct 14 08:50:31 np0005486759.ooo.test podman[100586]: 2025-10-14 08:50:31.312167917 +0000 UTC m=+0.168631062 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=logrotate_crond, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T13:07:52)
Oct 14 08:50:31 np0005486759.ooo.test podman[100586]: 2025-10-14 08:50:31.348351286 +0000 UTC m=+0.204814491 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, io.buildah.version=1.33.12, version=17.1.9)
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:50:31 np0005486759.ooo.test podman[100616]: unhealthy
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:50:31 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:50:31 np0005486759.ooo.test sudo[100583]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:31 np0005486759.ooo.test sudo[100651]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpijlizyyw/privsep.sock
Oct 14 08:50:31 np0005486759.ooo.test sudo[100651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:32 np0005486759.ooo.test sudo[100651]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:32 np0005486759.ooo.test sudo[100662]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu4v7il04/privsep.sock
Oct 14 08:50:32 np0005486759.ooo.test sudo[100662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:33 np0005486759.ooo.test sudo[100662]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:33 np0005486759.ooo.test sudo[100679]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzgkkrofa/privsep.sock
Oct 14 08:50:33 np0005486759.ooo.test sudo[100679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:34 np0005486759.ooo.test sudo[100679]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:34 np0005486759.ooo.test sudo[100690]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp60rj_1i7/privsep.sock
Oct 14 08:50:34 np0005486759.ooo.test sudo[100690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:35 np0005486759.ooo.test sudo[100690]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:35 np0005486759.ooo.test sudo[100701]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkf_d2ki7/privsep.sock
Oct 14 08:50:35 np0005486759.ooo.test sudo[100701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:35 np0005486759.ooo.test sudo[100701]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:36 np0005486759.ooo.test sudo[100712]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx2ccv7vl/privsep.sock
Oct 14 08:50:36 np0005486759.ooo.test sudo[100712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:36 np0005486759.ooo.test sudo[100712]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:37 np0005486759.ooo.test sudo[100723]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi8831u3o/privsep.sock
Oct 14 08:50:37 np0005486759.ooo.test sudo[100723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:50:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:50:37 np0005486759.ooo.test systemd[1]: tmp-crun.ccVeLv.mount: Deactivated successfully.
Oct 14 08:50:37 np0005486759.ooo.test podman[100726]: 2025-10-14 08:50:37.422047079 +0000 UTC m=+0.056367430 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:50:37 np0005486759.ooo.test podman[100727]: 2025-10-14 08:50:37.473250356 +0000 UTC m=+0.104579254 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_id=tripleo_step4)
Oct 14 08:50:37 np0005486759.ooo.test podman[100726]: 2025-10-14 08:50:37.4986815 +0000 UTC m=+0.133001881 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, release=1)
Oct 14 08:50:37 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:50:37 np0005486759.ooo.test podman[100727]: 2025-10-14 08:50:37.5162848 +0000 UTC m=+0.147613698 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller)
Oct 14 08:50:37 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:50:37 np0005486759.ooo.test sudo[100723]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:37 np0005486759.ooo.test sudo[100784]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphb4wbzgu/privsep.sock
Oct 14 08:50:37 np0005486759.ooo.test sudo[100784]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:38 np0005486759.ooo.test sudo[100784]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:38 np0005486759.ooo.test sudo[100801]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp312wki6g/privsep.sock
Oct 14 08:50:38 np0005486759.ooo.test sudo[100801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:39 np0005486759.ooo.test sudo[100801]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:39 np0005486759.ooo.test sudo[100812]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2v8wz_wk/privsep.sock
Oct 14 08:50:39 np0005486759.ooo.test sudo[100812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:40 np0005486759.ooo.test sudo[100812]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:40 np0005486759.ooo.test sudo[100823]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpki_k76pq/privsep.sock
Oct 14 08:50:40 np0005486759.ooo.test sudo[100823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:41 np0005486759.ooo.test sudo[100823]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:41 np0005486759.ooo.test sudo[100834]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvykv5flx/privsep.sock
Oct 14 08:50:41 np0005486759.ooo.test sudo[100834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:42 np0005486759.ooo.test sudo[100834]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:42 np0005486759.ooo.test sudo[100845]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpss4t1dl7/privsep.sock
Oct 14 08:50:42 np0005486759.ooo.test sudo[100845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:42 np0005486759.ooo.test sudo[100845]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:43 np0005486759.ooo.test sudo[100856]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaf8d39io/privsep.sock
Oct 14 08:50:43 np0005486759.ooo.test sudo[100856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:43 np0005486759.ooo.test sudo[100856]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:44 np0005486759.ooo.test sudo[100870]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp958a91tf/privsep.sock
Oct 14 08:50:44 np0005486759.ooo.test sudo[100870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:44 np0005486759.ooo.test sudo[100870]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:44 np0005486759.ooo.test sudo[100884]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyxsgyraa/privsep.sock
Oct 14 08:50:44 np0005486759.ooo.test sudo[100884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:45 np0005486759.ooo.test sudo[100884]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:45 np0005486759.ooo.test sudo[100895]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptr_zb2m5/privsep.sock
Oct 14 08:50:45 np0005486759.ooo.test sudo[100895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:46 np0005486759.ooo.test sudo[100895]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:46 np0005486759.ooo.test sudo[100906]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_c23bkba/privsep.sock
Oct 14 08:50:46 np0005486759.ooo.test sudo[100906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:47 np0005486759.ooo.test sudo[100906]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:47 np0005486759.ooo.test sudo[100917]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9i1zisso/privsep.sock
Oct 14 08:50:47 np0005486759.ooo.test sudo[100917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:48 np0005486759.ooo.test sudo[100917]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:48 np0005486759.ooo.test sudo[100928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfncvieaf/privsep.sock
Oct 14 08:50:48 np0005486759.ooo.test sudo[100928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:48 np0005486759.ooo.test sudo[100928]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:49 np0005486759.ooo.test sudo[100939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf5c9x0tb/privsep.sock
Oct 14 08:50:49 np0005486759.ooo.test sudo[100939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:49 np0005486759.ooo.test sudo[100939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:50 np0005486759.ooo.test sudo[100956]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkszqoent/privsep.sock
Oct 14 08:50:50 np0005486759.ooo.test sudo[100956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:50 np0005486759.ooo.test sudo[100956]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:50 np0005486759.ooo.test sudo[100967]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ggylx4j/privsep.sock
Oct 14 08:50:50 np0005486759.ooo.test sudo[100967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:51 np0005486759.ooo.test sudo[100967]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:51 np0005486759.ooo.test sudo[100978]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp16bwicwa/privsep.sock
Oct 14 08:50:51 np0005486759.ooo.test sudo[100978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: tmp-crun.Kpwief.mount: Deactivated successfully.
Oct 14 08:50:51 np0005486759.ooo.test podman[100980]: 2025-10-14 08:50:51.889167995 +0000 UTC m=+0.067706394 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, release=1, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, architecture=x86_64, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, 
com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, batch=17.1_20250721.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: tmp-crun.D1JClB.mount: Deactivated successfully.
Oct 14 08:50:51 np0005486759.ooo.test podman[100981]: 2025-10-14 08:50:51.903348987 +0000 UTC m=+0.077661923 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, build-date=2025-07-21T14:48:37, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:50:51 np0005486759.ooo.test podman[100980]: 2025-10-14 08:50:51.925125257 +0000 UTC m=+0.103663656 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, container_name=iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, release=1, tcib_managed=true)
Oct 14 08:50:51 np0005486759.ooo.test podman[100981]: 2025-10-14 08:50:51.927204392 +0000 UTC m=+0.101517328 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_id=tripleo_step5, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T14:48:37)
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:50:51 np0005486759.ooo.test podman[100987]: 2025-10-14 08:50:51.93610307 +0000 UTC m=+0.106553386 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=2, version=17.1.9, vendor=Red Hat, Inc., container_name=collectd, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd)
Oct 14 08:50:51 np0005486759.ooo.test podman[100987]: 2025-10-14 08:50:51.969313516 +0000 UTC m=+0.139763842 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, 
vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:50:51 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:50:52 np0005486759.ooo.test sudo[100978]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:52 np0005486759.ooo.test sudo[101049]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkb5ylb7h/privsep.sock
Oct 14 08:50:52 np0005486759.ooo.test sudo[101049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:53 np0005486759.ooo.test sudo[101049]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:53 np0005486759.ooo.test sudo[101060]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk5ll9vl9/privsep.sock
Oct 14 08:50:53 np0005486759.ooo.test sudo[101060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:54 np0005486759.ooo.test sudo[101060]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:50:54 np0005486759.ooo.test systemd[1]: tmp-crun.jxFpQG.mount: Deactivated successfully.
Oct 14 08:50:54 np0005486759.ooo.test podman[101064]: 2025-10-14 08:50:54.237925231 +0000 UTC m=+0.062214453 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, 
Inc., build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:50:54 np0005486759.ooo.test sudo[101094]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk0wqnm33/privsep.sock
Oct 14 08:50:54 np0005486759.ooo.test sudo[101094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:50:54 np0005486759.ooo.test systemd[1]: tmp-crun.DsIxOR.mount: Deactivated successfully.
Oct 14 08:50:54 np0005486759.ooo.test podman[101096]: 2025-10-14 08:50:54.587478607 +0000 UTC m=+0.093034114 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red 
Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:50:54 np0005486759.ooo.test podman[101064]: 2025-10-14 08:50:54.61320157 +0000 UTC m=+0.437490772 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, version=17.1.9, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_migration_target)
Oct 14 08:50:54 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:50:54 np0005486759.ooo.test podman[101096]: 2025-10-14 08:50:54.770914931 +0000 UTC m=+0.276470408 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, 
tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1)
Oct 14 08:50:54 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:50:55 np0005486759.ooo.test sudo[101094]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:55 np0005486759.ooo.test sudo[101139]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1txpspuh/privsep.sock
Oct 14 08:50:55 np0005486759.ooo.test sudo[101139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:56 np0005486759.ooo.test sudo[101139]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:56 np0005486759.ooo.test sudo[101150]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcqo9xd0y/privsep.sock
Oct 14 08:50:56 np0005486759.ooo.test sudo[101150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:56 np0005486759.ooo.test sudo[101150]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:57 np0005486759.ooo.test sudo[101161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpth9ml93w/privsep.sock
Oct 14 08:50:57 np0005486759.ooo.test sudo[101161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:57 np0005486759.ooo.test sudo[101161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:58 np0005486759.ooo.test sudo[101172]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfhxsun7x/privsep.sock
Oct 14 08:50:58 np0005486759.ooo.test sudo[101172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:58 np0005486759.ooo.test sudo[101172]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:58 np0005486759.ooo.test sudo[101183]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9s2058vm/privsep.sock
Oct 14 08:50:58 np0005486759.ooo.test sudo[101183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:50:59 np0005486759.ooo.test sudo[101183]: pam_unix(sudo:session): session closed for user root
Oct 14 08:50:59 np0005486759.ooo.test sudo[101194]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_swqes50/privsep.sock
Oct 14 08:50:59 np0005486759.ooo.test sudo[101194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:00 np0005486759.ooo.test sudo[101194]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:00 np0005486759.ooo.test sudo[101211]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx2a6_e5g/privsep.sock
Oct 14 08:51:00 np0005486759.ooo.test sudo[101211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:01 np0005486759.ooo.test sudo[101211]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:51:01 np0005486759.ooo.test podman[101221]: 2025-10-14 08:51:01.460173481 +0000 UTC m=+0.070046276 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, release=1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, architecture=x86_64)
Oct 14 08:51:01 np0005486759.ooo.test sudo[101254]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp75gndq0c/privsep.sock
Oct 14 08:51:01 np0005486759.ooo.test sudo[101254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:01 np0005486759.ooo.test podman[101218]: 2025-10-14 08:51:01.471192695 +0000 UTC m=+0.086360376 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, build-date=2025-07-21T15:29:47, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, release=1, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vendor=Red Hat, Inc.)
Oct 14 08:51:01 np0005486759.ooo.test podman[101221]: 2025-10-14 08:51:01.498154226 +0000 UTC m=+0.108027111 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, tcib_managed=true, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Oct 14 08:51:01 np0005486759.ooo.test podman[101221]: unhealthy
Oct 14 08:51:01 np0005486759.ooo.test podman[101218]: 2025-10-14 08:51:01.507136697 +0000 UTC m=+0.122304348 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, release=1, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:51:01 np0005486759.ooo.test podman[101218]: unhealthy
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:51:01 np0005486759.ooo.test podman[101220]: 2025-10-14 08:51:01.599727436 +0000 UTC m=+0.208475417 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, name=rhosp17/openstack-cron, release=1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1)
Oct 14 08:51:01 np0005486759.ooo.test podman[101220]: 2025-10-14 08:51:01.635433999 +0000 UTC m=+0.244182010 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, container_name=logrotate_crond, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:51:01 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:51:02 np0005486759.ooo.test sudo[101254]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:02 np0005486759.ooo.test sudo[101292]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbjd0m9ob/privsep.sock
Oct 14 08:51:02 np0005486759.ooo.test sudo[101292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:02 np0005486759.ooo.test systemd[1]: tmp-crun.HAkMtt.mount: Deactivated successfully.
Oct 14 08:51:02 np0005486759.ooo.test sudo[101292]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:03 np0005486759.ooo.test sudo[101303]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsqbu8xwv/privsep.sock
Oct 14 08:51:03 np0005486759.ooo.test sudo[101303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:03 np0005486759.ooo.test sudo[101303]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:04 np0005486759.ooo.test sudo[101314]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkcps7k8_/privsep.sock
Oct 14 08:51:04 np0005486759.ooo.test sudo[101314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:04 np0005486759.ooo.test sudo[101314]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:04 np0005486759.ooo.test sudo[101325]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsip_m76s/privsep.sock
Oct 14 08:51:04 np0005486759.ooo.test sudo[101325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:05 np0005486759.ooo.test sudo[101325]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:05 np0005486759.ooo.test sudo[101342]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzxakquvu/privsep.sock
Oct 14 08:51:05 np0005486759.ooo.test sudo[101342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:06 np0005486759.ooo.test sudo[101342]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:06 np0005486759.ooo.test sudo[101353]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo5wz23t5/privsep.sock
Oct 14 08:51:06 np0005486759.ooo.test sudo[101353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:07 np0005486759.ooo.test sudo[101353]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:07 np0005486759.ooo.test sudo[101376]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3qy62_8e/privsep.sock
Oct 14 08:51:07 np0005486759.ooo.test sudo[101376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:51:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:51:07 np0005486759.ooo.test systemd[1]: tmp-crun.45JTet.mount: Deactivated successfully.
Oct 14 08:51:07 np0005486759.ooo.test systemd[1]: tmp-crun.197rCw.mount: Deactivated successfully.
Oct 14 08:51:07 np0005486759.ooo.test podman[101378]: 2025-10-14 08:51:07.675166832 +0000 UTC m=+0.084771006 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, version=17.1.9, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:51:07 np0005486759.ooo.test podman[101378]: 2025-10-14 08:51:07.69557953 +0000 UTC m=+0.105183684 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, release=1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 08:51:07 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:51:07 np0005486759.ooo.test podman[101379]: 2025-10-14 08:51:07.676332889 +0000 UTC m=+0.081837155 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Oct 14 08:51:07 np0005486759.ooo.test podman[101379]: 2025-10-14 08:51:07.755586612 +0000 UTC m=+0.161090868 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, io.openshift.expose-services=, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44)
Oct 14 08:51:07 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:51:08 np0005486759.ooo.test sudo[101376]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:08 np0005486759.ooo.test sudo[101431]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6lqv4djs/privsep.sock
Oct 14 08:51:08 np0005486759.ooo.test sudo[101431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:09 np0005486759.ooo.test sudo[101431]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:09 np0005486759.ooo.test sudo[101442]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6nf9iepm/privsep.sock
Oct 14 08:51:09 np0005486759.ooo.test sudo[101442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:10 np0005486759.ooo.test sudo[101442]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:10 np0005486759.ooo.test sudo[101453]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfnth06oo/privsep.sock
Oct 14 08:51:10 np0005486759.ooo.test sudo[101453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:10 np0005486759.ooo.test sudo[101453]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:11 np0005486759.ooo.test sudo[101470]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6nkk7wzx/privsep.sock
Oct 14 08:51:11 np0005486759.ooo.test sudo[101470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:11 np0005486759.ooo.test sudo[101470]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:12 np0005486759.ooo.test sudo[101481]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbkem4k29/privsep.sock
Oct 14 08:51:12 np0005486759.ooo.test sudo[101481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:12 np0005486759.ooo.test sudo[101481]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:12 np0005486759.ooo.test sudo[101492]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp48m8tvgg/privsep.sock
Oct 14 08:51:12 np0005486759.ooo.test sudo[101492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:13 np0005486759.ooo.test sudo[101492]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:13 np0005486759.ooo.test sudo[101503]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyxocwp6e/privsep.sock
Oct 14 08:51:13 np0005486759.ooo.test sudo[101503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:14 np0005486759.ooo.test sudo[101503]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:14 np0005486759.ooo.test sudo[101514]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk6ktblkf/privsep.sock
Oct 14 08:51:14 np0005486759.ooo.test sudo[101514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:15 np0005486759.ooo.test sudo[101514]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:15 np0005486759.ooo.test sudo[101525]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6qusnnv9/privsep.sock
Oct 14 08:51:15 np0005486759.ooo.test sudo[101525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:16 np0005486759.ooo.test sudo[101525]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:16 np0005486759.ooo.test sudo[101541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmow0xkm6/privsep.sock
Oct 14 08:51:16 np0005486759.ooo.test sudo[101541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:16 np0005486759.ooo.test sudo[101541]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:17 np0005486759.ooo.test sudo[101553]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxyqw9h5c/privsep.sock
Oct 14 08:51:17 np0005486759.ooo.test sudo[101553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:17 np0005486759.ooo.test sudo[101553]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:18 np0005486759.ooo.test sudo[101564]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxu9bt8dt/privsep.sock
Oct 14 08:51:18 np0005486759.ooo.test sudo[101564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:18 np0005486759.ooo.test sudo[101564]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:18 np0005486759.ooo.test sudo[101575]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphrl4vkmt/privsep.sock
Oct 14 08:51:18 np0005486759.ooo.test sudo[101575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:19 np0005486759.ooo.test sudo[101575]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:19 np0005486759.ooo.test sudo[101586]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp580qzv6a/privsep.sock
Oct 14 08:51:19 np0005486759.ooo.test sudo[101586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:20 np0005486759.ooo.test sudo[101586]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:20 np0005486759.ooo.test sudo[101597]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2db1iqht/privsep.sock
Oct 14 08:51:20 np0005486759.ooo.test sudo[101597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:21 np0005486759.ooo.test sudo[101597]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:21 np0005486759.ooo.test sudo[101608]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdqm8ty_o/privsep.sock
Oct 14 08:51:21 np0005486759.ooo.test sudo[101608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:22 np0005486759.ooo.test sudo[101608]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:51:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:51:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:51:22 np0005486759.ooo.test systemd[1]: tmp-crun.aPFTQy.mount: Deactivated successfully.
Oct 14 08:51:22 np0005486759.ooo.test podman[101620]: 2025-10-14 08:51:22.15002076 +0000 UTC m=+0.071713698 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, architecture=x86_64, version=17.1.9, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:51:22 np0005486759.ooo.test podman[101620]: 2025-10-14 08:51:22.155270404 +0000 UTC m=+0.076963312 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, maintainer=OpenStack TripleO Team)
Oct 14 08:51:22 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:51:22 np0005486759.ooo.test podman[101622]: 2025-10-14 08:51:22.215427161 +0000 UTC m=+0.126952972 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=2, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:51:22 np0005486759.ooo.test podman[101622]: 2025-10-14 08:51:22.250395692 +0000 UTC m=+0.161921453 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=)
Oct 14 08:51:22 np0005486759.ooo.test podman[101621]: 2025-10-14 08:51:22.267020141 +0000 UTC m=+0.184200779 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, version=17.1.9, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:51:22 np0005486759.ooo.test sudo[101676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw8mtg2_n/privsep.sock
Oct 14 08:51:22 np0005486759.ooo.test sudo[101676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:22 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:51:22 np0005486759.ooo.test podman[101621]: 2025-10-14 08:51:22.316390772 +0000 UTC m=+0.233571390 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-07-21T14:48:37, container_name=nova_compute)
Oct 14 08:51:22 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:51:22 np0005486759.ooo.test sudo[101676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:23 np0005486759.ooo.test sudo[101697]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8yb8790p/privsep.sock
Oct 14 08:51:23 np0005486759.ooo.test sudo[101697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:23 np0005486759.ooo.test systemd[1]: tmp-crun.1lqxc0.mount: Deactivated successfully.
Oct 14 08:51:23 np0005486759.ooo.test sudo[101697]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:23 np0005486759.ooo.test sudo[101708]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2227o5fn/privsep.sock
Oct 14 08:51:23 np0005486759.ooo.test sudo[101708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:24 np0005486759.ooo.test sudo[101708]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:24 np0005486759.ooo.test sudo[101719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_bwm3_z5/privsep.sock
Oct 14 08:51:24 np0005486759.ooo.test sudo[101719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:51:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:51:24 np0005486759.ooo.test podman[101722]: 2025-10-14 08:51:24.941066807 +0000 UTC m=+0.068318413 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-qdrouterd, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:51:25 np0005486759.ooo.test systemd[1]: tmp-crun.WhrSQE.mount: Deactivated successfully.
Oct 14 08:51:25 np0005486759.ooo.test podman[101721]: 2025-10-14 08:51:25.019757633 +0000 UTC m=+0.144555692 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:51:25 np0005486759.ooo.test podman[101722]: 2025-10-14 08:51:25.143518174 +0000 UTC m=+0.270769780 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, 
com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:51:25 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:51:25 np0005486759.ooo.test podman[101721]: 2025-10-14 08:51:25.359211174 +0000 UTC m=+0.484009213 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.9, build-date=2025-07-21T14:48:37, release=1, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.openshift.expose-services=)
Oct 14 08:51:25 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:51:25 np0005486759.ooo.test sudo[101719]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:25 np0005486759.ooo.test sudo[101781]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf3xcdwa6/privsep.sock
Oct 14 08:51:25 np0005486759.ooo.test sudo[101781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:26 np0005486759.ooo.test sudo[101781]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:26 np0005486759.ooo.test sudo[101792]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfrj89zc8/privsep.sock
Oct 14 08:51:26 np0005486759.ooo.test sudo[101792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:27 np0005486759.ooo.test sudo[101792]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:27 np0005486759.ooo.test sudo[101809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe6nybr06/privsep.sock
Oct 14 08:51:27 np0005486759.ooo.test sudo[101809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:28 np0005486759.ooo.test sudo[101809]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:28 np0005486759.ooo.test sudo[101820]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqm48jfb1/privsep.sock
Oct 14 08:51:28 np0005486759.ooo.test sudo[101820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:29 np0005486759.ooo.test sudo[101820]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:29 np0005486759.ooo.test sudo[101831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwr4dmu5g/privsep.sock
Oct 14 08:51:29 np0005486759.ooo.test sudo[101831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:30 np0005486759.ooo.test sudo[101831]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:30 np0005486759.ooo.test sudo[101842]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgebe34w9/privsep.sock
Oct 14 08:51:30 np0005486759.ooo.test sudo[101842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:30 np0005486759.ooo.test sudo[101842]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:31 np0005486759.ooo.test sudo[101853]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptuhjce9q/privsep.sock
Oct 14 08:51:31 np0005486759.ooo.test sudo[101853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:31 np0005486759.ooo.test sudo[101853]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: tmp-crun.27Y2Hm.mount: Deactivated successfully.
Oct 14 08:51:31 np0005486759.ooo.test podman[101861]: 2025-10-14 08:51:31.865930669 +0000 UTC m=+0.056579968 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:51:31 np0005486759.ooo.test podman[101860]: 2025-10-14 08:51:31.878043366 +0000 UTC m=+0.067744184 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:51:31 np0005486759.ooo.test podman[101861]: 2025-10-14 08:51:31.883347082 +0000 UTC m=+0.073996421 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:51:31 np0005486759.ooo.test podman[101861]: unhealthy
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:51:31 np0005486759.ooo.test podman[101860]: 2025-10-14 08:51:31.915219396 +0000 UTC m=+0.104920134 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, release=1, config_id=tripleo_step4)
Oct 14 08:51:31 np0005486759.ooo.test podman[101860]: unhealthy
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:51:31 np0005486759.ooo.test podman[101859]: 2025-10-14 08:51:31.959786517 +0000 UTC m=+0.151434666 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_id=tripleo_step4, release=1, tcib_managed=true, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:51:31 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:51:32 np0005486759.ooo.test recover_tripleo_nova_virtqemud[101916]: 47951
Oct 14 08:51:32 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:51:32 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:51:32 np0005486759.ooo.test podman[101859]: 2025-10-14 08:51:32.007568127 +0000 UTC m=+0.199216276 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, distribution-scope=public, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:51:32 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:51:32 np0005486759.ooo.test sudo[101921]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp97lqnmm0/privsep.sock
Oct 14 08:51:32 np0005486759.ooo.test sudo[101921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:32 np0005486759.ooo.test sudo[101921]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:32 np0005486759.ooo.test sudo[101938]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp16w68jvf/privsep.sock
Oct 14 08:51:32 np0005486759.ooo.test sudo[101938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:33 np0005486759.ooo.test sudo[101938]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:33 np0005486759.ooo.test sudo[101949]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5bimzt9z/privsep.sock
Oct 14 08:51:33 np0005486759.ooo.test sudo[101949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:34 np0005486759.ooo.test sudo[101949]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:34 np0005486759.ooo.test sudo[101960]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpymfba8cf/privsep.sock
Oct 14 08:51:34 np0005486759.ooo.test sudo[101960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:35 np0005486759.ooo.test sudo[101960]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:35 np0005486759.ooo.test sudo[101971]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbqtvgjwu/privsep.sock
Oct 14 08:51:35 np0005486759.ooo.test sudo[101971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:36 np0005486759.ooo.test sudo[101971]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:36 np0005486759.ooo.test sudo[101982]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj9n7cdua/privsep.sock
Oct 14 08:51:36 np0005486759.ooo.test sudo[101982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:37 np0005486759.ooo.test sudo[101982]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:37 np0005486759.ooo.test sudo[101993]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp00bz4i2o/privsep.sock
Oct 14 08:51:37 np0005486759.ooo.test sudo[101993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:37 np0005486759.ooo.test sudo[101993]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:51:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:51:38 np0005486759.ooo.test systemd[1]: tmp-crun.oMaoUt.mount: Deactivated successfully.
Oct 14 08:51:38 np0005486759.ooo.test podman[102005]: 2025-10-14 08:51:38.072658572 +0000 UTC m=+0.064335459 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.9, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, io.openshift.expose-services=)
Oct 14 08:51:38 np0005486759.ooo.test podman[102005]: 2025-10-14 08:51:38.088246219 +0000 UTC m=+0.079923076 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.33.12, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller)
Oct 14 08:51:38 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:51:38 np0005486759.ooo.test podman[102004]: 2025-10-14 08:51:38.164992553 +0000 UTC m=+0.159001813 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-type=git, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12)
Oct 14 08:51:38 np0005486759.ooo.test sudo[102056]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprcrgb6o1/privsep.sock
Oct 14 08:51:38 np0005486759.ooo.test sudo[102056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:38 np0005486759.ooo.test podman[102004]: 2025-10-14 08:51:38.220317019 +0000 UTC m=+0.214326309 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-type=git, release=1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:51:38 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:51:38 np0005486759.ooo.test sudo[102056]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:38 np0005486759.ooo.test sudo[102067]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt2692qz6/privsep.sock
Oct 14 08:51:38 np0005486759.ooo.test sudo[102067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:39 np0005486759.ooo.test sudo[102067]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:39 np0005486759.ooo.test sudo[102078]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5yrz2wbf/privsep.sock
Oct 14 08:51:39 np0005486759.ooo.test sudo[102078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:40 np0005486759.ooo.test sudo[102078]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:40 np0005486759.ooo.test sudo[102089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp16xxal1/privsep.sock
Oct 14 08:51:40 np0005486759.ooo.test sudo[102089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:41 np0005486759.ooo.test sudo[102089]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:41 np0005486759.ooo.test sudo[102100]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpafki_64x/privsep.sock
Oct 14 08:51:41 np0005486759.ooo.test sudo[102100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:42 np0005486759.ooo.test sudo[102100]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:42 np0005486759.ooo.test sudo[102111]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6_5oygq5/privsep.sock
Oct 14 08:51:42 np0005486759.ooo.test sudo[102111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:43 np0005486759.ooo.test sudo[102111]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:43 np0005486759.ooo.test sudo[102127]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmperjyql39/privsep.sock
Oct 14 08:51:43 np0005486759.ooo.test sudo[102127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:43 np0005486759.ooo.test sudo[102127]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:44 np0005486759.ooo.test sudo[102139]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpekkwytzm/privsep.sock
Oct 14 08:51:44 np0005486759.ooo.test sudo[102139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:44 np0005486759.ooo.test sudo[102139]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:45 np0005486759.ooo.test sudo[102150]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7yolf2q5/privsep.sock
Oct 14 08:51:45 np0005486759.ooo.test sudo[102150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:45 np0005486759.ooo.test sudo[102150]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:45 np0005486759.ooo.test sudo[102161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi2ma7chn/privsep.sock
Oct 14 08:51:45 np0005486759.ooo.test sudo[102161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:46 np0005486759.ooo.test sudo[102161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:46 np0005486759.ooo.test sudo[102172]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfznb5jxd/privsep.sock
Oct 14 08:51:46 np0005486759.ooo.test sudo[102172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:47 np0005486759.ooo.test sudo[102172]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:47 np0005486759.ooo.test sudo[102183]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyilelv64/privsep.sock
Oct 14 08:51:47 np0005486759.ooo.test sudo[102183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:48 np0005486759.ooo.test sudo[102183]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:48 np0005486759.ooo.test sudo[102196]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpufpczn52/privsep.sock
Oct 14 08:51:48 np0005486759.ooo.test sudo[102196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:49 np0005486759.ooo.test sudo[102196]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:49 np0005486759.ooo.test sudo[102211]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4wvilstz/privsep.sock
Oct 14 08:51:49 np0005486759.ooo.test sudo[102211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:50 np0005486759.ooo.test sudo[102211]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:50 np0005486759.ooo.test sudo[102222]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpboefqidq/privsep.sock
Oct 14 08:51:50 np0005486759.ooo.test sudo[102222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:50 np0005486759.ooo.test sudo[102222]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:51 np0005486759.ooo.test sudo[102233]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpswdicx1e/privsep.sock
Oct 14 08:51:51 np0005486759.ooo.test sudo[102233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:51 np0005486759.ooo.test sudo[102233]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:51 np0005486759.ooo.test sudo[102244]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwr2_ne2m/privsep.sock
Oct 14 08:51:51 np0005486759.ooo.test sudo[102244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:51:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:51:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:51:52 np0005486759.ooo.test podman[102247]: 2025-10-14 08:51:52.451180064 +0000 UTC m=+0.077155799 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, distribution-scope=public, build-date=2025-07-21T13:27:15, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1)
Oct 14 08:51:52 np0005486759.ooo.test podman[102247]: 2025-10-14 08:51:52.464388086 +0000 UTC m=+0.090363851 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:51:52 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:51:52 np0005486759.ooo.test systemd[1]: tmp-crun.Qi3Nft.mount: Deactivated successfully.
Oct 14 08:51:52 np0005486759.ooo.test podman[102248]: 2025-10-14 08:51:52.512229059 +0000 UTC m=+0.133640781 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 08:51:52 np0005486759.ooo.test podman[102248]: 2025-10-14 08:51:52.537844838 +0000 UTC m=+0.159256570 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc.)
Oct 14 08:51:52 np0005486759.ooo.test sudo[102244]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:52 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:51:52 np0005486759.ooo.test podman[102249]: 2025-10-14 08:51:52.608101681 +0000 UTC m=+0.226745246 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, build-date=2025-07-21T13:04:03, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container)
Oct 14 08:51:52 np0005486759.ooo.test podman[102249]: 2025-10-14 08:51:52.616347718 +0000 UTC m=+0.234991283 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, vcs-type=git, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 08:51:52 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:51:52 np0005486759.ooo.test sudo[102320]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpni1uxdvg/privsep.sock
Oct 14 08:51:52 np0005486759.ooo.test sudo[102320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:53 np0005486759.ooo.test sudo[102320]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:53 np0005486759.ooo.test systemd[1]: tmp-crun.pAYqqI.mount: Deactivated successfully.
Oct 14 08:51:53 np0005486759.ooo.test sudo[102331]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg8rd4rbj/privsep.sock
Oct 14 08:51:53 np0005486759.ooo.test sudo[102331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:54 np0005486759.ooo.test sudo[102331]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:54 np0005486759.ooo.test sudo[102348]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7q2v0iar/privsep.sock
Oct 14 08:51:54 np0005486759.ooo.test sudo[102348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:55 np0005486759.ooo.test sudo[102348]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:51:55 np0005486759.ooo.test sudo[102360]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl3nb57qf/privsep.sock
Oct 14 08:51:55 np0005486759.ooo.test sudo[102360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:51:55 np0005486759.ooo.test podman[102358]: 2025-10-14 08:51:55.451659686 +0000 UTC m=+0.077531181 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd)
Oct 14 08:51:55 np0005486759.ooo.test systemd[1]: tmp-crun.6H2MCU.mount: Deactivated successfully.
Oct 14 08:51:55 np0005486759.ooo.test podman[102366]: 2025-10-14 08:51:55.509313984 +0000 UTC m=+0.115632629 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 
17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Oct 14 08:51:55 np0005486759.ooo.test podman[102358]: 2025-10-14 08:51:55.655551128 +0000 UTC m=+0.281422643 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, name=rhosp17/openstack-qdrouterd, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 08:51:55 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:51:55 np0005486759.ooo.test podman[102366]: 2025-10-14 08:51:55.830478476 +0000 UTC m=+0.436797131 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, version=17.1.9, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team)
Oct 14 08:51:55 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:51:55 np0005486759.ooo.test sudo[102360]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:56 np0005486759.ooo.test sudo[102422]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxabt2akw/privsep.sock
Oct 14 08:51:56 np0005486759.ooo.test sudo[102422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:56 np0005486759.ooo.test sudo[102422]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:57 np0005486759.ooo.test sudo[102433]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1aml28ey/privsep.sock
Oct 14 08:51:57 np0005486759.ooo.test sudo[102433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:57 np0005486759.ooo.test sudo[102433]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:58 np0005486759.ooo.test sudo[102444]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9p7hy6q4/privsep.sock
Oct 14 08:51:58 np0005486759.ooo.test sudo[102444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:58 np0005486759.ooo.test sudo[102444]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:58 np0005486759.ooo.test sudo[102455]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8jxmlg8x/privsep.sock
Oct 14 08:51:58 np0005486759.ooo.test sudo[102455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:51:59 np0005486759.ooo.test sudo[102455]: pam_unix(sudo:session): session closed for user root
Oct 14 08:51:59 np0005486759.ooo.test sudo[102472]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb57d_pn0/privsep.sock
Oct 14 08:51:59 np0005486759.ooo.test sudo[102472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:00 np0005486759.ooo.test sudo[102472]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:00 np0005486759.ooo.test sudo[102483]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg1b6i_v5/privsep.sock
Oct 14 08:52:00 np0005486759.ooo.test sudo[102483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:01 np0005486759.ooo.test sudo[102483]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:01 np0005486759.ooo.test sudo[102494]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm9c4noy7/privsep.sock
Oct 14 08:52:01 np0005486759.ooo.test sudo[102494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:01 np0005486759.ooo.test sudo[102494]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:52:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:52:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:52:02 np0005486759.ooo.test podman[102500]: 2025-10-14 08:52:02.055063186 +0000 UTC m=+0.094176469 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, 
config_id=tripleo_step4, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, build-date=2025-07-21T14:45:33, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.)
Oct 14 08:52:02 np0005486759.ooo.test podman[102501]: 2025-10-14 08:52:02.021082186 +0000 UTC m=+0.059412854 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, release=1, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:52:02 np0005486759.ooo.test podman[102500]: 2025-10-14 08:52:02.097298944 +0000 UTC m=+0.136412207 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64)
Oct 14 08:52:02 np0005486759.ooo.test podman[102500]: unhealthy
Oct 14 08:52:02 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:52:02 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:52:02 np0005486759.ooo.test sudo[102552]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplfw92ye4/privsep.sock
Oct 14 08:52:02 np0005486759.ooo.test sudo[102552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:02 np0005486759.ooo.test podman[102529]: 2025-10-14 08:52:02.178022873 +0000 UTC m=+0.126018513 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, batch=17.1_20250721.1, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, name=rhosp17/openstack-cron, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:52:02 np0005486759.ooo.test podman[102529]: 2025-10-14 08:52:02.183231086 +0000 UTC m=+0.131226686 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:52:02 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:52:02 np0005486759.ooo.test podman[102501]: 2025-10-14 08:52:02.200887797 +0000 UTC m=+0.239218465 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_ipmi, tcib_managed=true, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 08:52:02 np0005486759.ooo.test podman[102501]: unhealthy
Oct 14 08:52:02 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:52:02 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:52:02 np0005486759.ooo.test sudo[102552]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:03 np0005486759.ooo.test sudo[102569]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbb173nz4/privsep.sock
Oct 14 08:52:03 np0005486759.ooo.test sudo[102569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:03 np0005486759.ooo.test systemd[1]: tmp-crun.zVsP7M.mount: Deactivated successfully.
Oct 14 08:52:03 np0005486759.ooo.test sudo[102569]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:03 np0005486759.ooo.test sudo[102580]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsczcmms8/privsep.sock
Oct 14 08:52:03 np0005486759.ooo.test sudo[102580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:04 np0005486759.ooo.test sudo[102580]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:04 np0005486759.ooo.test sudo[102593]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyrke6g99/privsep.sock
Oct 14 08:52:04 np0005486759.ooo.test sudo[102593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:05 np0005486759.ooo.test sudo[102593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:05 np0005486759.ooo.test sudo[102608]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgt_k3l9o/privsep.sock
Oct 14 08:52:05 np0005486759.ooo.test sudo[102608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:06 np0005486759.ooo.test sudo[102608]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:06 np0005486759.ooo.test sudo[102619]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgbkkj0in/privsep.sock
Oct 14 08:52:06 np0005486759.ooo.test sudo[102619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:06 np0005486759.ooo.test sudo[102619]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:07 np0005486759.ooo.test sudo[102630]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr6arndp0/privsep.sock
Oct 14 08:52:07 np0005486759.ooo.test sudo[102630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:07 np0005486759.ooo.test sudo[102630]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:08 np0005486759.ooo.test sudo[102653]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy80e7tgk/privsep.sock
Oct 14 08:52:08 np0005486759.ooo.test sudo[102653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:52:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:52:08 np0005486759.ooo.test podman[102656]: 2025-10-14 08:52:08.454919557 +0000 UTC m=+0.079754300 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-07-21T16:28:53, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:52:08 np0005486759.ooo.test podman[102657]: 2025-10-14 08:52:08.430179755 +0000 UTC m=+0.057205257 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., architecture=x86_64, release=1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-07-21T13:28:44)
Oct 14 08:52:08 np0005486759.ooo.test podman[102656]: 2025-10-14 08:52:08.507255409 +0000 UTC m=+0.132090112 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, distribution-scope=public, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.9)
Oct 14 08:52:08 np0005486759.ooo.test podman[102657]: 2025-10-14 08:52:08.512319357 +0000 UTC m=+0.139344859 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12)
Oct 14 08:52:08 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:52:08 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:52:08 np0005486759.ooo.test sudo[102653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:08 np0005486759.ooo.test sudo[102710]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo268osfa/privsep.sock
Oct 14 08:52:08 np0005486759.ooo.test sudo[102710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:09 np0005486759.ooo.test sudo[102710]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:09 np0005486759.ooo.test sudo[102721]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdfaiz66_/privsep.sock
Oct 14 08:52:09 np0005486759.ooo.test sudo[102721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:10 np0005486759.ooo.test sudo[102721]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:10 np0005486759.ooo.test sudo[102738]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkurpnkwh/privsep.sock
Oct 14 08:52:10 np0005486759.ooo.test sudo[102738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:11 np0005486759.ooo.test sudo[102738]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:11 np0005486759.ooo.test sudo[102749]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg8h6ised/privsep.sock
Oct 14 08:52:11 np0005486759.ooo.test sudo[102749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:12 np0005486759.ooo.test sudo[102749]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:12 np0005486759.ooo.test sudo[102760]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7pcqwt1t/privsep.sock
Oct 14 08:52:12 np0005486759.ooo.test sudo[102760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:12 np0005486759.ooo.test sudo[102760]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:13 np0005486759.ooo.test sudo[102771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaufylruz/privsep.sock
Oct 14 08:52:13 np0005486759.ooo.test sudo[102771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:13 np0005486759.ooo.test sudo[102771]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:14 np0005486759.ooo.test sudo[102782]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfw39rao0/privsep.sock
Oct 14 08:52:14 np0005486759.ooo.test sudo[102782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:14 np0005486759.ooo.test sudo[102782]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:14 np0005486759.ooo.test sudo[102793]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv711t0t4/privsep.sock
Oct 14 08:52:14 np0005486759.ooo.test sudo[102793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:15 np0005486759.ooo.test sudo[102793]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:15 np0005486759.ooo.test sudo[102810]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpab6_ikkp/privsep.sock
Oct 14 08:52:15 np0005486759.ooo.test sudo[102810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:16 np0005486759.ooo.test sudo[102810]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:16 np0005486759.ooo.test sudo[102821]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp55c2jeh0/privsep.sock
Oct 14 08:52:16 np0005486759.ooo.test sudo[102821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:17 np0005486759.ooo.test sudo[102821]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:17 np0005486759.ooo.test sudo[102832]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp91mdi28k/privsep.sock
Oct 14 08:52:17 np0005486759.ooo.test sudo[102832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:18 np0005486759.ooo.test sudo[102832]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:18 np0005486759.ooo.test sudo[102843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbmz_f_43/privsep.sock
Oct 14 08:52:18 np0005486759.ooo.test sudo[102843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:19 np0005486759.ooo.test sudo[102843]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:19 np0005486759.ooo.test sudo[102854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9i80y1b_/privsep.sock
Oct 14 08:52:19 np0005486759.ooo.test sudo[102854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:19 np0005486759.ooo.test sudo[102854]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:20 np0005486759.ooo.test sudo[102865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp674fwrl_/privsep.sock
Oct 14 08:52:20 np0005486759.ooo.test sudo[102865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:20 np0005486759.ooo.test sudo[102865]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:20 np0005486759.ooo.test sudo[102878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9tytrkhv/privsep.sock
Oct 14 08:52:20 np0005486759.ooo.test sudo[102878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:21 np0005486759.ooo.test sudo[102878]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:21 np0005486759.ooo.test sudo[102893]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3opm14np/privsep.sock
Oct 14 08:52:21 np0005486759.ooo.test sudo[102893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:22 np0005486759.ooo.test sudo[102893]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:52:22 np0005486759.ooo.test systemd[1]: tmp-crun.0OU4Bj.mount: Deactivated successfully.
Oct 14 08:52:22 np0005486759.ooo.test podman[102898]: 2025-10-14 08:52:22.607288602 +0000 UTC m=+0.093227860 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Oct 14 08:52:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:52:22 np0005486759.ooo.test podman[102898]: 2025-10-14 08:52:22.656324012 +0000 UTC m=+0.142263280 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=iscsid)
Oct 14 08:52:22 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:52:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:52:22 np0005486759.ooo.test podman[102917]: 2025-10-14 08:52:22.722326781 +0000 UTC m=+0.090103992 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, version=17.1.9, config_id=tripleo_step5, architecture=x86_64, container_name=nova_compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc.)
Oct 14 08:52:22 np0005486759.ooo.test podman[102917]: 2025-10-14 08:52:22.77485344 +0000 UTC m=+0.142630651 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, tcib_managed=true, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, container_name=nova_compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:52:22 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:52:22 np0005486759.ooo.test sudo[102967]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpba8ybd2q/privsep.sock
Oct 14 08:52:22 np0005486759.ooo.test sudo[102967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:22 np0005486759.ooo.test podman[102930]: 2025-10-14 08:52:22.776841123 +0000 UTC m=+0.077331335 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, container_name=collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:52:22 np0005486759.ooo.test podman[102930]: 2025-10-14 08:52:22.861267607 +0000 UTC m=+0.161757809 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible)
Oct 14 08:52:22 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:52:23 np0005486759.ooo.test sudo[102967]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:23 np0005486759.ooo.test sudo[102978]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8hto7rdt/privsep.sock
Oct 14 08:52:23 np0005486759.ooo.test sudo[102978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:24 np0005486759.ooo.test sudo[102978]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:24 np0005486759.ooo.test sudo[102989]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1nalxciv/privsep.sock
Oct 14 08:52:24 np0005486759.ooo.test sudo[102989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:25 np0005486759.ooo.test sudo[102989]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:25 np0005486759.ooo.test sudo[103000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9okd9me2/privsep.sock
Oct 14 08:52:25 np0005486759.ooo.test sudo[103000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:25 np0005486759.ooo.test sudo[103000]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:52:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:52:26 np0005486759.ooo.test podman[103007]: 2025-10-14 08:52:26.09574329 +0000 UTC m=+0.065553017 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64)
Oct 14 08:52:26 np0005486759.ooo.test podman[103006]: 2025-10-14 08:52:26.157184687 +0000 UTC m=+0.128388447 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, release=1)
Oct 14 08:52:26 np0005486759.ooo.test sudo[103058]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmputjxmc1_/privsep.sock
Oct 14 08:52:26 np0005486759.ooo.test sudo[103058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:26 np0005486759.ooo.test podman[103007]: 2025-10-14 08:52:26.308496348 +0000 UTC m=+0.278306075 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1)
Oct 14 08:52:26 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:52:26 np0005486759.ooo.test podman[103006]: 2025-10-14 08:52:26.485330646 +0000 UTC m=+0.456534426 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, architecture=x86_64, tcib_managed=true, version=17.1.9)
Oct 14 08:52:26 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:52:26 np0005486759.ooo.test sudo[103058]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:27 np0005486759.ooo.test sudo[103076]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpego_l48s/privsep.sock
Oct 14 08:52:27 np0005486759.ooo.test sudo[103076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:27 np0005486759.ooo.test sudo[103076]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:28 np0005486759.ooo.test sudo[103087]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq68gvwak/privsep.sock
Oct 14 08:52:28 np0005486759.ooo.test sudo[103087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:28 np0005486759.ooo.test sudo[103087]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:28 np0005486759.ooo.test sudo[103098]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkmgjl9qm/privsep.sock
Oct 14 08:52:28 np0005486759.ooo.test sudo[103098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:29 np0005486759.ooo.test sudo[103098]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:29 np0005486759.ooo.test sudo[103109]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5bazc7jt/privsep.sock
Oct 14 08:52:29 np0005486759.ooo.test sudo[103109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:30 np0005486759.ooo.test sudo[103109]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:30 np0005486759.ooo.test sudo[103120]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp34_24akt/privsep.sock
Oct 14 08:52:30 np0005486759.ooo.test sudo[103120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:31 np0005486759.ooo.test sudo[103120]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:31 np0005486759.ooo.test sudo[103131]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmr2yalr4/privsep.sock
Oct 14 08:52:31 np0005486759.ooo.test sudo[103131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:32 np0005486759.ooo.test sudo[103131]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:32 np0005486759.ooo.test sudo[103148]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3f9g0lvr/privsep.sock
Oct 14 08:52:32 np0005486759.ooo.test sudo[103148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:52:32 np0005486759.ooo.test podman[103150]: 2025-10-14 08:52:32.472175039 +0000 UTC m=+0.093259081 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, container_name=logrotate_crond, io.buildah.version=1.33.12, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:52:32 np0005486759.ooo.test podman[103150]: 2025-10-14 08:52:32.501097151 +0000 UTC m=+0.122181183 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=logrotate_crond, version=17.1.9)
Oct 14 08:52:32 np0005486759.ooo.test podman[103151]: 2025-10-14 08:52:32.515606273 +0000 UTC m=+0.131262257 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, 
io.openshift.expose-services=, vcs-type=git, version=17.1.9, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, tcib_managed=true, build-date=2025-07-21T14:45:33, release=1)
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:52:32 np0005486759.ooo.test podman[103151]: 2025-10-14 08:52:32.522430757 +0000 UTC m=+0.138086791 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:52:32 np0005486759.ooo.test podman[103151]: unhealthy
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:52:32 np0005486759.ooo.test podman[103152]: 2025-10-14 08:52:32.563461556 +0000 UTC m=+0.176409525 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:52:32 np0005486759.ooo.test podman[103152]: 2025-10-14 08:52:32.579390954 +0000 UTC m=+0.192338983 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, 
name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12)
Oct 14 08:52:32 np0005486759.ooo.test podman[103152]: unhealthy
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:52:32 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:52:32 np0005486759.ooo.test sudo[103148]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:33 np0005486759.ooo.test sudo[103215]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx2f726ij/privsep.sock
Oct 14 08:52:33 np0005486759.ooo.test sudo[103215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:33 np0005486759.ooo.test sudo[103215]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:34 np0005486759.ooo.test sudo[103226]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzigocwrk/privsep.sock
Oct 14 08:52:34 np0005486759.ooo.test sudo[103226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:34 np0005486759.ooo.test sudo[103226]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:34 np0005486759.ooo.test sudo[103237]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpek8uxg3z/privsep.sock
Oct 14 08:52:34 np0005486759.ooo.test sudo[103237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:35 np0005486759.ooo.test sudo[103237]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:35 np0005486759.ooo.test sudo[103248]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpofuyiskw/privsep.sock
Oct 14 08:52:35 np0005486759.ooo.test sudo[103248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:36 np0005486759.ooo.test sudo[103248]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:36 np0005486759.ooo.test sudo[103259]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzcbza7ux/privsep.sock
Oct 14 08:52:36 np0005486759.ooo.test sudo[103259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:37 np0005486759.ooo.test sudo[103259]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:37 np0005486759.ooo.test sudo[103276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2hdq0so0/privsep.sock
Oct 14 08:52:37 np0005486759.ooo.test sudo[103276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:38 np0005486759.ooo.test sudo[103276]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:38 np0005486759.ooo.test sudo[103287]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpft73sflb/privsep.sock
Oct 14 08:52:38 np0005486759.ooo.test sudo[103287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:38 np0005486759.ooo.test sudo[103287]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:52:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:52:38 np0005486759.ooo.test podman[103294]: 2025-10-14 08:52:38.962645156 +0000 UTC m=+0.052548801 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, release=1, batch=17.1_20250721.1, container_name=ovn_controller, distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 08:52:38 np0005486759.ooo.test podman[103294]: 2025-10-14 08:52:38.985276702 +0000 UTC m=+0.075180357 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller)
Oct 14 08:52:38 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:52:39 np0005486759.ooo.test podman[103293]: 2025-10-14 08:52:39.071026357 +0000 UTC m=+0.163332047 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team)
Oct 14 08:52:39 np0005486759.ooo.test sudo[103347]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy6n3w8n9/privsep.sock
Oct 14 08:52:39 np0005486759.ooo.test sudo[103347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:39 np0005486759.ooo.test podman[103293]: 2025-10-14 08:52:39.133309031 +0000 UTC m=+0.225614741 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vendor=Red Hat, Inc., release=1, io.k8s.description=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible)
Oct 14 08:52:39 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:52:39 np0005486759.ooo.test sudo[103347]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:40 np0005486759.ooo.test sudo[103358]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3_ysh15m/privsep.sock
Oct 14 08:52:40 np0005486759.ooo.test sudo[103358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:40 np0005486759.ooo.test sudo[103358]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:40 np0005486759.ooo.test sudo[103369]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmz_dwd08/privsep.sock
Oct 14 08:52:40 np0005486759.ooo.test sudo[103369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:41 np0005486759.ooo.test sudo[103369]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:41 np0005486759.ooo.test sudo[103380]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprh4gtfgt/privsep.sock
Oct 14 08:52:41 np0005486759.ooo.test sudo[103380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:42 np0005486759.ooo.test sudo[103380]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:42 np0005486759.ooo.test sudo[103391]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6ms8czh5/privsep.sock
Oct 14 08:52:42 np0005486759.ooo.test sudo[103391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:42 np0005486759.ooo.test sudo[103391]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:43 np0005486759.ooo.test sudo[103408]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfh5uojcj/privsep.sock
Oct 14 08:52:43 np0005486759.ooo.test sudo[103408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:43 np0005486759.ooo.test sudo[103408]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:44 np0005486759.ooo.test sudo[103419]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmdzz4lmy/privsep.sock
Oct 14 08:52:44 np0005486759.ooo.test sudo[103419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:44 np0005486759.ooo.test sudo[103419]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:44 np0005486759.ooo.test sudo[103430]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsgqul_hb/privsep.sock
Oct 14 08:52:44 np0005486759.ooo.test sudo[103430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:45 np0005486759.ooo.test sudo[103430]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:45 np0005486759.ooo.test sudo[103441]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnz8ngk7p/privsep.sock
Oct 14 08:52:45 np0005486759.ooo.test sudo[103441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:46 np0005486759.ooo.test sudo[103441]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:46 np0005486759.ooo.test sudo[103452]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj0ryij0o/privsep.sock
Oct 14 08:52:46 np0005486759.ooo.test sudo[103452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:47 np0005486759.ooo.test sudo[103452]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:47 np0005486759.ooo.test sudo[103463]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp902s3yj6/privsep.sock
Oct 14 08:52:47 np0005486759.ooo.test sudo[103463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:48 np0005486759.ooo.test sudo[103463]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:48 np0005486759.ooo.test sudo[103480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz5tdfk_o/privsep.sock
Oct 14 08:52:48 np0005486759.ooo.test sudo[103480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:49 np0005486759.ooo.test sudo[103480]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:49 np0005486759.ooo.test sudo[103491]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzukpahgv/privsep.sock
Oct 14 08:52:49 np0005486759.ooo.test sudo[103491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:50 np0005486759.ooo.test sudo[103491]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:50 np0005486759.ooo.test sudo[103502]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3t8qo72p/privsep.sock
Oct 14 08:52:50 np0005486759.ooo.test sudo[103502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:50 np0005486759.ooo.test sudo[103502]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:51 np0005486759.ooo.test sudo[103513]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppsv0vbv5/privsep.sock
Oct 14 08:52:51 np0005486759.ooo.test sudo[103513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:51 np0005486759.ooo.test sudo[103513]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:52 np0005486759.ooo.test sudo[103524]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl8jf2c2c/privsep.sock
Oct 14 08:52:52 np0005486759.ooo.test sudo[103524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:52 np0005486759.ooo.test sudo[103524]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:52 np0005486759.ooo.test sudo[103535]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp35qm9ucl/privsep.sock
Oct 14 08:52:52 np0005486759.ooo.test sudo[103535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:52:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:52:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:52:53 np0005486759.ooo.test podman[103537]: 2025-10-14 08:52:53.053808932 +0000 UTC m=+0.086207742 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
container_name=iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:52:53 np0005486759.ooo.test podman[103537]: 2025-10-14 08:52:53.086933745 +0000 UTC m=+0.119332535 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Oct 14 08:52:53 np0005486759.ooo.test systemd[1]: tmp-crun.Yt1YXx.mount: Deactivated successfully.
Oct 14 08:52:53 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:52:53 np0005486759.ooo.test podman[103539]: 2025-10-14 08:52:53.100950543 +0000 UTC m=+0.127242852 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, container_name=collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vcs-type=git, com.redhat.component=openstack-collectd-container, release=2, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc.)
Oct 14 08:52:53 np0005486759.ooo.test podman[103539]: 2025-10-14 08:52:53.110250212 +0000 UTC m=+0.136542521 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.9, release=2, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, 
managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:52:53 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:52:53 np0005486759.ooo.test podman[103538]: 2025-10-14 08:52:53.174532328 +0000 UTC m=+0.202655684 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public)
Oct 14 08:52:53 np0005486759.ooo.test podman[103538]: 2025-10-14 08:52:53.197246177 +0000 UTC m=+0.225369533 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, release=1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, version=17.1.9, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Oct 14 08:52:53 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:52:53 np0005486759.ooo.test sudo[103535]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:53 np0005486759.ooo.test sudo[103615]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq11ks_fx/privsep.sock
Oct 14 08:52:53 np0005486759.ooo.test sudo[103615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:54 np0005486759.ooo.test sudo[103615]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:54 np0005486759.ooo.test sudo[103626]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc793_kf3/privsep.sock
Oct 14 08:52:54 np0005486759.ooo.test sudo[103626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:55 np0005486759.ooo.test sudo[103626]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:55 np0005486759.ooo.test sudo[103637]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5rzh2480/privsep.sock
Oct 14 08:52:55 np0005486759.ooo.test sudo[103637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:56 np0005486759.ooo.test sudo[103637]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:52:56 np0005486759.ooo.test sudo[103657]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpss9bdo4f/privsep.sock
Oct 14 08:52:56 np0005486759.ooo.test sudo[103657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:56 np0005486759.ooo.test podman[103645]: 2025-10-14 08:52:56.448858935 +0000 UTC m=+0.076347644 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, release=1, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_id=tripleo_step1)
Oct 14 08:52:56 np0005486759.ooo.test podman[103645]: 2025-10-14 08:52:56.623270507 +0000 UTC m=+0.250759166 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, container_name=metrics_qdr, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:52:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:52:56 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:52:56 np0005486759.ooo.test podman[103680]: 2025-10-14 08:52:56.731931927 +0000 UTC m=+0.078273203 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:52:57 np0005486759.ooo.test sudo[103657]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:57 np0005486759.ooo.test podman[103680]: 2025-10-14 08:52:57.074709252 +0000 UTC m=+0.421050558 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:52:57 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:52:57 np0005486759.ooo.test sudo[103710]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpccumpulu/privsep.sock
Oct 14 08:52:57 np0005486759.ooo.test sudo[103710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:57 np0005486759.ooo.test sudo[103710]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:58 np0005486759.ooo.test sudo[103721]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp56pc2qzm/privsep.sock
Oct 14 08:52:58 np0005486759.ooo.test sudo[103721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:58 np0005486759.ooo.test sudo[103721]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:59 np0005486759.ooo.test sudo[103737]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzctie7ac/privsep.sock
Oct 14 08:52:59 np0005486759.ooo.test sudo[103737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:52:59 np0005486759.ooo.test sudo[103737]: pam_unix(sudo:session): session closed for user root
Oct 14 08:52:59 np0005486759.ooo.test sudo[103749]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2jo5t9ax/privsep.sock
Oct 14 08:52:59 np0005486759.ooo.test sudo[103749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:00 np0005486759.ooo.test sudo[103749]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:00 np0005486759.ooo.test sudo[103760]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe7xw4z__/privsep.sock
Oct 14 08:53:00 np0005486759.ooo.test sudo[103760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:01 np0005486759.ooo.test sudo[103760]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:01 np0005486759.ooo.test sudo[103771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptbkp040o/privsep.sock
Oct 14 08:53:01 np0005486759.ooo.test sudo[103771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:02 np0005486759.ooo.test sudo[103771]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:02 np0005486759.ooo.test sudo[103782]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_2u845u9/privsep.sock
Oct 14 08:53:02 np0005486759.ooo.test sudo[103782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:02 np0005486759.ooo.test sudo[103782]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:53:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:53:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:53:03 np0005486759.ooo.test podman[103788]: 2025-10-14 08:53:03.077528804 +0000 UTC m=+0.080077442 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.9, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52)
Oct 14 08:53:03 np0005486759.ooo.test systemd[1]: tmp-crun.oE4RFz.mount: Deactivated successfully.
Oct 14 08:53:03 np0005486759.ooo.test podman[103789]: 2025-10-14 08:53:03.093535813 +0000 UTC m=+0.091321672 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:53:03 np0005486759.ooo.test podman[103788]: 2025-10-14 08:53:03.11851843 +0000 UTC m=+0.121067108 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true)
Oct 14 08:53:03 np0005486759.ooo.test podman[103789]: 2025-10-14 08:53:03.132328798 +0000 UTC m=+0.130114657 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, version=17.1.9, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Oct 14 08:53:03 np0005486759.ooo.test podman[103789]: unhealthy
Oct 14 08:53:03 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:53:03 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:53:03 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:53:03 np0005486759.ooo.test podman[103790]: 2025-10-14 08:53:03.137450968 +0000 UTC m=+0.129142048 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, release=1, build-date=2025-07-21T15:29:47, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:53:03 np0005486759.ooo.test podman[103790]: 2025-10-14 08:53:03.220381927 +0000 UTC m=+0.212073017 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, release=1, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:53:03 np0005486759.ooo.test podman[103790]: unhealthy
Oct 14 08:53:03 np0005486759.ooo.test sudo[103848]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_ur0s9jd/privsep.sock
Oct 14 08:53:03 np0005486759.ooo.test sudo[103848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:03 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:53:03 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:53:03 np0005486759.ooo.test sudo[103848]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:04 np0005486759.ooo.test sudo[103861]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw95xxyum/privsep.sock
Oct 14 08:53:04 np0005486759.ooo.test sudo[103861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:04 np0005486759.ooo.test sudo[103861]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:05 np0005486759.ooo.test sudo[103876]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd9qky2j0/privsep.sock
Oct 14 08:53:05 np0005486759.ooo.test sudo[103876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:05 np0005486759.ooo.test sudo[103876]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:05 np0005486759.ooo.test sudo[103887]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg_rclq27/privsep.sock
Oct 14 08:53:05 np0005486759.ooo.test sudo[103887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:06 np0005486759.ooo.test sudo[103887]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:06 np0005486759.ooo.test sudo[103898]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2odriz9t/privsep.sock
Oct 14 08:53:06 np0005486759.ooo.test sudo[103898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:07 np0005486759.ooo.test sudo[103898]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:07 np0005486759.ooo.test sudo[103909]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsyul793i/privsep.sock
Oct 14 08:53:07 np0005486759.ooo.test sudo[103909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:08 np0005486759.ooo.test sudo[103909]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:08 np0005486759.ooo.test sudo[103932]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxc81_ve6/privsep.sock
Oct 14 08:53:08 np0005486759.ooo.test sudo[103932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:09 np0005486759.ooo.test sudo[103932]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:53:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:53:09 np0005486759.ooo.test podman[103939]: 2025-10-14 08:53:09.293051396 +0000 UTC m=+0.069693809 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller)
Oct 14 08:53:09 np0005486759.ooo.test systemd[1]: tmp-crun.nJeNct.mount: Deactivated successfully.
Oct 14 08:53:09 np0005486759.ooo.test podman[103939]: 2025-10-14 08:53:09.344264969 +0000 UTC m=+0.120907302 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, container_name=ovn_controller, io.buildah.version=1.33.12, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Oct 14 08:53:09 np0005486759.ooo.test podman[103938]: 2025-10-14 08:53:09.351589906 +0000 UTC m=+0.128291770 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, container_name=ovn_metadata_agent, batch=17.1_20250721.1, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible)
Oct 14 08:53:09 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:53:09 np0005486759.ooo.test podman[103938]: 2025-10-14 08:53:09.404361147 +0000 UTC m=+0.181063011 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, build-date=2025-07-21T16:28:53, release=1)
Oct 14 08:53:09 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:53:09 np0005486759.ooo.test sudo[103989]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplrxasna3/privsep.sock
Oct 14 08:53:09 np0005486759.ooo.test sudo[103989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:10 np0005486759.ooo.test sudo[103989]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:10 np0005486759.ooo.test sudo[104006]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsi3y90eg/privsep.sock
Oct 14 08:53:10 np0005486759.ooo.test sudo[104006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:10 np0005486759.ooo.test sudo[104006]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:11 np0005486759.ooo.test sudo[104017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz6rovuf1/privsep.sock
Oct 14 08:53:11 np0005486759.ooo.test sudo[104017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:11 np0005486759.ooo.test sudo[104017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:12 np0005486759.ooo.test sudo[104028]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp50sgz_5k/privsep.sock
Oct 14 08:53:12 np0005486759.ooo.test sudo[104028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:12 np0005486759.ooo.test sudo[104028]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:13 np0005486759.ooo.test sudo[104039]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp37ye0n7x/privsep.sock
Oct 14 08:53:13 np0005486759.ooo.test sudo[104039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:13 np0005486759.ooo.test sudo[104039]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:13 np0005486759.ooo.test sudo[104050]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxn6ygbb9/privsep.sock
Oct 14 08:53:13 np0005486759.ooo.test sudo[104050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:14 np0005486759.ooo.test sudo[104050]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:14 np0005486759.ooo.test sudo[104061]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqkf93ihq/privsep.sock
Oct 14 08:53:14 np0005486759.ooo.test sudo[104061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:15 np0005486759.ooo.test sudo[104061]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:15 np0005486759.ooo.test sudo[104078]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpci6_dk2y/privsep.sock
Oct 14 08:53:15 np0005486759.ooo.test sudo[104078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:16 np0005486759.ooo.test sudo[104078]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:16 np0005486759.ooo.test sudo[104089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7u5ampek/privsep.sock
Oct 14 08:53:16 np0005486759.ooo.test sudo[104089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:17 np0005486759.ooo.test sudo[104089]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:17 np0005486759.ooo.test sudo[104100]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp23alx213/privsep.sock
Oct 14 08:53:17 np0005486759.ooo.test sudo[104100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:17 np0005486759.ooo.test sudo[104100]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:18 np0005486759.ooo.test sudo[104111]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpikpv9jao/privsep.sock
Oct 14 08:53:18 np0005486759.ooo.test sudo[104111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:18 np0005486759.ooo.test sudo[104111]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:19 np0005486759.ooo.test sudo[104122]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqub205eq/privsep.sock
Oct 14 08:53:19 np0005486759.ooo.test sudo[104122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:19 np0005486759.ooo.test sudo[104122]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:20 np0005486759.ooo.test sudo[104133]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg1b5sj1z/privsep.sock
Oct 14 08:53:20 np0005486759.ooo.test sudo[104133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:20 np0005486759.ooo.test sudo[104133]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:20 np0005486759.ooo.test sudo[104150]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps5q448zh/privsep.sock
Oct 14 08:53:20 np0005486759.ooo.test sudo[104150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:21 np0005486759.ooo.test sudo[104150]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:21 np0005486759.ooo.test sudo[104161]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe8l31aqk/privsep.sock
Oct 14 08:53:21 np0005486759.ooo.test sudo[104161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:22 np0005486759.ooo.test sudo[104161]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:22 np0005486759.ooo.test sudo[104172]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvi0dv6yj/privsep.sock
Oct 14 08:53:22 np0005486759.ooo.test sudo[104172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:23 np0005486759.ooo.test sudo[104172]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: tmp-crun.DmW1Vu.mount: Deactivated successfully.
Oct 14 08:53:23 np0005486759.ooo.test podman[104178]: 2025-10-14 08:53:23.359143771 +0000 UTC m=+0.088402420 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, version=17.1.9, build-date=2025-07-21T13:27:15, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, container_name=iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12)
Oct 14 08:53:23 np0005486759.ooo.test podman[104178]: 2025-10-14 08:53:23.428131146 +0000 UTC m=+0.157389805 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15)
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: tmp-crun.oYtjpQ.mount: Deactivated successfully.
Oct 14 08:53:23 np0005486759.ooo.test podman[104180]: 2025-10-14 08:53:23.431935895 +0000 UTC m=+0.152103092 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, architecture=x86_64, release=2, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=collectd, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:53:23 np0005486759.ooo.test podman[104180]: 2025-10-14 08:53:23.466353145 +0000 UTC m=+0.186520322 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git)
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:53:23 np0005486759.ooo.test sudo[104245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsxabwjj7/privsep.sock
Oct 14 08:53:23 np0005486759.ooo.test sudo[104245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:53:23 np0005486759.ooo.test podman[104179]: 2025-10-14 08:53:23.332645837 +0000 UTC m=+0.062048431 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.9, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37)
Oct 14 08:53:23 np0005486759.ooo.test podman[104179]: 2025-10-14 08:53:23.563685363 +0000 UTC m=+0.293087997 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, release=1, architecture=x86_64, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:53:23 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:53:24 np0005486759.ooo.test sudo[104245]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:24 np0005486759.ooo.test sudo[104258]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp13k7ozbt/privsep.sock
Oct 14 08:53:24 np0005486759.ooo.test sudo[104258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:25 np0005486759.ooo.test sudo[104258]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:25 np0005486759.ooo.test sudo[104269]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpig1igw2d/privsep.sock
Oct 14 08:53:25 np0005486759.ooo.test sudo[104269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:25 np0005486759.ooo.test sudo[104269]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:26 np0005486759.ooo.test sudo[104286]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp204o044l/privsep.sock
Oct 14 08:53:26 np0005486759.ooo.test sudo[104286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:26 np0005486759.ooo.test sudo[104286]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:53:26 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:53:26 np0005486759.ooo.test recover_tripleo_nova_virtqemud[104294]: 47951
Oct 14 08:53:26 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:53:26 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:53:26 np0005486759.ooo.test systemd[1]: tmp-crun.va89L6.mount: Deactivated successfully.
Oct 14 08:53:26 np0005486759.ooo.test podman[104292]: 2025-10-14 08:53:26.932100265 +0000 UTC m=+0.110064354 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc.)
Oct 14 08:53:27 np0005486759.ooo.test sudo[104328]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw2kfxmvv/privsep.sock
Oct 14 08:53:27 np0005486759.ooo.test sudo[104328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:53:27 np0005486759.ooo.test podman[104330]: 2025-10-14 08:53:27.213874049 +0000 UTC m=+0.094559812 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, container_name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, vcs-type=git)
Oct 14 08:53:27 np0005486759.ooo.test podman[104292]: 2025-10-14 08:53:27.251631013 +0000 UTC m=+0.429595082 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Oct 14 08:53:27 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:53:27 np0005486759.ooo.test podman[104330]: 2025-10-14 08:53:27.60267208 +0000 UTC m=+0.483357903 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, release=1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, version=17.1.9)
Oct 14 08:53:27 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:53:27 np0005486759.ooo.test sudo[104328]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:27 np0005486759.ooo.test sudo[104362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm2udp5ek/privsep.sock
Oct 14 08:53:27 np0005486759.ooo.test sudo[104362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:28 np0005486759.ooo.test sudo[104362]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:28 np0005486759.ooo.test sudo[104373]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwdmyqviu/privsep.sock
Oct 14 08:53:28 np0005486759.ooo.test sudo[104373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:29 np0005486759.ooo.test sudo[104373]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:29 np0005486759.ooo.test sudo[104384]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi7wsdhlw/privsep.sock
Oct 14 08:53:29 np0005486759.ooo.test sudo[104384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:30 np0005486759.ooo.test sudo[104384]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:30 np0005486759.ooo.test sudo[104395]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaxc864fj/privsep.sock
Oct 14 08:53:30 np0005486759.ooo.test sudo[104395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:31 np0005486759.ooo.test sudo[104395]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:31 np0005486759.ooo.test sudo[104412]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptnj7vvyl/privsep.sock
Oct 14 08:53:31 np0005486759.ooo.test sudo[104412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:32 np0005486759.ooo.test sudo[104412]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:32 np0005486759.ooo.test sudo[104423]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg9covue1/privsep.sock
Oct 14 08:53:32 np0005486759.ooo.test sudo[104423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:32 np0005486759.ooo.test sudo[104423]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:33 np0005486759.ooo.test sudo[104434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnfr7p813/privsep.sock
Oct 14 08:53:33 np0005486759.ooo.test sudo[104434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: tmp-crun.lxiOqN.mount: Deactivated successfully.
Oct 14 08:53:33 np0005486759.ooo.test podman[104436]: 2025-10-14 08:53:33.305380543 +0000 UTC m=+0.055900950 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
build-date=2025-07-21T13:07:52, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, architecture=x86_64)
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:53:33 np0005486759.ooo.test podman[104436]: 2025-10-14 08:53:33.330425222 +0000 UTC m=+0.080945679 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., 
com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond)
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:53:33 np0005486759.ooo.test podman[104437]: 2025-10-14 08:53:33.390212281 +0000 UTC m=+0.133806072 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, version=17.1.9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 08:53:33 np0005486759.ooo.test podman[104437]: 2025-10-14 08:53:33.401554314 +0000 UTC m=+0.145148085 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20250721.1, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute)
Oct 14 08:53:33 np0005486759.ooo.test podman[104437]: unhealthy
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:53:33 np0005486759.ooo.test podman[104466]: 2025-10-14 08:53:33.492341788 +0000 UTC m=+0.161073061 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public)
Oct 14 08:53:33 np0005486759.ooo.test podman[104466]: 2025-10-14 08:53:33.50944849 +0000 UTC m=+0.178179783 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:53:33 np0005486759.ooo.test podman[104466]: unhealthy
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:53:33 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:53:33 np0005486759.ooo.test sudo[104434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:34 np0005486759.ooo.test sudo[104503]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp72cq7tma/privsep.sock
Oct 14 08:53:34 np0005486759.ooo.test sudo[104503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:34 np0005486759.ooo.test sudo[104503]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:34 np0005486759.ooo.test sudo[104514]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqllw44wm/privsep.sock
Oct 14 08:53:34 np0005486759.ooo.test sudo[104514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:35 np0005486759.ooo.test sudo[104514]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:35 np0005486759.ooo.test sudo[104525]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpix42_mwz/privsep.sock
Oct 14 08:53:35 np0005486759.ooo.test sudo[104525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:36 np0005486759.ooo.test sudo[104525]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:36 np0005486759.ooo.test sudo[104539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp92g3466k/privsep.sock
Oct 14 08:53:36 np0005486759.ooo.test sudo[104539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:37 np0005486759.ooo.test sudo[104539]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:37 np0005486759.ooo.test sudo[104553]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo439iek3/privsep.sock
Oct 14 08:53:37 np0005486759.ooo.test sudo[104553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:38 np0005486759.ooo.test sudo[104553]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:38 np0005486759.ooo.test sudo[104564]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpch2gldm9/privsep.sock
Oct 14 08:53:38 np0005486759.ooo.test sudo[104564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:39 np0005486759.ooo.test sudo[104564]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:39 np0005486759.ooo.test sudo[104575]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiu5b7t_5/privsep.sock
Oct 14 08:53:39 np0005486759.ooo.test sudo[104575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:39 np0005486759.ooo.test sudo[104575]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:53:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:53:39 np0005486759.ooo.test podman[104580]: 2025-10-14 08:53:39.995738062 +0000 UTC m=+0.069360198 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent)
Oct 14 08:53:40 np0005486759.ooo.test podman[104582]: 2025-10-14 08:53:40.060568989 +0000 UTC m=+0.130325625 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, architecture=x86_64, version=17.1.9, managed_by=tripleo_ansible)
Oct 14 08:53:40 np0005486759.ooo.test podman[104580]: 2025-10-14 08:53:40.061101275 +0000 UTC m=+0.134723411 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1, vcs-type=git)
Oct 14 08:53:40 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:53:40 np0005486759.ooo.test podman[104582]: 2025-10-14 08:53:40.143338253 +0000 UTC m=+0.213094809 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:53:40 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:53:40 np0005486759.ooo.test sudo[104631]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu6_kau6o/privsep.sock
Oct 14 08:53:40 np0005486759.ooo.test sudo[104631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:40 np0005486759.ooo.test sudo[104631]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:41 np0005486759.ooo.test sudo[104642]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvig66bs3/privsep.sock
Oct 14 08:53:41 np0005486759.ooo.test sudo[104642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:41 np0005486759.ooo.test sudo[104642]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:41 np0005486759.ooo.test sudo[104653]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo6vezanl/privsep.sock
Oct 14 08:53:41 np0005486759.ooo.test sudo[104653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:42 np0005486759.ooo.test sudo[104653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:42 np0005486759.ooo.test sudo[104670]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcyba9vl6/privsep.sock
Oct 14 08:53:42 np0005486759.ooo.test sudo[104670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:43 np0005486759.ooo.test sudo[104670]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:43 np0005486759.ooo.test sudo[104681]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5nedj8tq/privsep.sock
Oct 14 08:53:43 np0005486759.ooo.test sudo[104681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:44 np0005486759.ooo.test sudo[104681]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:44 np0005486759.ooo.test sudo[104692]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4gaf0sdd/privsep.sock
Oct 14 08:53:44 np0005486759.ooo.test sudo[104692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:45 np0005486759.ooo.test sudo[104692]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:45 np0005486759.ooo.test sudo[104703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfkui7syw/privsep.sock
Oct 14 08:53:45 np0005486759.ooo.test sudo[104703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:46 np0005486759.ooo.test sudo[104703]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:46 np0005486759.ooo.test sudo[104714]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnfin3tng/privsep.sock
Oct 14 08:53:46 np0005486759.ooo.test sudo[104714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:46 np0005486759.ooo.test sudo[104714]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:47 np0005486759.ooo.test sudo[104725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqgikau3a/privsep.sock
Oct 14 08:53:47 np0005486759.ooo.test sudo[104725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:47 np0005486759.ooo.test sudo[104725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:48 np0005486759.ooo.test sudo[104742]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcdgced3_/privsep.sock
Oct 14 08:53:48 np0005486759.ooo.test sudo[104742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:48 np0005486759.ooo.test sudo[104742]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:48 np0005486759.ooo.test sudo[104753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr4sis4by/privsep.sock
Oct 14 08:53:48 np0005486759.ooo.test sudo[104753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:49 np0005486759.ooo.test sudo[104753]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:49 np0005486759.ooo.test sudo[104764]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaxp6eqp3/privsep.sock
Oct 14 08:53:49 np0005486759.ooo.test sudo[104764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:50 np0005486759.ooo.test sudo[104764]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:50 np0005486759.ooo.test sudo[104775]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr5zxdcok/privsep.sock
Oct 14 08:53:50 np0005486759.ooo.test sudo[104775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:51 np0005486759.ooo.test sudo[104775]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:51 np0005486759.ooo.test sudo[104786]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl94dvwzl/privsep.sock
Oct 14 08:53:51 np0005486759.ooo.test sudo[104786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:52 np0005486759.ooo.test sudo[104786]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:52 np0005486759.ooo.test sudo[104797]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprviy7z24/privsep.sock
Oct 14 08:53:52 np0005486759.ooo.test sudo[104797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:53 np0005486759.ooo.test sudo[104797]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:53 np0005486759.ooo.test sudo[104814]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfqd88mxq/privsep.sock
Oct 14 08:53:53 np0005486759.ooo.test sudo[104814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:53 np0005486759.ooo.test sudo[104814]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:53:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:53:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:53:54 np0005486759.ooo.test podman[104827]: 2025-10-14 08:53:54.07925581 +0000 UTC m=+0.075760837 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, config_id=tripleo_step3, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:53:54 np0005486759.ooo.test podman[104827]: 2025-10-14 08:53:54.08924606 +0000 UTC m=+0.085751067 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=2, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, version=17.1.9, batch=17.1_20250721.1)
Oct 14 08:53:54 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:53:54 np0005486759.ooo.test podman[104820]: 2025-10-14 08:53:54.058999599 +0000 UTC m=+0.067606864 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, container_name=iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., 
com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:53:54 np0005486759.ooo.test podman[104820]: 2025-10-14 08:53:54.142335401 +0000 UTC m=+0.150942706 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:27:15, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.33.12, release=1, version=17.1.9, vendor=Red Hat, Inc., container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team)
Oct 14 08:53:54 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:53:54 np0005486759.ooo.test sudo[104874]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_sqpspfo/privsep.sock
Oct 14 08:53:54 np0005486759.ooo.test sudo[104874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:54 np0005486759.ooo.test podman[104821]: 2025-10-14 08:53:54.224861938 +0000 UTC m=+0.226781494 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1)
Oct 14 08:53:54 np0005486759.ooo.test podman[104821]: 2025-10-14 08:53:54.276272937 +0000 UTC m=+0.278192453 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37)
Oct 14 08:53:54 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:53:54 np0005486759.ooo.test sudo[104874]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:55 np0005486759.ooo.test sudo[104898]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuh15trgx/privsep.sock
Oct 14 08:53:55 np0005486759.ooo.test sudo[104898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:55 np0005486759.ooo.test sudo[104898]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:55 np0005486759.ooo.test sudo[104909]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptbze0b2t/privsep.sock
Oct 14 08:53:55 np0005486759.ooo.test sudo[104909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:56 np0005486759.ooo.test sudo[104909]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:56 np0005486759.ooo.test sudo[104920]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyxbj44tc/privsep.sock
Oct 14 08:53:56 np0005486759.ooo.test sudo[104920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:53:57 np0005486759.ooo.test sudo[104920]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:57 np0005486759.ooo.test systemd[1]: tmp-crun.NoV7Sb.mount: Deactivated successfully.
Oct 14 08:53:57 np0005486759.ooo.test podman[104924]: 2025-10-14 08:53:57.499600347 +0000 UTC m=+0.119547649 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, version=17.1.9, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, config_id=tripleo_step1, io.openshift.expose-services=, batch=17.1_20250721.1, release=1, maintainer=OpenStack TripleO Team)
Oct 14 08:53:57 np0005486759.ooo.test sudo[104960]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqk3v1mac/privsep.sock
Oct 14 08:53:57 np0005486759.ooo.test sudo[104960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:53:57 np0005486759.ooo.test podman[104924]: 2025-10-14 08:53:57.733699908 +0000 UTC m=+0.353647220 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:53:57 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:53:57 np0005486759.ooo.test podman[104961]: 2025-10-14 08:53:57.783998282 +0000 UTC m=+0.083827278 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9)
Oct 14 08:53:58 np0005486759.ooo.test podman[104961]: 2025-10-14 08:53:58.165548259 +0000 UTC m=+0.465377225 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Oct 14 08:53:58 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:53:58 np0005486759.ooo.test sudo[104960]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:58 np0005486759.ooo.test sudo[105000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptauprmuu/privsep.sock
Oct 14 08:53:58 np0005486759.ooo.test sudo[105000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:59 np0005486759.ooo.test sudo[105000]: pam_unix(sudo:session): session closed for user root
Oct 14 08:53:59 np0005486759.ooo.test sudo[105011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6vipgpsv/privsep.sock
Oct 14 08:53:59 np0005486759.ooo.test sudo[105011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:53:59 np0005486759.ooo.test sudo[105011]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:00 np0005486759.ooo.test sudo[105022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6jv8r4av/privsep.sock
Oct 14 08:54:00 np0005486759.ooo.test sudo[105022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:00 np0005486759.ooo.test sudo[105022]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:01 np0005486759.ooo.test sudo[105033]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq2_nrtx3/privsep.sock
Oct 14 08:54:01 np0005486759.ooo.test sudo[105033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:01 np0005486759.ooo.test sudo[105033]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:01 np0005486759.ooo.test sudo[105044]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkqcmkxk2/privsep.sock
Oct 14 08:54:01 np0005486759.ooo.test sudo[105044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:02 np0005486759.ooo.test sudo[105044]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:02 np0005486759.ooo.test sudo[105055]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_y1qesul/privsep.sock
Oct 14 08:54:02 np0005486759.ooo.test sudo[105055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:54:03 np0005486759.ooo.test sudo[105055]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: tmp-crun.DIvQft.mount: Deactivated successfully.
Oct 14 08:54:03 np0005486759.ooo.test podman[105059]: 2025-10-14 08:54:03.442085777 +0000 UTC m=+0.072542618 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=logrotate_crond, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, 
batch=17.1_20250721.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team)
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:54:03 np0005486759.ooo.test podman[105059]: 2025-10-14 08:54:03.490273915 +0000 UTC m=+0.120730736 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, 
com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc.)
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:54:03 np0005486759.ooo.test podman[105080]: 2025-10-14 08:54:03.589700498 +0000 UTC m=+0.121711307 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:54:03 np0005486759.ooo.test sudo[105116]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj45zkd4d/privsep.sock
Oct 14 08:54:03 np0005486759.ooo.test sudo[105116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:03 np0005486759.ooo.test podman[105080]: 2025-10-14 08:54:03.627290497 +0000 UTC m=+0.159301286 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack TripleO Team)
Oct 14 08:54:03 np0005486759.ooo.test podman[105080]: unhealthy
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:54:03 np0005486759.ooo.test podman[105104]: 2025-10-14 08:54:03.67274329 +0000 UTC m=+0.079813133 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20250721.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47)
Oct 14 08:54:03 np0005486759.ooo.test podman[105104]: 2025-10-14 08:54:03.6904063 +0000 UTC m=+0.097476193 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, release=1, 
architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f)
Oct 14 08:54:03 np0005486759.ooo.test podman[105104]: unhealthy
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:54:03 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:54:04 np0005486759.ooo.test sudo[105116]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:04 np0005486759.ooo.test systemd[1]: tmp-crun.vEccJJ.mount: Deactivated successfully.
Oct 14 08:54:04 np0005486759.ooo.test sudo[105140]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmc5lj9xz/privsep.sock
Oct 14 08:54:04 np0005486759.ooo.test sudo[105140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:04 np0005486759.ooo.test sudo[105140]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:05 np0005486759.ooo.test sudo[105151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv5b5bm65/privsep.sock
Oct 14 08:54:05 np0005486759.ooo.test sudo[105151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:05 np0005486759.ooo.test sudo[105151]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:06 np0005486759.ooo.test sudo[105162]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp40s98hjq/privsep.sock
Oct 14 08:54:06 np0005486759.ooo.test sudo[105162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:06 np0005486759.ooo.test sudo[105162]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:07 np0005486759.ooo.test sudo[105173]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxb9k486e/privsep.sock
Oct 14 08:54:07 np0005486759.ooo.test sudo[105173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:07 np0005486759.ooo.test sudo[105173]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:07 np0005486759.ooo.test sudo[105184]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9107so6g/privsep.sock
Oct 14 08:54:07 np0005486759.ooo.test sudo[105184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:08 np0005486759.ooo.test sudo[105184]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:08 np0005486759.ooo.test sudo[105197]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp99a5k4qi/privsep.sock
Oct 14 08:54:08 np0005486759.ooo.test sudo[105197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:09 np0005486759.ooo.test sudo[105197]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:09 np0005486759.ooo.test sudo[105212]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp36bwyn3a/privsep.sock
Oct 14 08:54:09 np0005486759.ooo.test sudo[105212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:10 np0005486759.ooo.test sudo[105212]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:54:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:54:10 np0005486759.ooo.test systemd[1]: tmp-crun.Lr4QHx.mount: Deactivated successfully.
Oct 14 08:54:10 np0005486759.ooo.test podman[105219]: 2025-10-14 08:54:10.410175094 +0000 UTC m=+0.075481209 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public)
Oct 14 08:54:10 np0005486759.ooo.test podman[105218]: 2025-10-14 08:54:10.420227537 +0000 UTC m=+0.084397336 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, distribution-scope=public, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 08:54:10 np0005486759.ooo.test podman[105218]: 2025-10-14 08:54:10.454486022 +0000 UTC m=+0.118655591 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_id=tripleo_step4, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53)
Oct 14 08:54:10 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:54:10 np0005486759.ooo.test podman[105219]: 2025-10-14 08:54:10.478506069 +0000 UTC m=+0.143812154 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:54:10 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:54:10 np0005486759.ooo.test sudo[105274]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1rt4fw41/privsep.sock
Oct 14 08:54:10 np0005486759.ooo.test sudo[105274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:11 np0005486759.ooo.test sudo[105274]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:11 np0005486759.ooo.test sudo[105285]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmy261490/privsep.sock
Oct 14 08:54:11 np0005486759.ooo.test sudo[105285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:11 np0005486759.ooo.test sudo[105285]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:12 np0005486759.ooo.test sudo[105296]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprftkf6du/privsep.sock
Oct 14 08:54:12 np0005486759.ooo.test sudo[105296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:12 np0005486759.ooo.test sudo[105296]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:13 np0005486759.ooo.test sudo[105307]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph36mctxn/privsep.sock
Oct 14 08:54:13 np0005486759.ooo.test sudo[105307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:13 np0005486759.ooo.test sudo[105307]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:13 np0005486759.ooo.test sudo[105318]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb1b6x9xq/privsep.sock
Oct 14 08:54:13 np0005486759.ooo.test sudo[105318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:14 np0005486759.ooo.test sudo[105318]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:14 np0005486759.ooo.test sudo[105335]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzyhqd_v2/privsep.sock
Oct 14 08:54:14 np0005486759.ooo.test sudo[105335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:15 np0005486759.ooo.test sudo[105335]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:15 np0005486759.ooo.test sudo[105346]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph2tj81d0/privsep.sock
Oct 14 08:54:15 np0005486759.ooo.test sudo[105346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:16 np0005486759.ooo.test sudo[105346]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:16 np0005486759.ooo.test sudo[105357]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmperdb5qky/privsep.sock
Oct 14 08:54:16 np0005486759.ooo.test sudo[105357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:17 np0005486759.ooo.test sudo[105357]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:17 np0005486759.ooo.test sudo[105368]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp00xougme/privsep.sock
Oct 14 08:54:17 np0005486759.ooo.test sudo[105368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:17 np0005486759.ooo.test sudo[105368]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:18 np0005486759.ooo.test sudo[105379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplu33q_v7/privsep.sock
Oct 14 08:54:18 np0005486759.ooo.test sudo[105379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:18 np0005486759.ooo.test sudo[105379]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:19 np0005486759.ooo.test sudo[105390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw2cu8uq1/privsep.sock
Oct 14 08:54:19 np0005486759.ooo.test sudo[105390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:19 np0005486759.ooo.test sudo[105390]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:20 np0005486759.ooo.test sudo[105407]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy0q8fdp0/privsep.sock
Oct 14 08:54:20 np0005486759.ooo.test sudo[105407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:20 np0005486759.ooo.test sudo[105407]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:20 np0005486759.ooo.test sudo[105418]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4c4b19z5/privsep.sock
Oct 14 08:54:20 np0005486759.ooo.test sudo[105418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:21 np0005486759.ooo.test sudo[105418]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:21 np0005486759.ooo.test sudo[105429]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7wivkxlh/privsep.sock
Oct 14 08:54:21 np0005486759.ooo.test sudo[105429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:22 np0005486759.ooo.test sudo[105429]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:22 np0005486759.ooo.test sudo[105440]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpruam2zfy/privsep.sock
Oct 14 08:54:22 np0005486759.ooo.test sudo[105440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:23 np0005486759.ooo.test sudo[105440]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:23 np0005486759.ooo.test sudo[105451]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpswtnmgvy/privsep.sock
Oct 14 08:54:23 np0005486759.ooo.test sudo[105451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:24 np0005486759.ooo.test sudo[105451]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:24 np0005486759.ooo.test sudo[105462]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb2sw6wl3/privsep.sock
Oct 14 08:54:24 np0005486759.ooo.test sudo[105462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:54:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:54:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:54:24 np0005486759.ooo.test podman[105465]: 2025-10-14 08:54:24.421593829 +0000 UTC m=+0.061092091 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, container_name=nova_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37)
Oct 14 08:54:24 np0005486759.ooo.test systemd[1]: tmp-crun.dnlroE.mount: Deactivated successfully.
Oct 14 08:54:24 np0005486759.ooo.test podman[105465]: 2025-10-14 08:54:24.471000765 +0000 UTC m=+0.110499017 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc.)
Oct 14 08:54:24 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:54:24 np0005486759.ooo.test podman[105466]: 2025-10-14 08:54:24.475233657 +0000 UTC m=+0.113111909 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, release=2, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3)
Oct 14 08:54:24 np0005486759.ooo.test podman[105464]: 2025-10-14 08:54:24.532020833 +0000 UTC m=+0.172318530 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=iscsid, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 08:54:24 np0005486759.ooo.test podman[105464]: 2025-10-14 08:54:24.541170988 +0000 UTC m=+0.181468735 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public)
Oct 14 08:54:24 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:54:24 np0005486759.ooo.test podman[105466]: 2025-10-14 08:54:24.557252048 +0000 UTC m=+0.195130260 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, release=2, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, batch=17.1_20250721.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03)
Oct 14 08:54:24 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:54:24 np0005486759.ooo.test sudo[105462]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:25 np0005486759.ooo.test sudo[105541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy4vpr52z/privsep.sock
Oct 14 08:54:25 np0005486759.ooo.test sudo[105541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:25 np0005486759.ooo.test sudo[105541]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:26 np0005486759.ooo.test sudo[105553]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx9bgwrgm/privsep.sock
Oct 14 08:54:26 np0005486759.ooo.test sudo[105553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:26 np0005486759.ooo.test sudo[105553]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:26 np0005486759.ooo.test sudo[105564]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptk23v2lz/privsep.sock
Oct 14 08:54:26 np0005486759.ooo.test sudo[105564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:27 np0005486759.ooo.test sudo[105564]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:27 np0005486759.ooo.test sudo[105575]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0t6i5asq/privsep.sock
Oct 14 08:54:27 np0005486759.ooo.test sudo[105575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:54:27 np0005486759.ooo.test podman[105577]: 2025-10-14 08:54:27.890823396 +0000 UTC m=+0.067088587 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20250721.1, config_id=tripleo_step1, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, managed_by=tripleo_ansible, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, release=1, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 08:54:28 np0005486759.ooo.test podman[105577]: 2025-10-14 08:54:28.057259163 +0000 UTC m=+0.233524365 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1)
Oct 14 08:54:28 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:54:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:54:28 np0005486759.ooo.test sudo[105575]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:28 np0005486759.ooo.test systemd[1]: tmp-crun.Y4sxD3.mount: Deactivated successfully.
Oct 14 08:54:28 np0005486759.ooo.test podman[105608]: 2025-10-14 08:54:28.449631117 +0000 UTC m=+0.069137022 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, tcib_managed=true, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:54:28 np0005486759.ooo.test sudo[105639]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwouqx5l8/privsep.sock
Oct 14 08:54:28 np0005486759.ooo.test sudo[105639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:28 np0005486759.ooo.test podman[105608]: 2025-10-14 08:54:28.806705822 +0000 UTC m=+0.426211757 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, config_id=tripleo_step4, release=1, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, managed_by=tripleo_ansible)
Oct 14 08:54:28 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:54:29 np0005486759.ooo.test sudo[105639]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:29 np0005486759.ooo.test sudo[105650]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2a2907yt/privsep.sock
Oct 14 08:54:29 np0005486759.ooo.test sudo[105650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:30 np0005486759.ooo.test sudo[105650]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:30 np0005486759.ooo.test sudo[105663]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjraxq96o/privsep.sock
Oct 14 08:54:30 np0005486759.ooo.test sudo[105663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:31 np0005486759.ooo.test sudo[105663]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:31 np0005486759.ooo.test sudo[105678]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpayxe2cxx/privsep.sock
Oct 14 08:54:31 np0005486759.ooo.test sudo[105678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:31 np0005486759.ooo.test sudo[105678]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:32 np0005486759.ooo.test sudo[105689]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg249p_3j/privsep.sock
Oct 14 08:54:32 np0005486759.ooo.test sudo[105689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:32 np0005486759.ooo.test sudo[105689]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:33 np0005486759.ooo.test sudo[105700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqsqfk_eg/privsep.sock
Oct 14 08:54:33 np0005486759.ooo.test sudo[105700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:33 np0005486759.ooo.test sudo[105700]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:54:33 np0005486759.ooo.test podman[105706]: 2025-10-14 08:54:33.868368467 +0000 UTC m=+0.088480904 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, container_name=logrotate_crond, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., name=rhosp17/openstack-cron)
Oct 14 08:54:33 np0005486759.ooo.test podman[105706]: 2025-10-14 08:54:33.876384826 +0000 UTC m=+0.096497263 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c)
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: tmp-crun.4rH2Fs.mount: Deactivated successfully.
Oct 14 08:54:33 np0005486759.ooo.test podman[105707]: 2025-10-14 08:54:33.921093547 +0000 UTC m=+0.135602319 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 08:54:33 np0005486759.ooo.test podman[105707]: 2025-10-14 08:54:33.95626886 +0000 UTC m=+0.170777612 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, version=17.1.9, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, config_id=tripleo_step4, architecture=x86_64)
Oct 14 08:54:33 np0005486759.ooo.test podman[105707]: unhealthy
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: tmp-crun.dIE3p2.mount: Deactivated successfully.
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:54:33 np0005486759.ooo.test podman[105708]: 2025-10-14 08:54:33.971517515 +0000 UTC m=+0.180383642 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, release=1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc.)
Oct 14 08:54:33 np0005486759.ooo.test podman[105708]: 2025-10-14 08:54:33.979167903 +0000 UTC m=+0.188034020 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, release=1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Oct 14 08:54:33 np0005486759.ooo.test podman[105708]: unhealthy
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:54:33 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:54:34 np0005486759.ooo.test sudo[105766]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8vz58_fy/privsep.sock
Oct 14 08:54:34 np0005486759.ooo.test sudo[105766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:34 np0005486759.ooo.test sudo[105766]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:34 np0005486759.ooo.test sudo[105777]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptk3rad0i/privsep.sock
Oct 14 08:54:34 np0005486759.ooo.test sudo[105777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:35 np0005486759.ooo.test sudo[105777]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:35 np0005486759.ooo.test sudo[105790]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfxcia_69/privsep.sock
Oct 14 08:54:35 np0005486759.ooo.test sudo[105790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:36 np0005486759.ooo.test sudo[105790]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:36 np0005486759.ooo.test sudo[105805]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprcfqgcrb/privsep.sock
Oct 14 08:54:36 np0005486759.ooo.test sudo[105805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:37 np0005486759.ooo.test sudo[105805]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:37 np0005486759.ooo.test sudo[105816]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpehnvssmd/privsep.sock
Oct 14 08:54:37 np0005486759.ooo.test sudo[105816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:38 np0005486759.ooo.test sudo[105816]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:38 np0005486759.ooo.test sudo[105827]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpneo45gxz/privsep.sock
Oct 14 08:54:38 np0005486759.ooo.test sudo[105827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:39 np0005486759.ooo.test sudo[105827]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:39 np0005486759.ooo.test sudo[105838]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp16u6lxmv/privsep.sock
Oct 14 08:54:39 np0005486759.ooo.test sudo[105838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:39 np0005486759.ooo.test sudo[105838]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:40 np0005486759.ooo.test sudo[105850]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_7_daa1w/privsep.sock
Oct 14 08:54:40 np0005486759.ooo.test sudo[105850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:40 np0005486759.ooo.test sudo[105850]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:54:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:54:40 np0005486759.ooo.test podman[105854]: 2025-10-14 08:54:40.866462567 +0000 UTC m=+0.067378936 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step4, release=1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git)
Oct 14 08:54:40 np0005486759.ooo.test podman[105854]: 2025-10-14 08:54:40.918230328 +0000 UTC m=+0.119146717 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, build-date=2025-07-21T16:28:53, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:54:40 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:54:40 np0005486759.ooo.test podman[105857]: 2025-10-14 08:54:40.923518491 +0000 UTC m=+0.118227598 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.9, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, release=1, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:54:41 np0005486759.ooo.test podman[105857]: 2025-10-14 08:54:41.008485024 +0000 UTC m=+0.203194161 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, version=17.1.9, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4)
Oct 14 08:54:41 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:54:41 np0005486759.ooo.test sudo[105909]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnwwpmlzk/privsep.sock
Oct 14 08:54:41 np0005486759.ooo.test sudo[105909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:41 np0005486759.ooo.test sudo[105909]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:41 np0005486759.ooo.test sudo[105926]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaugn55_1/privsep.sock
Oct 14 08:54:41 np0005486759.ooo.test sudo[105926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:42 np0005486759.ooo.test sudo[105926]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:42 np0005486759.ooo.test sudo[105937]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkkph3_t7/privsep.sock
Oct 14 08:54:42 np0005486759.ooo.test sudo[105937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:43 np0005486759.ooo.test sudo[105937]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:43 np0005486759.ooo.test sudo[105948]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0x8hrejp/privsep.sock
Oct 14 08:54:43 np0005486759.ooo.test sudo[105948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:44 np0005486759.ooo.test sudo[105948]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:44 np0005486759.ooo.test sudo[105959]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv9218g5l/privsep.sock
Oct 14 08:54:44 np0005486759.ooo.test sudo[105959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:45 np0005486759.ooo.test sudo[105959]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:45 np0005486759.ooo.test sudo[105970]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxfba6oik/privsep.sock
Oct 14 08:54:45 np0005486759.ooo.test sudo[105970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:46 np0005486759.ooo.test sudo[105970]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:46 np0005486759.ooo.test sudo[105981]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0zgig3u9/privsep.sock
Oct 14 08:54:46 np0005486759.ooo.test sudo[105981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:46 np0005486759.ooo.test sudo[105981]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:47 np0005486759.ooo.test sudo[105998]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_x121qx4/privsep.sock
Oct 14 08:54:47 np0005486759.ooo.test sudo[105998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:47 np0005486759.ooo.test sudo[105998]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:48 np0005486759.ooo.test sudo[106009]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq1odg8pg/privsep.sock
Oct 14 08:54:48 np0005486759.ooo.test sudo[106009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:48 np0005486759.ooo.test sudo[106009]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:48 np0005486759.ooo.test sudo[106020]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjuvx3065/privsep.sock
Oct 14 08:54:48 np0005486759.ooo.test sudo[106020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:49 np0005486759.ooo.test sudo[106020]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:49 np0005486759.ooo.test sudo[106031]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqfohxbzm/privsep.sock
Oct 14 08:54:49 np0005486759.ooo.test sudo[106031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:50 np0005486759.ooo.test sudo[106031]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:50 np0005486759.ooo.test sudo[106042]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy1bdawc0/privsep.sock
Oct 14 08:54:50 np0005486759.ooo.test sudo[106042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:51 np0005486759.ooo.test sudo[106042]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:51 np0005486759.ooo.test sudo[106053]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa65nzo8h/privsep.sock
Oct 14 08:54:51 np0005486759.ooo.test sudo[106053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:52 np0005486759.ooo.test sudo[106053]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:52 np0005486759.ooo.test sudo[106070]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpak08afwf/privsep.sock
Oct 14 08:54:52 np0005486759.ooo.test sudo[106070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:53 np0005486759.ooo.test sudo[106070]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:53 np0005486759.ooo.test sudo[106081]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqtapfd02/privsep.sock
Oct 14 08:54:53 np0005486759.ooo.test sudo[106081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:53 np0005486759.ooo.test sudo[106081]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:54 np0005486759.ooo.test sudo[106092]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkbgob1bn/privsep.sock
Oct 14 08:54:54 np0005486759.ooo.test sudo[106092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:54 np0005486759.ooo.test sudo[106092]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:54:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:54:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:54:54 np0005486759.ooo.test podman[106099]: 2025-10-14 08:54:54.92637842 +0000 UTC m=+0.073472885 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, container_name=nova_compute)
Oct 14 08:54:54 np0005486759.ooo.test systemd[1]: tmp-crun.EFrDTz.mount: Deactivated successfully.
Oct 14 08:54:54 np0005486759.ooo.test podman[106099]: 2025-10-14 08:54:54.988184083 +0000 UTC m=+0.135278598 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true)
Oct 14 08:54:55 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:54:55 np0005486759.ooo.test podman[106098]: 2025-10-14 08:54:54.994755857 +0000 UTC m=+0.141256284 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-type=git, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, tcib_managed=true, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team)
Oct 14 08:54:55 np0005486759.ooo.test podman[106100]: 2025-10-14 08:54:55.055051432 +0000 UTC m=+0.194067206 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, release=2, batch=17.1_20250721.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 08:54:55 np0005486759.ooo.test sudo[106159]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjbs29lcq/privsep.sock
Oct 14 08:54:55 np0005486759.ooo.test sudo[106159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:55 np0005486759.ooo.test podman[106100]: 2025-10-14 08:54:55.060914045 +0000 UTC m=+0.199929809 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, config_id=tripleo_step3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:54:55 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:54:55 np0005486759.ooo.test podman[106098]: 2025-10-14 08:54:55.080303098 +0000 UTC m=+0.226803465 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, batch=17.1_20250721.1)
Oct 14 08:54:55 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:54:55 np0005486759.ooo.test sudo[106159]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:55 np0005486759.ooo.test sudo[106174]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3bor64ot/privsep.sock
Oct 14 08:54:55 np0005486759.ooo.test sudo[106174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:55 np0005486759.ooo.test systemd[1]: tmp-crun.Fgkegy.mount: Deactivated successfully.
Oct 14 08:54:56 np0005486759.ooo.test sudo[106174]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:56 np0005486759.ooo.test sudo[106185]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn2row2ki/privsep.sock
Oct 14 08:54:56 np0005486759.ooo.test sudo[106185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:57 np0005486759.ooo.test sudo[106185]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:57 np0005486759.ooo.test sudo[106201]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjmhlj2tf/privsep.sock
Oct 14 08:54:57 np0005486759.ooo.test sudo[106201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:58 np0005486759.ooo.test sudo[106201]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:54:58 np0005486759.ooo.test systemd[1]: tmp-crun.D8RZVS.mount: Deactivated successfully.
Oct 14 08:54:58 np0005486759.ooo.test podman[106206]: 2025-10-14 08:54:58.379385774 +0000 UTC m=+0.068772360 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, container_name=metrics_qdr, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:54:58 np0005486759.ooo.test sudo[106242]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp87nnjb0o/privsep.sock
Oct 14 08:54:58 np0005486759.ooo.test sudo[106242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:58 np0005486759.ooo.test podman[106206]: 2025-10-14 08:54:58.611448041 +0000 UTC m=+0.300834647 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:54:58 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:54:59 np0005486759.ooo.test sudo[106242]: pam_unix(sudo:session): session closed for user root
Oct 14 08:54:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:54:59 np0005486759.ooo.test podman[106248]: 2025-10-14 08:54:59.254692737 +0000 UTC m=+0.049415838 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20250721.1, release=1, architecture=x86_64, vcs-type=git, tcib_managed=true, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team)
Oct 14 08:54:59 np0005486759.ooo.test sudo[106274]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx4yt20_a/privsep.sock
Oct 14 08:54:59 np0005486759.ooo.test sudo[106274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:54:59 np0005486759.ooo.test podman[106248]: 2025-10-14 08:54:59.612598529 +0000 UTC m=+0.407321640 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, version=17.1.9, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Oct 14 08:54:59 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:55:00 np0005486759.ooo.test sudo[106274]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:00 np0005486759.ooo.test sudo[106286]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzygodqfo/privsep.sock
Oct 14 08:55:00 np0005486759.ooo.test sudo[106286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:00 np0005486759.ooo.test sudo[106286]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:01 np0005486759.ooo.test sudo[106297]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpod6dqv_3/privsep.sock
Oct 14 08:55:01 np0005486759.ooo.test sudo[106297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:01 np0005486759.ooo.test sudo[106297]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:02 np0005486759.ooo.test sudo[106308]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwd88c3ih/privsep.sock
Oct 14 08:55:02 np0005486759.ooo.test sudo[106308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:02 np0005486759.ooo.test sudo[106308]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:03 np0005486759.ooo.test sudo[106324]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz8pozljh/privsep.sock
Oct 14 08:55:03 np0005486759.ooo.test sudo[106324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:03 np0005486759.ooo.test sudo[106324]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:03 np0005486759.ooo.test sudo[106336]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpymwkn98n/privsep.sock
Oct 14 08:55:03 np0005486759.ooo.test sudo[106336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:55:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:55:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:55:04 np0005486759.ooo.test podman[106339]: 2025-10-14 08:55:04.087470283 +0000 UTC m=+0.076862441 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, release=1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.9, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:55:04 np0005486759.ooo.test podman[106340]: 2025-10-14 08:55:04.14556795 +0000 UTC m=+0.131109239 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Oct 14 08:55:04 np0005486759.ooo.test podman[106340]: 2025-10-14 08:55:04.155889132 +0000 UTC m=+0.141430301 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:55:04 np0005486759.ooo.test podman[106340]: unhealthy
Oct 14 08:55:04 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:55:04 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:55:04 np0005486759.ooo.test podman[106339]: 2025-10-14 08:55:04.180601069 +0000 UTC m=+0.169993237 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, 
batch=17.1_20250721.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Oct 14 08:55:04 np0005486759.ooo.test podman[106339]: unhealthy
Oct 14 08:55:04 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:55:04 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:55:04 np0005486759.ooo.test podman[106338]: 2025-10-14 08:55:04.196102572 +0000 UTC m=+0.186016126 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1, tcib_managed=true, architecture=x86_64)
Oct 14 08:55:04 np0005486759.ooo.test podman[106338]: 2025-10-14 08:55:04.230348197 +0000 UTC m=+0.220261811 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public)
Oct 14 08:55:04 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:55:04 np0005486759.ooo.test sudo[106336]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:04 np0005486759.ooo.test sudo[106403]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7zuk9nct/privsep.sock
Oct 14 08:55:04 np0005486759.ooo.test sudo[106403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:05 np0005486759.ooo.test systemd[1]: tmp-crun.WhLwgi.mount: Deactivated successfully.
Oct 14 08:55:05 np0005486759.ooo.test sudo[106403]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:05 np0005486759.ooo.test sudo[106414]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzsiedg15/privsep.sock
Oct 14 08:55:05 np0005486759.ooo.test sudo[106414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:06 np0005486759.ooo.test sudo[106414]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:06 np0005486759.ooo.test sudo[106425]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp86l1bch6/privsep.sock
Oct 14 08:55:06 np0005486759.ooo.test sudo[106425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:07 np0005486759.ooo.test sudo[106425]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:07 np0005486759.ooo.test sudo[106436]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbb_cllkd/privsep.sock
Oct 14 08:55:07 np0005486759.ooo.test sudo[106436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:07 np0005486759.ooo.test sudo[106436]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:08 np0005486759.ooo.test sudo[106447]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqeonc030/privsep.sock
Oct 14 08:55:08 np0005486759.ooo.test sudo[106447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:08 np0005486759.ooo.test sudo[106447]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:08 np0005486759.ooo.test sudo[106464]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyhdijn7v/privsep.sock
Oct 14 08:55:08 np0005486759.ooo.test sudo[106464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:09 np0005486759.ooo.test sudo[106464]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:09 np0005486759.ooo.test sudo[106475]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuoracpc7/privsep.sock
Oct 14 08:55:09 np0005486759.ooo.test sudo[106475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:10 np0005486759.ooo.test sudo[106475]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:10 np0005486759.ooo.test sudo[106486]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpukqa2rz5/privsep.sock
Oct 14 08:55:10 np0005486759.ooo.test sudo[106486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:11 np0005486759.ooo.test sudo[106486]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:55:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:55:11 np0005486759.ooo.test systemd[1]: tmp-crun.cuUYeE.mount: Deactivated successfully.
Oct 14 08:55:11 np0005486759.ooo.test podman[106490]: 2025-10-14 08:55:11.324222426 +0000 UTC m=+0.092415295 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9)
Oct 14 08:55:11 np0005486759.ooo.test podman[106492]: 2025-10-14 08:55:11.365027455 +0000 UTC m=+0.132232743 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, batch=17.1_20250721.1, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44)
Oct 14 08:55:11 np0005486759.ooo.test podman[106492]: 2025-10-14 08:55:11.386354369 +0000 UTC m=+0.153559697 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, vcs-type=git, architecture=x86_64, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12)
Oct 14 08:55:11 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:55:11 np0005486759.ooo.test podman[106490]: 2025-10-14 08:55:11.436925072 +0000 UTC m=+0.205117931 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 08:55:11 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:55:11 np0005486759.ooo.test sudo[106541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4350_vep/privsep.sock
Oct 14 08:55:11 np0005486759.ooo.test sudo[106541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:12 np0005486759.ooo.test sudo[106541]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:12 np0005486759.ooo.test sudo[106552]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5s307u58/privsep.sock
Oct 14 08:55:12 np0005486759.ooo.test sudo[106552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:12 np0005486759.ooo.test sudo[106552]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:13 np0005486759.ooo.test sudo[106563]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf_859m8f/privsep.sock
Oct 14 08:55:13 np0005486759.ooo.test sudo[106563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:13 np0005486759.ooo.test sudo[106563]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:14 np0005486759.ooo.test sudo[106580]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv6pzz02k/privsep.sock
Oct 14 08:55:14 np0005486759.ooo.test sudo[106580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:14 np0005486759.ooo.test sudo[106580]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:14 np0005486759.ooo.test sudo[106591]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuj4efqkz/privsep.sock
Oct 14 08:55:14 np0005486759.ooo.test sudo[106591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:15 np0005486759.ooo.test sudo[106591]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:15 np0005486759.ooo.test sudo[106602]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp76l_whn1/privsep.sock
Oct 14 08:55:15 np0005486759.ooo.test sudo[106602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:16 np0005486759.ooo.test sudo[106602]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:16 np0005486759.ooo.test sudo[106613]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp25hyegqw/privsep.sock
Oct 14 08:55:16 np0005486759.ooo.test sudo[106613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:17 np0005486759.ooo.test sudo[106613]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:17 np0005486759.ooo.test sudo[106624]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaxvbd4aq/privsep.sock
Oct 14 08:55:17 np0005486759.ooo.test sudo[106624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:18 np0005486759.ooo.test sudo[106624]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:18 np0005486759.ooo.test sudo[106635]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgxzoi0ly/privsep.sock
Oct 14 08:55:18 np0005486759.ooo.test sudo[106635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:18 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:55:18 np0005486759.ooo.test recover_tripleo_nova_virtqemud[106638]: 47951
Oct 14 08:55:18 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:55:18 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:55:19 np0005486759.ooo.test sudo[106635]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:19 np0005486759.ooo.test sudo[106653]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmposvuxjio/privsep.sock
Oct 14 08:55:19 np0005486759.ooo.test sudo[106653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:19 np0005486759.ooo.test sudo[106653]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:20 np0005486759.ooo.test sudo[106665]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8eqo2xgy/privsep.sock
Oct 14 08:55:20 np0005486759.ooo.test sudo[106665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:20 np0005486759.ooo.test sudo[106665]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:21 np0005486759.ooo.test sudo[106676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmporr72ot7/privsep.sock
Oct 14 08:55:21 np0005486759.ooo.test sudo[106676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:21 np0005486759.ooo.test sudo[106676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:21 np0005486759.ooo.test sudo[106687]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi8o2rik2/privsep.sock
Oct 14 08:55:21 np0005486759.ooo.test sudo[106687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:22 np0005486759.ooo.test sudo[106687]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:22 np0005486759.ooo.test sudo[106698]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpib9hrxqe/privsep.sock
Oct 14 08:55:22 np0005486759.ooo.test sudo[106698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:23 np0005486759.ooo.test sudo[106698]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:23 np0005486759.ooo.test sudo[106709]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeehnm3zy/privsep.sock
Oct 14 08:55:23 np0005486759.ooo.test sudo[106709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:24 np0005486759.ooo.test sudo[106709]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:24 np0005486759.ooo.test sudo[106725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqd1qwm_v/privsep.sock
Oct 14 08:55:24 np0005486759.ooo.test sudo[106725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:25 np0005486759.ooo.test sudo[106725]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:55:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:55:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:55:25 np0005486759.ooo.test podman[106732]: 2025-10-14 08:55:25.365351906 +0000 UTC m=+0.084620334 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, release=1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=iscsid)
Oct 14 08:55:25 np0005486759.ooo.test podman[106733]: 2025-10-14 08:55:25.422196104 +0000 UTC m=+0.138164838 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.buildah.version=1.33.12, distribution-scope=public, release=1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64)
Oct 14 08:55:25 np0005486759.ooo.test systemd[1]: tmp-crun.2N28o2.mount: Deactivated successfully.
Oct 14 08:55:25 np0005486759.ooo.test podman[106733]: 2025-10-14 08:55:25.478164064 +0000 UTC m=+0.194132768 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-type=git, distribution-scope=public, release=1, tcib_managed=true, config_id=tripleo_step5)
Oct 14 08:55:25 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:55:25 np0005486759.ooo.test podman[106732]: 2025-10-14 08:55:25.502082538 +0000 UTC m=+0.221350966 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:27:15, version=17.1.9)
Oct 14 08:55:25 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:55:25 np0005486759.ooo.test sudo[106802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphklribi6/privsep.sock
Oct 14 08:55:25 np0005486759.ooo.test podman[106734]: 2025-10-14 08:55:25.481022033 +0000 UTC m=+0.193405496 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, 
com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible)
Oct 14 08:55:25 np0005486759.ooo.test sudo[106802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:25 np0005486759.ooo.test podman[106734]: 2025-10-14 08:55:25.562434135 +0000 UTC m=+0.274817598 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2)
Oct 14 08:55:25 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:55:26 np0005486759.ooo.test sudo[106802]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:26 np0005486759.ooo.test sudo[106813]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsn55fang/privsep.sock
Oct 14 08:55:26 np0005486759.ooo.test sudo[106813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:27 np0005486759.ooo.test sudo[106813]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:27 np0005486759.ooo.test sudo[106824]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp26ezcway/privsep.sock
Oct 14 08:55:27 np0005486759.ooo.test sudo[106824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:27 np0005486759.ooo.test sudo[106824]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:28 np0005486759.ooo.test sudo[106835]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphvc2t7ah/privsep.sock
Oct 14 08:55:28 np0005486759.ooo.test sudo[106835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:28 np0005486759.ooo.test sudo[106835]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:55:28 np0005486759.ooo.test systemd[1]: tmp-crun.mIQJOk.mount: Deactivated successfully.
Oct 14 08:55:28 np0005486759.ooo.test podman[106840]: 2025-10-14 08:55:28.850119927 +0000 UTC m=+0.080874986 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, distribution-scope=public, 
batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, release=1, build-date=2025-07-21T13:07:59, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64)
Oct 14 08:55:29 np0005486759.ooo.test podman[106840]: 2025-10-14 08:55:29.053385229 +0000 UTC m=+0.284140218 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Oct 14 08:55:29 np0005486759.ooo.test sudo[106876]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwd9ztqhr/privsep.sock
Oct 14 08:55:29 np0005486759.ooo.test sudo[106876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:29 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:55:29 np0005486759.ooo.test sudo[106876]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:55:29 np0005486759.ooo.test podman[106882]: 2025-10-14 08:55:29.742005076 +0000 UTC m=+0.077638366 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-07-21T14:48:37, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.9)
Oct 14 08:55:29 np0005486759.ooo.test sudo[106913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa03g2v5o/privsep.sock
Oct 14 08:55:29 np0005486759.ooo.test sudo[106913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:30 np0005486759.ooo.test podman[106882]: 2025-10-14 08:55:30.051202422 +0000 UTC m=+0.386835652 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, version=17.1.9, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:55:30 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:55:30 np0005486759.ooo.test sudo[106913]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:30 np0005486759.ooo.test sudo[106928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8jz_petf/privsep.sock
Oct 14 08:55:30 np0005486759.ooo.test sudo[106928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:31 np0005486759.ooo.test sudo[106928]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:31 np0005486759.ooo.test sudo[106939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuid1zk0h/privsep.sock
Oct 14 08:55:31 np0005486759.ooo.test sudo[106939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:32 np0005486759.ooo.test sudo[106939]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:32 np0005486759.ooo.test sudo[106950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7639pbdi/privsep.sock
Oct 14 08:55:32 np0005486759.ooo.test sudo[106950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:32 np0005486759.ooo.test sudo[106950]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:33 np0005486759.ooo.test sudo[106961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplddc7k1j/privsep.sock
Oct 14 08:55:33 np0005486759.ooo.test sudo[106961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:33 np0005486759.ooo.test sudo[106961]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:34 np0005486759.ooo.test sudo[106972]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2z_iigcv/privsep.sock
Oct 14 08:55:34 np0005486759.ooo.test sudo[106972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: tmp-crun.yCMUUn.mount: Deactivated successfully.
Oct 14 08:55:34 np0005486759.ooo.test podman[106975]: 2025-10-14 08:55:34.462625414 +0000 UTC m=+0.092900060 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:55:34 np0005486759.ooo.test podman[106975]: 2025-10-14 08:55:34.468195917 +0000 UTC m=+0.098470533 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, io.k8s.description=Red 
Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-cron)
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:55:34 np0005486759.ooo.test podman[106976]: 2025-10-14 08:55:34.557726452 +0000 UTC m=+0.182655642 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, 
container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, managed_by=tripleo_ansible)
Oct 14 08:55:34 np0005486759.ooo.test podman[106976]: 2025-10-14 08:55:34.597259661 +0000 UTC m=+0.222188851 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, managed_by=tripleo_ansible, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=)
Oct 14 08:55:34 np0005486759.ooo.test podman[106976]: unhealthy
Oct 14 08:55:34 np0005486759.ooo.test podman[106977]: 2025-10-14 08:55:34.604910489 +0000 UTC m=+0.228062204 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:55:34 np0005486759.ooo.test podman[106977]: 2025-10-14 08:55:34.61716325 +0000 UTC m=+0.240314965 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, vendor=Red Hat, Inc., vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, version=17.1.9, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:55:34 np0005486759.ooo.test podman[106977]: unhealthy
Oct 14 08:55:34 np0005486759.ooo.test sudo[106972]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:55:34 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:55:34 np0005486759.ooo.test sudo[107039]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp597v_bg2/privsep.sock
Oct 14 08:55:34 np0005486759.ooo.test sudo[107039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:35 np0005486759.ooo.test sudo[107039]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:35 np0005486759.ooo.test systemd[1]: tmp-crun.XfeBbC.mount: Deactivated successfully.
Oct 14 08:55:35 np0005486759.ooo.test sudo[107056]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk018abw4/privsep.sock
Oct 14 08:55:35 np0005486759.ooo.test sudo[107056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:36 np0005486759.ooo.test sudo[107056]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:36 np0005486759.ooo.test sudo[107067]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgi_ktj5l/privsep.sock
Oct 14 08:55:36 np0005486759.ooo.test sudo[107067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:37 np0005486759.ooo.test sudo[107067]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:37 np0005486759.ooo.test sudo[107078]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnpqe5q2a/privsep.sock
Oct 14 08:55:37 np0005486759.ooo.test sudo[107078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:37 np0005486759.ooo.test sudo[107078]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:38 np0005486759.ooo.test sudo[107089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0a_yfjr2/privsep.sock
Oct 14 08:55:38 np0005486759.ooo.test sudo[107089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:38 np0005486759.ooo.test sudo[107089]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:39 np0005486759.ooo.test sudo[107100]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1lg3ickc/privsep.sock
Oct 14 08:55:39 np0005486759.ooo.test sudo[107100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:39 np0005486759.ooo.test sudo[107100]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:39 np0005486759.ooo.test sudo[107111]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcmnsqsy9/privsep.sock
Oct 14 08:55:39 np0005486759.ooo.test sudo[107111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:40 np0005486759.ooo.test sudo[107111]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:40 np0005486759.ooo.test sudo[107125]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq1b6kgu1/privsep.sock
Oct 14 08:55:40 np0005486759.ooo.test sudo[107125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:41 np0005486759.ooo.test sudo[107125]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:41 np0005486759.ooo.test sudo[107139]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpciggy0sq/privsep.sock
Oct 14 08:55:41 np0005486759.ooo.test sudo[107139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:55:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:55:41 np0005486759.ooo.test systemd[1]: tmp-crun.mmy0Uq.mount: Deactivated successfully.
Oct 14 08:55:41 np0005486759.ooo.test podman[107142]: 2025-10-14 08:55:41.716297903 +0000 UTC m=+0.075925572 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, release=1)
Oct 14 08:55:41 np0005486759.ooo.test systemd[1]: tmp-crun.xBxXi9.mount: Deactivated successfully.
Oct 14 08:55:41 np0005486759.ooo.test podman[107141]: 2025-10-14 08:55:41.727581654 +0000 UTC m=+0.085503280 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:55:41 np0005486759.ooo.test podman[107141]: 2025-10-14 08:55:41.755205103 +0000 UTC m=+0.113126709 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9)
Oct 14 08:55:41 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:55:41 np0005486759.ooo.test podman[107142]: 2025-10-14 08:55:41.776361491 +0000 UTC m=+0.135989190 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, build-date=2025-07-21T13:28:44, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Oct 14 08:55:41 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:55:42 np0005486759.ooo.test sudo[107139]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:42 np0005486759.ooo.test sudo[107196]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprxhoyp79/privsep.sock
Oct 14 08:55:42 np0005486759.ooo.test sudo[107196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:43 np0005486759.ooo.test sudo[107196]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:43 np0005486759.ooo.test sudo[107207]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ixjts9d/privsep.sock
Oct 14 08:55:43 np0005486759.ooo.test sudo[107207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:44 np0005486759.ooo.test sudo[107207]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:44 np0005486759.ooo.test sudo[107218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptqk_ua2b/privsep.sock
Oct 14 08:55:44 np0005486759.ooo.test sudo[107218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:44 np0005486759.ooo.test sudo[107218]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:45 np0005486759.ooo.test sudo[107229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqnqc_7rq/privsep.sock
Oct 14 08:55:45 np0005486759.ooo.test sudo[107229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:45 np0005486759.ooo.test sudo[107229]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:46 np0005486759.ooo.test sudo[107241]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyc1ytmg9/privsep.sock
Oct 14 08:55:46 np0005486759.ooo.test sudo[107241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:46 np0005486759.ooo.test sudo[107241]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:46 np0005486759.ooo.test sudo[107257]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvssvex4n/privsep.sock
Oct 14 08:55:46 np0005486759.ooo.test sudo[107257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:47 np0005486759.ooo.test sudo[107257]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:47 np0005486759.ooo.test sudo[107268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppf4afi8b/privsep.sock
Oct 14 08:55:47 np0005486759.ooo.test sudo[107268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:48 np0005486759.ooo.test sudo[107268]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:48 np0005486759.ooo.test sudo[107279]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpykmxii5k/privsep.sock
Oct 14 08:55:48 np0005486759.ooo.test sudo[107279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:49 np0005486759.ooo.test sudo[107279]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:49 np0005486759.ooo.test sudo[107290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnzt9rlsn/privsep.sock
Oct 14 08:55:49 np0005486759.ooo.test sudo[107290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:49 np0005486759.ooo.test sudo[107290]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:50 np0005486759.ooo.test sudo[107301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo8urs84m/privsep.sock
Oct 14 08:55:50 np0005486759.ooo.test sudo[107301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:50 np0005486759.ooo.test sudo[107301]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:50 np0005486759.ooo.test sudo[107312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwukkr1bt/privsep.sock
Oct 14 08:55:50 np0005486759.ooo.test sudo[107312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:51 np0005486759.ooo.test sudo[107312]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:51 np0005486759.ooo.test sudo[107329]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptvcdts9s/privsep.sock
Oct 14 08:55:51 np0005486759.ooo.test sudo[107329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:52 np0005486759.ooo.test sudo[107329]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:52 np0005486759.ooo.test sudo[107340]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7_ofkuu_/privsep.sock
Oct 14 08:55:52 np0005486759.ooo.test sudo[107340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:53 np0005486759.ooo.test sudo[107340]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:53 np0005486759.ooo.test sudo[107351]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc5m0pv8i/privsep.sock
Oct 14 08:55:53 np0005486759.ooo.test sudo[107351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:54 np0005486759.ooo.test sudo[107351]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:54 np0005486759.ooo.test sudo[107362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu2lfy363/privsep.sock
Oct 14 08:55:54 np0005486759.ooo.test sudo[107362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:55 np0005486759.ooo.test sudo[107362]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:55 np0005486759.ooo.test sudo[107373]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdgj_0njo/privsep.sock
Oct 14 08:55:55 np0005486759.ooo.test sudo[107373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:55 np0005486759.ooo.test sudo[107373]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:55:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:55:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:55:55 np0005486759.ooo.test systemd[1]: tmp-crun.XR6MCH.mount: Deactivated successfully.
Oct 14 08:55:55 np0005486759.ooo.test podman[107378]: 2025-10-14 08:55:55.996516068 +0000 UTC m=+0.080306489 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true)
Oct 14 08:55:56 np0005486759.ooo.test podman[107380]: 2025-10-14 08:55:56.054298455 +0000 UTC m=+0.134357579 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Oct 14 08:55:56 np0005486759.ooo.test podman[107378]: 2025-10-14 08:55:56.0827421 +0000 UTC m=+0.166532501 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, version=17.1.9, distribution-scope=public, name=rhosp17/openstack-iscsid, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, container_name=iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Oct 14 08:55:56 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:55:56 np0005486759.ooo.test podman[107381]: 2025-10-14 08:55:56.096829479 +0000 UTC m=+0.174842780 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, container_name=collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9)
Oct 14 08:55:56 np0005486759.ooo.test podman[107381]: 2025-10-14 08:55:56.103809575 +0000 UTC m=+0.181822846 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b)
Oct 14 08:55:56 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:55:56 np0005486759.ooo.test podman[107380]: 2025-10-14 08:55:56.1296921 +0000 UTC m=+0.209751194 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step5, vcs-type=git, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:55:56 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:55:56 np0005486759.ooo.test sudo[107445]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyzr2l5ax/privsep.sock
Oct 14 08:55:56 np0005486759.ooo.test sudo[107445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:56 np0005486759.ooo.test sudo[107445]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:57 np0005486759.ooo.test sudo[107461]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8yxzqnhb/privsep.sock
Oct 14 08:55:57 np0005486759.ooo.test sudo[107461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:57 np0005486759.ooo.test sudo[107461]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:57 np0005486759.ooo.test sudo[107473]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx0poxhrv/privsep.sock
Oct 14 08:55:57 np0005486759.ooo.test sudo[107473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:58 np0005486759.ooo.test sudo[107473]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:58 np0005486759.ooo.test sudo[107484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiajv11ck/privsep.sock
Oct 14 08:55:58 np0005486759.ooo.test sudo[107484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:55:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:55:59 np0005486759.ooo.test sudo[107484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:55:59 np0005486759.ooo.test systemd[1]: tmp-crun.9UisMz.mount: Deactivated successfully.
Oct 14 08:55:59 np0005486759.ooo.test podman[107488]: 2025-10-14 08:55:59.444052642 +0000 UTC m=+0.074053105 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Oct 14 08:55:59 np0005486759.ooo.test podman[107488]: 2025-10-14 08:55:59.635885648 +0000 UTC m=+0.265886071 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, build-date=2025-07-21T13:07:59, version=17.1.9)
Oct 14 08:55:59 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:55:59 np0005486759.ooo.test sudo[107524]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0y4wgi4g/privsep.sock
Oct 14 08:55:59 np0005486759.ooo.test sudo[107524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:00 np0005486759.ooo.test sudo[107524]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:56:00 np0005486759.ooo.test systemd[1]: tmp-crun.gKNvkE.mount: Deactivated successfully.
Oct 14 08:56:00 np0005486759.ooo.test podman[107530]: 2025-10-14 08:56:00.368145662 +0000 UTC m=+0.090999871 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, io.buildah.version=1.33.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:56:00 np0005486759.ooo.test sudo[107556]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuf1noj7c/privsep.sock
Oct 14 08:56:00 np0005486759.ooo.test sudo[107556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:00 np0005486759.ooo.test podman[107530]: 2025-10-14 08:56:00.768019539 +0000 UTC m=+0.490873728 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, container_name=nova_migration_target, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9)
Oct 14 08:56:00 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:56:01 np0005486759.ooo.test sudo[107556]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:01 np0005486759.ooo.test sudo[107570]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpooriw966/privsep.sock
Oct 14 08:56:01 np0005486759.ooo.test sudo[107570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:01 np0005486759.ooo.test sudo[107570]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:02 np0005486759.ooo.test sudo[107583]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgjfgyd3g/privsep.sock
Oct 14 08:56:02 np0005486759.ooo.test sudo[107583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:02 np0005486759.ooo.test sudo[107583]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:03 np0005486759.ooo.test sudo[107598]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcgx0dcue/privsep.sock
Oct 14 08:56:03 np0005486759.ooo.test sudo[107598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:03 np0005486759.ooo.test sudo[107598]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:04 np0005486759.ooo.test sudo[107609]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptujq_vgj/privsep.sock
Oct 14 08:56:04 np0005486759.ooo.test sudo[107609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:04 np0005486759.ooo.test sudo[107609]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:56:04 np0005486759.ooo.test podman[107615]: 2025-10-14 08:56:04.775300681 +0000 UTC m=+0.062317580 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, version=17.1.9, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, architecture=x86_64, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 
cron, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 08:56:04 np0005486759.ooo.test podman[107616]: 2025-10-14 08:56:04.825661338 +0000 UTC m=+0.108566108 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, architecture=x86_64, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git)
Oct 14 08:56:04 np0005486759.ooo.test podman[107620]: 2025-10-14 08:56:04.885313983 +0000 UTC m=+0.162282569 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, release=1, architecture=x86_64, version=17.1.9, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12)
Oct 14 08:56:04 np0005486759.ooo.test podman[107620]: 2025-10-14 08:56:04.901720713 +0000 UTC m=+0.178689289 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9)
Oct 14 08:56:04 np0005486759.ooo.test podman[107620]: unhealthy
Oct 14 08:56:04 np0005486759.ooo.test podman[107615]: 2025-10-14 08:56:04.907939636 +0000 UTC m=+0.194956505 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1, version=17.1.9, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, vcs-type=git, batch=17.1_20250721.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, tcib_managed=true)
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:56:04 np0005486759.ooo.test podman[107616]: 2025-10-14 08:56:04.9161087 +0000 UTC m=+0.199013520 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, release=1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:56:04 np0005486759.ooo.test podman[107616]: unhealthy
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:56:04 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:56:04 np0005486759.ooo.test sudo[107677]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwmha406s/privsep.sock
Oct 14 08:56:04 np0005486759.ooo.test sudo[107677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:05 np0005486759.ooo.test sudo[107677]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:05 np0005486759.ooo.test systemd[1]: tmp-crun.9cFJ3w.mount: Deactivated successfully.
Oct 14 08:56:05 np0005486759.ooo.test sudo[107688]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9q8ykaln/privsep.sock
Oct 14 08:56:05 np0005486759.ooo.test sudo[107688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:06 np0005486759.ooo.test sudo[107688]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:06 np0005486759.ooo.test sudo[107699]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprjkwfcgy/privsep.sock
Oct 14 08:56:06 np0005486759.ooo.test sudo[107699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:07 np0005486759.ooo.test sudo[107699]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:07 np0005486759.ooo.test sudo[107710]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkv6aiswe/privsep.sock
Oct 14 08:56:07 np0005486759.ooo.test sudo[107710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:08 np0005486759.ooo.test sudo[107710]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:08 np0005486759.ooo.test sudo[107727]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgvb0011x/privsep.sock
Oct 14 08:56:08 np0005486759.ooo.test sudo[107727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:09 np0005486759.ooo.test sudo[107727]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:09 np0005486759.ooo.test sudo[107738]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw5ikav7a/privsep.sock
Oct 14 08:56:09 np0005486759.ooo.test sudo[107738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:10 np0005486759.ooo.test sudo[107738]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:10 np0005486759.ooo.test sudo[107749]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsb_71e1m/privsep.sock
Oct 14 08:56:10 np0005486759.ooo.test sudo[107749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:10 np0005486759.ooo.test sudo[107749]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:11 np0005486759.ooo.test sudo[107760]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnae06p1h/privsep.sock
Oct 14 08:56:11 np0005486759.ooo.test sudo[107760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:11 np0005486759.ooo.test sudo[107760]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:11 np0005486759.ooo.test sudo[107771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6fqw_gdj/privsep.sock
Oct 14 08:56:11 np0005486759.ooo.test sudo[107771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:56:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:56:12 np0005486759.ooo.test podman[107774]: 2025-10-14 08:56:12.069116739 +0000 UTC m=+0.068286515 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:56:12 np0005486759.ooo.test podman[107774]: 2025-10-14 08:56:12.113472988 +0000 UTC m=+0.112642794 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1, io.openshift.expose-services=)
Oct 14 08:56:12 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:56:12 np0005486759.ooo.test systemd[1]: tmp-crun.XhkUfv.mount: Deactivated successfully.
Oct 14 08:56:12 np0005486759.ooo.test podman[107773]: 2025-10-14 08:56:12.17204517 +0000 UTC m=+0.173544188 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9)
Oct 14 08:56:12 np0005486759.ooo.test podman[107773]: 2025-10-14 08:56:12.231288302 +0000 UTC m=+0.232787570 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_metadata_agent, 
distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1)
Oct 14 08:56:12 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:56:12 np0005486759.ooo.test sudo[107771]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:12 np0005486759.ooo.test sudo[107829]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1p189zie/privsep.sock
Oct 14 08:56:12 np0005486759.ooo.test sudo[107829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:13 np0005486759.ooo.test sudo[107829]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:13 np0005486759.ooo.test sudo[107846]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp33yso7fh/privsep.sock
Oct 14 08:56:13 np0005486759.ooo.test sudo[107846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:14 np0005486759.ooo.test sudo[107846]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:14 np0005486759.ooo.test sudo[107857]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1sm7_84o/privsep.sock
Oct 14 08:56:14 np0005486759.ooo.test sudo[107857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:15 np0005486759.ooo.test sudo[107857]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:15 np0005486759.ooo.test sudo[107868]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcdz1a_co/privsep.sock
Oct 14 08:56:15 np0005486759.ooo.test sudo[107868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:16 np0005486759.ooo.test sudo[107868]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:16 np0005486759.ooo.test sudo[107879]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1w63uch7/privsep.sock
Oct 14 08:56:16 np0005486759.ooo.test sudo[107879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:17 np0005486759.ooo.test sudo[107879]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:17 np0005486759.ooo.test sudo[107890]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa9xl3ery/privsep.sock
Oct 14 08:56:17 np0005486759.ooo.test sudo[107890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:17 np0005486759.ooo.test sudo[107890]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:18 np0005486759.ooo.test sudo[107901]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1u50htyb/privsep.sock
Oct 14 08:56:18 np0005486759.ooo.test sudo[107901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:18 np0005486759.ooo.test sudo[107901]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:18 np0005486759.ooo.test sudo[107918]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe93_gavf/privsep.sock
Oct 14 08:56:18 np0005486759.ooo.test sudo[107918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:19 np0005486759.ooo.test sudo[107918]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:19 np0005486759.ooo.test sudo[107929]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyso6mq9m/privsep.sock
Oct 14 08:56:19 np0005486759.ooo.test sudo[107929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:20 np0005486759.ooo.test sudo[107929]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:20 np0005486759.ooo.test sudo[107940]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv599l9cz/privsep.sock
Oct 14 08:56:20 np0005486759.ooo.test sudo[107940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:21 np0005486759.ooo.test sudo[107940]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:21 np0005486759.ooo.test sudo[107951]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy7ftws8p/privsep.sock
Oct 14 08:56:21 np0005486759.ooo.test sudo[107951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:22 np0005486759.ooo.test sudo[107951]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:22 np0005486759.ooo.test sudo[107962]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphvm8axu_/privsep.sock
Oct 14 08:56:22 np0005486759.ooo.test sudo[107962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:23 np0005486759.ooo.test sudo[107962]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:23 np0005486759.ooo.test sudo[107973]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp82lyjopa/privsep.sock
Oct 14 08:56:23 np0005486759.ooo.test sudo[107973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:23 np0005486759.ooo.test sudo[107973]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:24 np0005486759.ooo.test sudo[107989]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwxafke9u/privsep.sock
Oct 14 08:56:24 np0005486759.ooo.test sudo[107989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:24 np0005486759.ooo.test sudo[107989]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:24 np0005486759.ooo.test sudo[108001]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl1fj47bj/privsep.sock
Oct 14 08:56:24 np0005486759.ooo.test sudo[108001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:25 np0005486759.ooo.test sudo[108001]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:25 np0005486759.ooo.test sudo[108012]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0tqwp17y/privsep.sock
Oct 14 08:56:25 np0005486759.ooo.test sudo[108012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:56:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:56:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:56:26 np0005486759.ooo.test systemd[1]: tmp-crun.6KdNw6.mount: Deactivated successfully.
Oct 14 08:56:26 np0005486759.ooo.test podman[108015]: 2025-10-14 08:56:26.451280124 +0000 UTC m=+0.077731979 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, release=1, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:56:26 np0005486759.ooo.test sudo[108012]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:26 np0005486759.ooo.test podman[108015]: 2025-10-14 08:56:26.487361576 +0000 UTC m=+0.113813431 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, distribution-scope=public)
Oct 14 08:56:26 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:56:26 np0005486759.ooo.test podman[108017]: 2025-10-14 08:56:26.491830705 +0000 UTC m=+0.115018468 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, release=2, build-date=2025-07-21T13:04:03, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, vcs-type=git, name=rhosp17/openstack-collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:56:26 np0005486759.ooo.test podman[108016]: 2025-10-14 08:56:26.55726696 +0000 UTC m=+0.180671720 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vcs-type=git, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc.)
Oct 14 08:56:26 np0005486759.ooo.test podman[108017]: 2025-10-14 08:56:26.577271322 +0000 UTC m=+0.200459115 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Oct 14 08:56:26 np0005486759.ooo.test podman[108016]: 2025-10-14 08:56:26.586336054 +0000 UTC m=+0.209740814 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_compute, version=17.1.9, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, managed_by=tripleo_ansible)
Oct 14 08:56:26 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:56:26 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:56:26 np0005486759.ooo.test sudo[108089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoj0inxw8/privsep.sock
Oct 14 08:56:26 np0005486759.ooo.test sudo[108089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:27 np0005486759.ooo.test sudo[108089]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:27 np0005486759.ooo.test sudo[108100]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph_3gke85/privsep.sock
Oct 14 08:56:27 np0005486759.ooo.test sudo[108100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:28 np0005486759.ooo.test sudo[108100]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:28 np0005486759.ooo.test sudo[108111]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpivxh8aeg/privsep.sock
Oct 14 08:56:28 np0005486759.ooo.test sudo[108111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:29 np0005486759.ooo.test sudo[108111]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:29 np0005486759.ooo.test sudo[108127]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmw_nd_z4/privsep.sock
Oct 14 08:56:29 np0005486759.ooo.test sudo[108127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:30 np0005486759.ooo.test sudo[108127]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:56:30 np0005486759.ooo.test podman[108134]: 2025-10-14 08:56:30.225555629 +0000 UTC m=+0.085657525 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Oct 14 08:56:30 np0005486759.ooo.test sudo[108169]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc0eypci5/privsep.sock
Oct 14 08:56:30 np0005486759.ooo.test sudo[108169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:30 np0005486759.ooo.test podman[108134]: 2025-10-14 08:56:30.460279579 +0000 UTC m=+0.320381485 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, tcib_managed=true)
Oct 14 08:56:30 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:56:30 np0005486759.ooo.test sudo[108169]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:56:31 np0005486759.ooo.test systemd[1]: tmp-crun.4QwgiN.mount: Deactivated successfully.
Oct 14 08:56:31 np0005486759.ooo.test podman[108174]: 2025-10-14 08:56:31.07410451 +0000 UTC m=+0.083827638 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, release=1, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9)
Oct 14 08:56:31 np0005486759.ooo.test sudo[108203]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd3uh7zdg/privsep.sock
Oct 14 08:56:31 np0005486759.ooo.test sudo[108203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:31 np0005486759.ooo.test podman[108174]: 2025-10-14 08:56:31.392871594 +0000 UTC m=+0.402594662 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vendor=Red Hat, Inc., version=17.1.9, managed_by=tripleo_ansible, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12)
Oct 14 08:56:31 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:56:31 np0005486759.ooo.test sudo[108203]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:32 np0005486759.ooo.test sudo[108214]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy7hd3o52/privsep.sock
Oct 14 08:56:32 np0005486759.ooo.test sudo[108214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:32 np0005486759.ooo.test sudo[108214]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:33 np0005486759.ooo.test sudo[108225]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz8ejlln8/privsep.sock
Oct 14 08:56:33 np0005486759.ooo.test sudo[108225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:33 np0005486759.ooo.test sudo[108225]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:34 np0005486759.ooo.test sudo[108236]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw1745ozo/privsep.sock
Oct 14 08:56:34 np0005486759.ooo.test sudo[108236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:34 np0005486759.ooo.test sudo[108236]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:34 np0005486759.ooo.test sudo[108252]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0i0803xd/privsep.sock
Oct 14 08:56:34 np0005486759.ooo.test sudo[108252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:56:35 np0005486759.ooo.test sudo[108252]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:35 np0005486759.ooo.test podman[108258]: 2025-10-14 08:56:35.449417888 +0000 UTC m=+0.067423977 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:56:35 np0005486759.ooo.test podman[108258]: 2025-10-14 08:56:35.463813696 +0000 UTC m=+0.081819805 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:56:35 np0005486759.ooo.test podman[108258]: unhealthy
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: tmp-crun.tPOsIM.mount: Deactivated successfully.
Oct 14 08:56:35 np0005486759.ooo.test podman[108259]: 2025-10-14 08:56:35.518712254 +0000 UTC m=+0.135024430 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:56:35 np0005486759.ooo.test podman[108257]: 2025-10-14 08:56:35.565420657 +0000 UTC m=+0.190330651 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1)
Oct 14 08:56:35 np0005486759.ooo.test podman[108257]: 2025-10-14 08:56:35.574434027 +0000 UTC m=+0.199344021 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.33.12, version=17.1.9, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, batch=17.1_20250721.1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c)
Oct 14 08:56:35 np0005486759.ooo.test podman[108259]: 2025-10-14 08:56:35.583429657 +0000 UTC m=+0.199741883 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:56:35 np0005486759.ooo.test podman[108259]: unhealthy
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:56:35 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:56:35 np0005486759.ooo.test sudo[108321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3n8cftcw/privsep.sock
Oct 14 08:56:35 np0005486759.ooo.test sudo[108321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:36 np0005486759.ooo.test sudo[108321]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:36 np0005486759.ooo.test sudo[108332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdif7nr48/privsep.sock
Oct 14 08:56:36 np0005486759.ooo.test sudo[108332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:37 np0005486759.ooo.test sudo[108332]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:37 np0005486759.ooo.test sudo[108343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn0mwd328/privsep.sock
Oct 14 08:56:37 np0005486759.ooo.test sudo[108343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:37 np0005486759.ooo.test sudo[108343]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:38 np0005486759.ooo.test sudo[108354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfncdi1or/privsep.sock
Oct 14 08:56:38 np0005486759.ooo.test sudo[108354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:38 np0005486759.ooo.test sudo[108354]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:39 np0005486759.ooo.test sudo[108365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu30y61c1/privsep.sock
Oct 14 08:56:39 np0005486759.ooo.test sudo[108365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:39 np0005486759.ooo.test sudo[108365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:39 np0005486759.ooo.test sudo[108376]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvqsqgdvo/privsep.sock
Oct 14 08:56:39 np0005486759.ooo.test sudo[108376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:40 np0005486759.ooo.test sudo[108376]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:40 np0005486759.ooo.test sudo[108393]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc95yagil/privsep.sock
Oct 14 08:56:40 np0005486759.ooo.test sudo[108393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:41 np0005486759.ooo.test sudo[108393]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:41 np0005486759.ooo.test sudo[108404]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp5vuh8k2/privsep.sock
Oct 14 08:56:41 np0005486759.ooo.test sudo[108404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:42 np0005486759.ooo.test sudo[108404]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:56:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:56:42 np0005486759.ooo.test podman[108408]: 2025-10-14 08:56:42.390034082 +0000 UTC m=+0.066162649 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, release=1, version=17.1.9)
Oct 14 08:56:42 np0005486759.ooo.test systemd[1]: tmp-crun.Ls7D19.mount: Deactivated successfully.
Oct 14 08:56:42 np0005486759.ooo.test podman[108411]: 2025-10-14 08:56:42.448425508 +0000 UTC m=+0.120494798 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T13:28:44)
Oct 14 08:56:42 np0005486759.ooo.test podman[108408]: 2025-10-14 08:56:42.488268697 +0000 UTC m=+0.164397194 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.9, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:56:42 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:56:42 np0005486759.ooo.test podman[108411]: 2025-10-14 08:56:42.539298494 +0000 UTC m=+0.211367794 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller)
Oct 14 08:56:42 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Deactivated successfully.
Oct 14 08:56:42 np0005486759.ooo.test sudo[108465]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfvllsm01/privsep.sock
Oct 14 08:56:42 np0005486759.ooo.test sudo[108465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:43 np0005486759.ooo.test sudo[108465]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:43 np0005486759.ooo.test sudo[108476]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp94g9286_/privsep.sock
Oct 14 08:56:43 np0005486759.ooo.test sudo[108476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:44 np0005486759.ooo.test sudo[108476]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:44 np0005486759.ooo.test sudo[108487]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf0c3olq9/privsep.sock
Oct 14 08:56:44 np0005486759.ooo.test sudo[108487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:44 np0005486759.ooo.test sudo[108487]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:45 np0005486759.ooo.test sudo[108498]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_u21l9ry/privsep.sock
Oct 14 08:56:45 np0005486759.ooo.test sudo[108498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:45 np0005486759.ooo.test sudo[108498]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:45 np0005486759.ooo.test sudo[108515]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0nlwd6l9/privsep.sock
Oct 14 08:56:45 np0005486759.ooo.test sudo[108515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:46 np0005486759.ooo.test sudo[108515]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:46 np0005486759.ooo.test sudo[108526]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0s1xvagw/privsep.sock
Oct 14 08:56:46 np0005486759.ooo.test sudo[108526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:47 np0005486759.ooo.test sudo[108526]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:47 np0005486759.ooo.test sudo[108537]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnis8c4k5/privsep.sock
Oct 14 08:56:47 np0005486759.ooo.test sudo[108537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:48 np0005486759.ooo.test sudo[108537]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:48 np0005486759.ooo.test sudo[108548]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppp1ro8v7/privsep.sock
Oct 14 08:56:48 np0005486759.ooo.test sudo[108548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:48 np0005486759.ooo.test sudo[108548]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:49 np0005486759.ooo.test sudo[108559]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1ogtlwq4/privsep.sock
Oct 14 08:56:49 np0005486759.ooo.test sudo[108559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:49 np0005486759.ooo.test sudo[108559]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:50 np0005486759.ooo.test sudo[108570]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0km64ghe/privsep.sock
Oct 14 08:56:50 np0005486759.ooo.test sudo[108570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:50 np0005486759.ooo.test sudo[108570]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:50 np0005486759.ooo.test sudo[108584]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw9r31zfb/privsep.sock
Oct 14 08:56:50 np0005486759.ooo.test sudo[108584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:51 np0005486759.ooo.test sudo[108584]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:51 np0005486759.ooo.test sudo[108598]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxnarhbgb/privsep.sock
Oct 14 08:56:51 np0005486759.ooo.test sudo[108598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:52 np0005486759.ooo.test sudo[108598]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:52 np0005486759.ooo.test sudo[108609]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpplipbmxz/privsep.sock
Oct 14 08:56:52 np0005486759.ooo.test sudo[108609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:53 np0005486759.ooo.test sudo[108609]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:53 np0005486759.ooo.test sudo[108620]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnv0bfflu/privsep.sock
Oct 14 08:56:53 np0005486759.ooo.test sudo[108620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:54 np0005486759.ooo.test sudo[108620]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:54 np0005486759.ooo.test sudo[108631]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1n1z6qr5/privsep.sock
Oct 14 08:56:54 np0005486759.ooo.test sudo[108631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:55 np0005486759.ooo.test sudo[108631]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:55 np0005486759.ooo.test sudo[108642]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcyfijby1/privsep.sock
Oct 14 08:56:55 np0005486759.ooo.test sudo[108642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:55 np0005486759.ooo.test sudo[108642]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:56 np0005486759.ooo.test sudo[108655]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsguohn2z/privsep.sock
Oct 14 08:56:56 np0005486759.ooo.test sudo[108655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:56 np0005486759.ooo.test sudo[108655]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:56:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:56:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:56:56 np0005486759.ooo.test podman[108667]: 2025-10-14 08:56:56.867393578 +0000 UTC m=+0.072540637 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, 
name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, version=17.1.9, config_id=tripleo_step3, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:56:56 np0005486759.ooo.test podman[108667]: 2025-10-14 08:56:56.874318563 +0000 UTC m=+0.079465652 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 
17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03)
Oct 14 08:56:56 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:56:56 np0005486759.ooo.test podman[108666]: 2025-10-14 08:56:56.915937957 +0000 UTC m=+0.120260051 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20250721.1, version=17.1.9, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Oct 14 08:56:56 np0005486759.ooo.test podman[108666]: 2025-10-14 08:56:56.932461241 +0000 UTC m=+0.136783345 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.33.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5)
Oct 14 08:56:56 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:56:56 np0005486759.ooo.test podman[108664]: 2025-10-14 08:56:56.973099695 +0000 UTC m=+0.176196051 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1)
Oct 14 08:56:56 np0005486759.ooo.test podman[108664]: 2025-10-14 08:56:56.983199239 +0000 UTC m=+0.186295575 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Oct 14 08:56:56 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:56:57 np0005486759.ooo.test sudo[108731]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl2f4cidv/privsep.sock
Oct 14 08:56:57 np0005486759.ooo.test sudo[108731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:57 np0005486759.ooo.test sudo[108731]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:57 np0005486759.ooo.test sudo[108742]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6m0pl6vd/privsep.sock
Oct 14 08:56:57 np0005486759.ooo.test sudo[108742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:58 np0005486759.ooo.test sudo[108742]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:58 np0005486759.ooo.test sudo[108753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph_xhrj32/privsep.sock
Oct 14 08:56:58 np0005486759.ooo.test sudo[108753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:56:59 np0005486759.ooo.test sudo[108753]: pam_unix(sudo:session): session closed for user root
Oct 14 08:56:59 np0005486759.ooo.test sudo[108764]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeu3sgn38/privsep.sock
Oct 14 08:56:59 np0005486759.ooo.test sudo[108764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:00 np0005486759.ooo.test sudo[108764]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:00 np0005486759.ooo.test sudo[108775]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3wqyde79/privsep.sock
Oct 14 08:57:00 np0005486759.ooo.test sudo[108775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:57:00 np0005486759.ooo.test systemd[1]: tmp-crun.M8TTJW.mount: Deactivated successfully.
Oct 14 08:57:00 np0005486759.ooo.test podman[108777]: 2025-10-14 08:57:00.836029687 +0000 UTC m=+0.096355267 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step1, io.buildah.version=1.33.12, release=1)
Oct 14 08:57:01 np0005486759.ooo.test podman[108777]: 2025-10-14 08:57:01.007599044 +0000 UTC m=+0.267924614 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, config_id=tripleo_step1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 08:57:01 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:57:01 np0005486759.ooo.test sudo[108775]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:01 np0005486759.ooo.test sudo[108817]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1bn14x1w/privsep.sock
Oct 14 08:57:01 np0005486759.ooo.test sudo[108817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:57:01 np0005486759.ooo.test podman[108819]: 2025-10-14 08:57:01.681562765 +0000 UTC m=+0.069082960 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:57:01 np0005486759.ooo.test systemd[1]: tmp-crun.9azEQd.mount: Deactivated successfully.
Oct 14 08:57:02 np0005486759.ooo.test podman[108819]: 2025-10-14 08:57:02.102590039 +0000 UTC m=+0.490110234 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, container_name=nova_migration_target, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 08:57:02 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:57:02 np0005486759.ooo.test sudo[108817]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:02 np0005486759.ooo.test sudo[108855]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1wqh1n2g/privsep.sock
Oct 14 08:57:02 np0005486759.ooo.test sudo[108855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:03 np0005486759.ooo.test sudo[108855]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:03 np0005486759.ooo.test sudo[108866]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4l9jlu5e/privsep.sock
Oct 14 08:57:03 np0005486759.ooo.test sudo[108866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:04 np0005486759.ooo.test sudo[108866]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:04 np0005486759.ooo.test sudo[108877]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5di91q_w/privsep.sock
Oct 14 08:57:04 np0005486759.ooo.test sudo[108877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:05 np0005486759.ooo.test sudo[108877]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:05 np0005486759.ooo.test sudo[108888]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5913d_93/privsep.sock
Oct 14 08:57:05 np0005486759.ooo.test sudo[108888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:05 np0005486759.ooo.test sudo[108888]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:57:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:57:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:57:06 np0005486759.ooo.test systemd[1]: tmp-crun.UrihJE.mount: Deactivated successfully.
Oct 14 08:57:06 np0005486759.ooo.test podman[108895]: 2025-10-14 08:57:06.014363481 +0000 UTC m=+0.090446164 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., release=1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Oct 14 08:57:06 np0005486759.ooo.test podman[108893]: 2025-10-14 08:57:06.057125801 +0000 UTC m=+0.137030163 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, release=1, config_id=tripleo_step4, container_name=logrotate_crond, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible)
Oct 14 08:57:06 np0005486759.ooo.test podman[108895]: 2025-10-14 08:57:06.08637103 +0000 UTC m=+0.162453753 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, vcs-type=git, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.9)
Oct 14 08:57:06 np0005486759.ooo.test podman[108895]: unhealthy
Oct 14 08:57:06 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:57:06 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:57:06 np0005486759.ooo.test sudo[108949]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmiji3gu0/privsep.sock
Oct 14 08:57:06 np0005486759.ooo.test podman[108893]: 2025-10-14 08:57:06.144401575 +0000 UTC m=+0.224305917 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, version=17.1.9, release=1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4)
Oct 14 08:57:06 np0005486759.ooo.test sudo[108949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:06 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:57:06 np0005486759.ooo.test podman[108896]: 2025-10-14 08:57:06.163123248 +0000 UTC m=+0.237863239 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, release=1, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.9, distribution-scope=public)
Oct 14 08:57:06 np0005486759.ooo.test podman[108896]: 2025-10-14 08:57:06.183426589 +0000 UTC m=+0.258166560 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., version=17.1.9, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, architecture=x86_64)
Oct 14 08:57:06 np0005486759.ooo.test podman[108896]: unhealthy
Oct 14 08:57:06 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:57:06 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:57:06 np0005486759.ooo.test sudo[108949]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:07 np0005486759.ooo.test sudo[108969]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_76mgkzc/privsep.sock
Oct 14 08:57:07 np0005486759.ooo.test sudo[108969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:07 np0005486759.ooo.test sudo[108969]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:07 np0005486759.ooo.test sudo[108984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbzmlkkjg/privsep.sock
Oct 14 08:57:08 np0005486759.ooo.test sudo[108984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:08 np0005486759.ooo.test sudo[108984]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:08 np0005486759.ooo.test sudo[108995]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwf715asc/privsep.sock
Oct 14 08:57:08 np0005486759.ooo.test sudo[108995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:09 np0005486759.ooo.test sudo[108995]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:09 np0005486759.ooo.test sudo[109006]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmmywxp52/privsep.sock
Oct 14 08:57:09 np0005486759.ooo.test sudo[109006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:10 np0005486759.ooo.test sudo[109006]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:10 np0005486759.ooo.test sudo[109017]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwgv12otm/privsep.sock
Oct 14 08:57:10 np0005486759.ooo.test sudo[109017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:11 np0005486759.ooo.test sudo[109017]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:11 np0005486759.ooo.test sudo[109028]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp745px4h1/privsep.sock
Oct 14 08:57:11 np0005486759.ooo.test sudo[109028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:12 np0005486759.ooo.test sudo[109028]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:12 np0005486759.ooo.test sudo[109041]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqupo6shs/privsep.sock
Oct 14 08:57:12 np0005486759.ooo.test sudo[109041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:57:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:57:12 np0005486759.ooo.test systemd[1]: tmp-crun.WmeEIE.mount: Deactivated successfully.
Oct 14 08:57:12 np0005486759.ooo.test podman[109044]: 2025-10-14 08:57:12.68152948 +0000 UTC m=+0.116788834 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T16:28:53, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:57:12 np0005486759.ooo.test podman[109044]: 2025-10-14 08:57:12.712179853 +0000 UTC m=+0.147439167 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9)
Oct 14 08:57:12 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:57:12 np0005486759.ooo.test podman[109045]: 2025-10-14 08:57:12.747922694 +0000 UTC m=+0.178836162 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, release=1, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:57:12 np0005486759.ooo.test podman[109045]: 2025-10-14 08:57:12.796397762 +0000 UTC m=+0.227311230 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-07-21T13:28:44, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 08:57:12 np0005486759.ooo.test podman[109045]: unhealthy
Oct 14 08:57:12 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:57:12 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 08:57:13 np0005486759.ooo.test sudo[109041]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:13 np0005486759.ooo.test sudo[109103]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpde62w9jc/privsep.sock
Oct 14 08:57:13 np0005486759.ooo.test sudo[109103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:14 np0005486759.ooo.test sudo[109103]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:14 np0005486759.ooo.test sudo[109114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqtu_mhf7/privsep.sock
Oct 14 08:57:14 np0005486759.ooo.test sudo[109114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:14 np0005486759.ooo.test sudo[109114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:14 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:57:14 np0005486759.ooo.test recover_tripleo_nova_virtqemud[109121]: 47951
Oct 14 08:57:14 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:57:14 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:57:15 np0005486759.ooo.test sudo[109127]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg4w7oa3g/privsep.sock
Oct 14 08:57:15 np0005486759.ooo.test sudo[109127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:15 np0005486759.ooo.test sudo[109127]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:16 np0005486759.ooo.test sudo[109138]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp301bbz46/privsep.sock
Oct 14 08:57:16 np0005486759.ooo.test sudo[109138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:16 np0005486759.ooo.test sudo[109138]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:16 np0005486759.ooo.test sudo[109149]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqpqzao6o/privsep.sock
Oct 14 08:57:16 np0005486759.ooo.test sudo[109149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:17 np0005486759.ooo.test sudo[109149]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:17 np0005486759.ooo.test sudo[109160]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfbzpdiks/privsep.sock
Oct 14 08:57:17 np0005486759.ooo.test sudo[109160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:18 np0005486759.ooo.test sudo[109160]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:18 np0005486759.ooo.test sudo[109177]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd4do37zg/privsep.sock
Oct 14 08:57:18 np0005486759.ooo.test sudo[109177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:19 np0005486759.ooo.test sudo[109177]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:19 np0005486759.ooo.test sudo[109188]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc5ipi6ae/privsep.sock
Oct 14 08:57:19 np0005486759.ooo.test sudo[109188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:20 np0005486759.ooo.test sudo[109188]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:20 np0005486759.ooo.test sudo[109199]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpndusbnvn/privsep.sock
Oct 14 08:57:20 np0005486759.ooo.test sudo[109199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:21 np0005486759.ooo.test sudo[109199]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:21 np0005486759.ooo.test sudo[109210]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk547drkn/privsep.sock
Oct 14 08:57:21 np0005486759.ooo.test sudo[109210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:22 np0005486759.ooo.test sudo[109210]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:22 np0005486759.ooo.test sudo[109221]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqoa_omuk/privsep.sock
Oct 14 08:57:22 np0005486759.ooo.test sudo[109221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:22 np0005486759.ooo.test sudo[109221]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:23 np0005486759.ooo.test sudo[109235]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmso6pel9/privsep.sock
Oct 14 08:57:23 np0005486759.ooo.test sudo[109235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:23 np0005486759.ooo.test sudo[109235]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:24 np0005486759.ooo.test sudo[109249]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8b1zbjb4/privsep.sock
Oct 14 08:57:24 np0005486759.ooo.test sudo[109249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:24 np0005486759.ooo.test sudo[109249]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:25 np0005486759.ooo.test sudo[109260]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0r467brq/privsep.sock
Oct 14 08:57:25 np0005486759.ooo.test sudo[109260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:25 np0005486759.ooo.test sudo[109260]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:25 np0005486759.ooo.test sudo[109271]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpznkv3nyi/privsep.sock
Oct 14 08:57:25 np0005486759.ooo.test sudo[109271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:26 np0005486759.ooo.test sudo[109271]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:26 np0005486759.ooo.test sudo[109282]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxidbfzs9/privsep.sock
Oct 14 08:57:26 np0005486759.ooo.test sudo[109282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:27 np0005486759.ooo.test sudo[109282]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:57:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:57:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:57:27 np0005486759.ooo.test systemd[1]: tmp-crun.NeeEVg.mount: Deactivated successfully.
Oct 14 08:57:27 np0005486759.ooo.test podman[109288]: 2025-10-14 08:57:27.439915816 +0000 UTC m=+0.075689245 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, distribution-scope=public, release=1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-iscsid-container)
Oct 14 08:57:27 np0005486759.ooo.test podman[109288]: 2025-10-14 08:57:27.445573532 +0000 UTC m=+0.081347021 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:57:27 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:57:27 np0005486759.ooo.test podman[109289]: 2025-10-14 08:57:27.498410605 +0000 UTC m=+0.131223142 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step5, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.9, build-date=2025-07-21T14:48:37)
Oct 14 08:57:27 np0005486759.ooo.test podman[109290]: 2025-10-14 08:57:27.45129391 +0000 UTC m=+0.081816286 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, version=17.1.9, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, release=2)
Oct 14 08:57:27 np0005486759.ooo.test podman[109289]: 2025-10-14 08:57:27.518725227 +0000 UTC m=+0.151537744 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.openshift.expose-services=)
Oct 14 08:57:27 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:57:27 np0005486759.ooo.test podman[109290]: 2025-10-14 08:57:27.535259761 +0000 UTC m=+0.165782077 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, release=2, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:57:27 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:57:27 np0005486759.ooo.test sudo[109351]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc34_4jqw/privsep.sock
Oct 14 08:57:27 np0005486759.ooo.test sudo[109351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:28 np0005486759.ooo.test sudo[109351]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:28 np0005486759.ooo.test systemd[1]: tmp-crun.jK7H9g.mount: Deactivated successfully.
Oct 14 08:57:28 np0005486759.ooo.test sudo[109362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm6dfq8pw/privsep.sock
Oct 14 08:57:28 np0005486759.ooo.test sudo[109362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:29 np0005486759.ooo.test sudo[109362]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:29 np0005486759.ooo.test sudo[109379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyutmmntf/privsep.sock
Oct 14 08:57:29 np0005486759.ooo.test sudo[109379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:29 np0005486759.ooo.test sudo[109379]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:30 np0005486759.ooo.test sudo[109390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpabeo7azm/privsep.sock
Oct 14 08:57:30 np0005486759.ooo.test sudo[109390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:30 np0005486759.ooo.test sudo[109390]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:31 np0005486759.ooo.test sudo[109401]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpblymogsj/privsep.sock
Oct 14 08:57:31 np0005486759.ooo.test sudo[109401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:57:31 np0005486759.ooo.test podman[109402]: 2025-10-14 08:57:31.230094666 +0000 UTC m=+0.052638759 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, batch=17.1_20250721.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:57:31 np0005486759.ooo.test podman[109402]: 2025-10-14 08:57:31.413407947 +0000 UTC m=+0.235952110 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-07-21T13:07:59, 
name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:57:31 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:57:31 np0005486759.ooo.test sudo[109401]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:31 np0005486759.ooo.test sudo[109441]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6p9ha5y1/privsep.sock
Oct 14 08:57:31 np0005486759.ooo.test sudo[109441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:57:32 np0005486759.ooo.test podman[109444]: 2025-10-14 08:57:32.448945874 +0000 UTC m=+0.080306969 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37)
Oct 14 08:57:32 np0005486759.ooo.test sudo[109441]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:32 np0005486759.ooo.test podman[109444]: 2025-10-14 08:57:32.783742967 +0000 UTC m=+0.415104082 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, release=1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:57:32 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:57:32 np0005486759.ooo.test sudo[109473]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmqi0gepq/privsep.sock
Oct 14 08:57:32 np0005486759.ooo.test sudo[109473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:33 np0005486759.ooo.test sudo[109473]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:33 np0005486759.ooo.test sudo[109484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppt4uw8ux/privsep.sock
Oct 14 08:57:33 np0005486759.ooo.test sudo[109484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:34 np0005486759.ooo.test sudo[109484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:34 np0005486759.ooo.test sudo[109501]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp0n7trca/privsep.sock
Oct 14 08:57:34 np0005486759.ooo.test sudo[109501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:35 np0005486759.ooo.test sudo[109501]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:35 np0005486759.ooo.test sudo[109512]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1g00m4wq/privsep.sock
Oct 14 08:57:35 np0005486759.ooo.test sudo[109512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:35 np0005486759.ooo.test sudo[109512]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:36 np0005486759.ooo.test sudo[109523]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_zb8l64j/privsep.sock
Oct 14 08:57:36 np0005486759.ooo.test sudo[109523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:57:36 np0005486759.ooo.test podman[109525]: 2025-10-14 08:57:36.346912745 +0000 UTC m=+0.061576356 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, release=1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, 
build-date=2025-07-21T13:07:52, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:57:36 np0005486759.ooo.test podman[109525]: 2025-10-14 08:57:36.356134303 +0000 UTC m=+0.070797924 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, tcib_managed=true, version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:57:36 np0005486759.ooo.test podman[109527]: 2025-10-14 08:57:36.417526652 +0000 UTC m=+0.126586238 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.9, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:57:36 np0005486759.ooo.test podman[109527]: 2025-10-14 08:57:36.432211289 +0000 UTC m=+0.141270855 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, maintainer=OpenStack TripleO Team, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible)
Oct 14 08:57:36 np0005486759.ooo.test podman[109527]: unhealthy
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:57:36 np0005486759.ooo.test podman[109526]: 2025-10-14 08:57:36.549178767 +0000 UTC m=+0.259886354 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, release=1, vcs-type=git, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.buildah.version=1.33.12)
Oct 14 08:57:36 np0005486759.ooo.test podman[109526]: 2025-10-14 08:57:36.589004435 +0000 UTC m=+0.299712032 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, release=1, architecture=x86_64, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1)
Oct 14 08:57:36 np0005486759.ooo.test podman[109526]: unhealthy
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:57:36 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:57:36 np0005486759.ooo.test sudo[109523]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:37 np0005486759.ooo.test sudo[109589]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp24fxkfc4/privsep.sock
Oct 14 08:57:37 np0005486759.ooo.test sudo[109589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:37 np0005486759.ooo.test sudo[109589]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:38 np0005486759.ooo.test sudo[109600]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgg9rjut5/privsep.sock
Oct 14 08:57:38 np0005486759.ooo.test sudo[109600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:38 np0005486759.ooo.test sudo[109600]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:38 np0005486759.ooo.test sudo[109611]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6w0dtbg8/privsep.sock
Oct 14 08:57:38 np0005486759.ooo.test sudo[109611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:39 np0005486759.ooo.test sudo[109611]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:39 np0005486759.ooo.test sudo[109628]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyjs7n4t4/privsep.sock
Oct 14 08:57:39 np0005486759.ooo.test sudo[109628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:40 np0005486759.ooo.test sudo[109628]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:40 np0005486759.ooo.test sudo[109639]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzxlyxyt7/privsep.sock
Oct 14 08:57:40 np0005486759.ooo.test sudo[109639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:41 np0005486759.ooo.test sudo[109639]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:41 np0005486759.ooo.test sudo[109650]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp25_msykg/privsep.sock
Oct 14 08:57:41 np0005486759.ooo.test sudo[109650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:42 np0005486759.ooo.test sudo[109650]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:42 np0005486759.ooo.test sudo[109661]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqfx679ak/privsep.sock
Oct 14 08:57:42 np0005486759.ooo.test sudo[109661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:43 np0005486759.ooo.test sudo[109661]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:57:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:57:43 np0005486759.ooo.test systemd[1]: tmp-crun.aV17A5.mount: Deactivated successfully.
Oct 14 08:57:43 np0005486759.ooo.test podman[109667]: 2025-10-14 08:57:43.247663379 +0000 UTC m=+0.097592836 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.9)
Oct 14 08:57:43 np0005486759.ooo.test podman[109668]: 2025-10-14 08:57:43.206078866 +0000 UTC m=+0.058970005 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1, version=17.1.9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller)
Oct 14 08:57:43 np0005486759.ooo.test podman[109667]: 2025-10-14 08:57:43.287626642 +0000 UTC m=+0.137556029 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, batch=17.1_20250721.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=)
Oct 14 08:57:43 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Deactivated successfully.
Oct 14 08:57:43 np0005486759.ooo.test podman[109668]: 2025-10-14 08:57:43.343727307 +0000 UTC m=+0.196618446 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:57:43 np0005486759.ooo.test podman[109668]: unhealthy
Oct 14 08:57:43 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:57:43 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 08:57:43 np0005486759.ooo.test sudo[109721]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi4b3__rk/privsep.sock
Oct 14 08:57:43 np0005486759.ooo.test sudo[109721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:44 np0005486759.ooo.test sudo[109721]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:44 np0005486759.ooo.test sudo[109732]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpigi40go1/privsep.sock
Oct 14 08:57:44 np0005486759.ooo.test sudo[109732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:44 np0005486759.ooo.test sudo[109732]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:45 np0005486759.ooo.test sudo[109749]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0oxxpzeo/privsep.sock
Oct 14 08:57:45 np0005486759.ooo.test sudo[109749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:45 np0005486759.ooo.test sudo[109749]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:46 np0005486759.ooo.test sudo[109760]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx7v4bmdw/privsep.sock
Oct 14 08:57:46 np0005486759.ooo.test sudo[109760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:46 np0005486759.ooo.test sudo[109760]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:46 np0005486759.ooo.test sudo[109771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeejbd293/privsep.sock
Oct 14 08:57:46 np0005486759.ooo.test sudo[109771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:47 np0005486759.ooo.test sudo[109771]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:47 np0005486759.ooo.test sudo[109782]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxf7h1vms/privsep.sock
Oct 14 08:57:47 np0005486759.ooo.test sudo[109782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:48 np0005486759.ooo.test sudo[109782]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:48 np0005486759.ooo.test sudo[109793]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt_calyya/privsep.sock
Oct 14 08:57:48 np0005486759.ooo.test sudo[109793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:49 np0005486759.ooo.test sudo[109793]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:49 np0005486759.ooo.test sudo[109804]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptc_48uym/privsep.sock
Oct 14 08:57:49 np0005486759.ooo.test sudo[109804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:50 np0005486759.ooo.test sudo[109804]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:50 np0005486759.ooo.test sudo[109821]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr00a939t/privsep.sock
Oct 14 08:57:50 np0005486759.ooo.test sudo[109821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:51 np0005486759.ooo.test sudo[109821]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:51 np0005486759.ooo.test sudo[109832]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphiz2t0ac/privsep.sock
Oct 14 08:57:51 np0005486759.ooo.test sudo[109832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:52 np0005486759.ooo.test sudo[109832]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:52 np0005486759.ooo.test sudo[109843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv2kmbv24/privsep.sock
Oct 14 08:57:52 np0005486759.ooo.test sudo[109843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:52 np0005486759.ooo.test sudo[109843]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:53 np0005486759.ooo.test sudo[109854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp101ebbsv/privsep.sock
Oct 14 08:57:53 np0005486759.ooo.test sudo[109854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:53 np0005486759.ooo.test sudo[109854]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:54 np0005486759.ooo.test sudo[109865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphnqifbtf/privsep.sock
Oct 14 08:57:54 np0005486759.ooo.test sudo[109865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:54 np0005486759.ooo.test sudo[109865]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:54 np0005486759.ooo.test sudo[109876]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgpleci7k/privsep.sock
Oct 14 08:57:54 np0005486759.ooo.test sudo[109876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:55 np0005486759.ooo.test sudo[109876]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:55 np0005486759.ooo.test sudo[109892]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphimyadks/privsep.sock
Oct 14 08:57:55 np0005486759.ooo.test sudo[109892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:56 np0005486759.ooo.test sudo[109892]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:56 np0005486759.ooo.test sudo[109904]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuwlx4vdk/privsep.sock
Oct 14 08:57:56 np0005486759.ooo.test sudo[109904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:57 np0005486759.ooo.test sudo[109904]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:57 np0005486759.ooo.test sudo[109915]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7zo1_xm6/privsep.sock
Oct 14 08:57:57 np0005486759.ooo.test sudo[109915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:58 np0005486759.ooo.test sudo[109915]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:57:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:57:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:57:58 np0005486759.ooo.test podman[109921]: 2025-10-14 08:57:58.146049059 +0000 UTC m=+0.068223663 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, 
vcs-type=git, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:57:58 np0005486759.ooo.test podman[109921]: 2025-10-14 08:57:58.152429208 +0000 UTC m=+0.074603832 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, 
build-date=2025-07-21T13:27:15, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12)
Oct 14 08:57:58 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:57:58 np0005486759.ooo.test podman[109922]: 2025-10-14 08:57:58.195633992 +0000 UTC m=+0.116121824 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Oct 14 08:57:58 np0005486759.ooo.test podman[109922]: 2025-10-14 08:57:58.211707272 +0000 UTC m=+0.132195124 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.openshift.expose-services=, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=17.1.9, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, release=1, container_name=nova_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container)
Oct 14 08:57:58 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:57:58 np0005486759.ooo.test podman[109923]: 2025-10-14 08:57:58.260631264 +0000 UTC m=+0.178845675 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, container_name=collectd, version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, release=2, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 08:57:58 np0005486759.ooo.test podman[109923]: 2025-10-14 08:57:58.265697592 +0000 UTC m=+0.183911983 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, version=17.1.9)
Oct 14 08:57:58 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:57:58 np0005486759.ooo.test sudo[109989]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9ckv26is/privsep.sock
Oct 14 08:57:58 np0005486759.ooo.test sudo[109989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:58 np0005486759.ooo.test sudo[109989]: pam_unix(sudo:session): session closed for user root
Oct 14 08:57:59 np0005486759.ooo.test sudo[110000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphpyaeb4x/privsep.sock
Oct 14 08:57:59 np0005486759.ooo.test sudo[110000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:57:59 np0005486759.ooo.test sudo[110000]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:00 np0005486759.ooo.test sudo[110011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmlonivcg/privsep.sock
Oct 14 08:58:00 np0005486759.ooo.test sudo[110011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:00 np0005486759.ooo.test sudo[110011]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:00 np0005486759.ooo.test sudo[110022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkp0j1etp/privsep.sock
Oct 14 08:58:00 np0005486759.ooo.test sudo[110022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:01 np0005486759.ooo.test sudo[110022]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:58:01 np0005486759.ooo.test podman[110033]: 2025-10-14 08:58:01.600213179 +0000 UTC m=+0.061014839 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr)
Oct 14 08:58:01 np0005486759.ooo.test podman[110033]: 2025-10-14 08:58:01.754915321 +0000 UTC m=+0.215716860 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.9, 
vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.openshift.expose-services=)
Oct 14 08:58:01 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:58:01 np0005486759.ooo.test sudo[110070]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj3bi3zcb/privsep.sock
Oct 14 08:58:01 np0005486759.ooo.test sudo[110070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:02 np0005486759.ooo.test sudo[110070]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:02 np0005486759.ooo.test sudo[110081]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe7aue5os/privsep.sock
Oct 14 08:58:02 np0005486759.ooo.test sudo[110081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:03 np0005486759.ooo.test sudo[110081]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:58:03 np0005486759.ooo.test podman[110085]: 2025-10-14 08:58:03.314993262 +0000 UTC m=+0.062808785 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, 
io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, release=1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true)
Oct 14 08:58:03 np0005486759.ooo.test sudo[110114]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0potle1a/privsep.sock
Oct 14 08:58:03 np0005486759.ooo.test sudo[110114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:03 np0005486759.ooo.test podman[110085]: 2025-10-14 08:58:03.645692407 +0000 UTC m=+0.393507900 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, version=17.1.9, container_name=nova_migration_target)
Oct 14 08:58:03 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:58:04 np0005486759.ooo.test sudo[110114]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:04 np0005486759.ooo.test sudo[110125]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi676r0dy/privsep.sock
Oct 14 08:58:04 np0005486759.ooo.test sudo[110125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:04 np0005486759.ooo.test sudo[110125]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:05 np0005486759.ooo.test sudo[110136]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp27_naasa/privsep.sock
Oct 14 08:58:05 np0005486759.ooo.test sudo[110136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:05 np0005486759.ooo.test sudo[110136]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:06 np0005486759.ooo.test sudo[110147]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwffqozxa/privsep.sock
Oct 14 08:58:06 np0005486759.ooo.test sudo[110147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:06 np0005486759.ooo.test sudo[110147]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: tmp-crun.iH3fqV.mount: Deactivated successfully.
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:58:06 np0005486759.ooo.test podman[110156]: 2025-10-14 08:58:06.670039578 +0000 UTC m=+0.071136754 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, release=1, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:58:06 np0005486759.ooo.test podman[110158]: 2025-10-14 08:58:06.681543246 +0000 UTC m=+0.076804920 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:58:06 np0005486759.ooo.test podman[110158]: 2025-10-14 08:58:06.694259251 +0000 UTC m=+0.089520925 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T15:29:47, vcs-type=git, release=1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 08:58:06 np0005486759.ooo.test podman[110158]: unhealthy
Oct 14 08:58:06 np0005486759.ooo.test podman[110156]: 2025-10-14 08:58:06.703451478 +0000 UTC m=+0.104548634 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, version=17.1.9, com.redhat.component=openstack-cron-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.buildah.version=1.33.12, container_name=logrotate_crond, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-07-21T13:07:52, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:58:06 np0005486759.ooo.test podman[110187]: 2025-10-14 08:58:06.812946433 +0000 UTC m=+0.135699712 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 08:58:06 np0005486759.ooo.test podman[110187]: 2025-10-14 08:58:06.824247125 +0000 UTC m=+0.147000404 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, release=1, version=17.1.9)
Oct 14 08:58:06 np0005486759.ooo.test podman[110187]: unhealthy
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:06 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:58:06 np0005486759.ooo.test sudo[110219]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqw5k9zwc/privsep.sock
Oct 14 08:58:06 np0005486759.ooo.test sudo[110219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:07 np0005486759.ooo.test sudo[110219]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:07 np0005486759.ooo.test sudo[110230]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyfo0xyeb/privsep.sock
Oct 14 08:58:07 np0005486759.ooo.test sudo[110230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:08 np0005486759.ooo.test sudo[110230]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:08 np0005486759.ooo.test sudo[110241]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp28uoigdf/privsep.sock
Oct 14 08:58:08 np0005486759.ooo.test sudo[110241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:09 np0005486759.ooo.test sudo[110241]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:09 np0005486759.ooo.test sudo[110252]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4j5qfvo6/privsep.sock
Oct 14 08:58:09 np0005486759.ooo.test sudo[110252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:10 np0005486759.ooo.test sudo[110252]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:10 np0005486759.ooo.test sudo[110263]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjqgxsexv/privsep.sock
Oct 14 08:58:10 np0005486759.ooo.test sudo[110263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:10 np0005486759.ooo.test sudo[110263]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:11 np0005486759.ooo.test sudo[110274]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpebz9wry4/privsep.sock
Oct 14 08:58:11 np0005486759.ooo.test sudo[110274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:11 np0005486759.ooo.test sudo[110274]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:12 np0005486759.ooo.test sudo[110291]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp535gp4wl/privsep.sock
Oct 14 08:58:12 np0005486759.ooo.test sudo[110291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:12 np0005486759.ooo.test sudo[110291]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:13 np0005486759.ooo.test sudo[110302]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqb0qdm63/privsep.sock
Oct 14 08:58:13 np0005486759.ooo.test sudo[110302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:58:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:58:13 np0005486759.ooo.test systemd[1]: tmp-crun.2uf0W7.mount: Deactivated successfully.
Oct 14 08:58:13 np0005486759.ooo.test podman[110305]: 2025-10-14 08:58:13.446116674 +0000 UTC m=+0.068956496 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:58:13 np0005486759.ooo.test podman[110306]: 2025-10-14 08:58:13.456095774 +0000 UTC m=+0.074675364 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, io.openshift.expose-services=, release=1)
Oct 14 08:58:13 np0005486759.ooo.test podman[110305]: 2025-10-14 08:58:13.468226391 +0000 UTC m=+0.091066223 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, 
vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:58:13 np0005486759.ooo.test podman[110305]: unhealthy
Oct 14 08:58:13 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:13 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 08:58:13 np0005486759.ooo.test podman[110306]: 2025-10-14 08:58:13.493250689 +0000 UTC m=+0.111830279 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, version=17.1.9, name=rhosp17/openstack-ovn-controller, tcib_managed=true, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 08:58:13 np0005486759.ooo.test podman[110306]: unhealthy
Oct 14 08:58:13 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:13 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 08:58:13 np0005486759.ooo.test sudo[110302]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:14 np0005486759.ooo.test sudo[110351]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5eqi21zt/privsep.sock
Oct 14 08:58:14 np0005486759.ooo.test sudo[110351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:14 np0005486759.ooo.test systemd[1]: tmp-crun.LvIyzG.mount: Deactivated successfully.
Oct 14 08:58:14 np0005486759.ooo.test sudo[110351]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:14 np0005486759.ooo.test sudo[110362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjcjc4gnp/privsep.sock
Oct 14 08:58:14 np0005486759.ooo.test sudo[110362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:15 np0005486759.ooo.test sudo[110362]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:15 np0005486759.ooo.test sudo[110373]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu177mpsi/privsep.sock
Oct 14 08:58:15 np0005486759.ooo.test sudo[110373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:16 np0005486759.ooo.test sudo[110373]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:16 np0005486759.ooo.test sudo[110384]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcuvuy3uy/privsep.sock
Oct 14 08:58:16 np0005486759.ooo.test sudo[110384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:17 np0005486759.ooo.test sudo[110384]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:17 np0005486759.ooo.test sudo[110401]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphj7t38_i/privsep.sock
Oct 14 08:58:17 np0005486759.ooo.test sudo[110401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:18 np0005486759.ooo.test sudo[110401]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:18 np0005486759.ooo.test sudo[110412]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_6gly1k6/privsep.sock
Oct 14 08:58:18 np0005486759.ooo.test sudo[110412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:18 np0005486759.ooo.test sudo[110412]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:19 np0005486759.ooo.test sudo[110423]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjcno6vui/privsep.sock
Oct 14 08:58:19 np0005486759.ooo.test sudo[110423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:19 np0005486759.ooo.test sudo[110423]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:20 np0005486759.ooo.test sudo[110434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvw0nvjpc/privsep.sock
Oct 14 08:58:20 np0005486759.ooo.test sudo[110434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:20 np0005486759.ooo.test sudo[110434]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:21 np0005486759.ooo.test sudo[110445]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj047gvt2/privsep.sock
Oct 14 08:58:21 np0005486759.ooo.test sudo[110445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:21 np0005486759.ooo.test sudo[110445]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:21 np0005486759.ooo.test sudo[110456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvldd6sh9/privsep.sock
Oct 14 08:58:21 np0005486759.ooo.test sudo[110456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:22 np0005486759.ooo.test sudo[110456]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:22 np0005486759.ooo.test sudo[110472]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp665znaek/privsep.sock
Oct 14 08:58:22 np0005486759.ooo.test sudo[110472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:23 np0005486759.ooo.test sudo[110472]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:23 np0005486759.ooo.test sudo[110484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdr03yr4s/privsep.sock
Oct 14 08:58:23 np0005486759.ooo.test sudo[110484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:24 np0005486759.ooo.test sudo[110484]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:24 np0005486759.ooo.test sudo[110495]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbxpc71gy/privsep.sock
Oct 14 08:58:24 np0005486759.ooo.test sudo[110495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:25 np0005486759.ooo.test sudo[110495]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:25 np0005486759.ooo.test sudo[110506]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpafi29m_i/privsep.sock
Oct 14 08:58:25 np0005486759.ooo.test sudo[110506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:25 np0005486759.ooo.test sudo[110506]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:26 np0005486759.ooo.test sudo[110517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqb9wmial/privsep.sock
Oct 14 08:58:26 np0005486759.ooo.test sudo[110517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:26 np0005486759.ooo.test sudo[110517]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:27 np0005486759.ooo.test sudo[110528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps8zzzs0l/privsep.sock
Oct 14 08:58:27 np0005486759.ooo.test sudo[110528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:27 np0005486759.ooo.test sudo[110528]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:28 np0005486759.ooo.test sudo[110541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc0xz8ag3/privsep.sock
Oct 14 08:58:28 np0005486759.ooo.test sudo[110541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:58:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:58:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:58:28 np0005486759.ooo.test podman[110548]: 2025-10-14 08:58:28.427252918 +0000 UTC m=+0.056032564 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, 
version=17.1.9, build-date=2025-07-21T13:27:15, release=1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct 14 08:58:28 np0005486759.ooo.test systemd[1]: tmp-crun.aRvGCH.mount: Deactivated successfully.
Oct 14 08:58:28 np0005486759.ooo.test podman[110549]: 2025-10-14 08:58:28.4662208 +0000 UTC m=+0.090600798 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:58:28 np0005486759.ooo.test podman[110550]: 2025-10-14 08:58:28.509986952 +0000 UTC m=+0.131943465 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, release=2, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1)
Oct 14 08:58:28 np0005486759.ooo.test podman[110549]: 2025-10-14 08:58:28.521328594 +0000 UTC m=+0.145708602 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 08:58:28 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:58:28 np0005486759.ooo.test podman[110548]: 2025-10-14 08:58:28.560708359 +0000 UTC m=+0.189488065 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, release=1, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, config_id=tripleo_step3)
Oct 14 08:58:28 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:58:28 np0005486759.ooo.test podman[110550]: 2025-10-14 08:58:28.573132675 +0000 UTC m=+0.195089218 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, batch=17.1_20250721.1, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=2, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-type=git)
Oct 14 08:58:28 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:58:28 np0005486759.ooo.test sudo[110541]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:29 np0005486759.ooo.test sudo[110619]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9zs8ybcp/privsep.sock
Oct 14 08:58:29 np0005486759.ooo.test sudo[110619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:29 np0005486759.ooo.test sudo[110619]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:29 np0005486759.ooo.test sudo[110630]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnynyug2d/privsep.sock
Oct 14 08:58:29 np0005486759.ooo.test sudo[110630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:30 np0005486759.ooo.test sudo[110630]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:30 np0005486759.ooo.test sudo[110641]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0pd2vnxm/privsep.sock
Oct 14 08:58:30 np0005486759.ooo.test sudo[110641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:31 np0005486759.ooo.test sudo[110641]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:31 np0005486759.ooo.test sudo[110652]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4hwsaxv4/privsep.sock
Oct 14 08:58:31 np0005486759.ooo.test sudo[110652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:32 np0005486759.ooo.test sudo[110652]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:58:32 np0005486759.ooo.test podman[110656]: 2025-10-14 08:58:32.462079598 +0000 UTC m=+0.087297127 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20250721.1, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container)
Oct 14 08:58:32 np0005486759.ooo.test sudo[110692]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp36fx7axq/privsep.sock
Oct 14 08:58:32 np0005486759.ooo.test sudo[110692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:32 np0005486759.ooo.test podman[110656]: 2025-10-14 08:58:32.665403461 +0000 UTC m=+0.290620960 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, release=1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true)
Oct 14 08:58:32 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:58:33 np0005486759.ooo.test sudo[110692]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:33 np0005486759.ooo.test sudo[110707]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe14u22_0/privsep.sock
Oct 14 08:58:33 np0005486759.ooo.test sudo[110707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:34 np0005486759.ooo.test sudo[110707]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:58:34 np0005486759.ooo.test systemd[1]: tmp-crun.ZDDdG1.mount: Deactivated successfully.
Oct 14 08:58:34 np0005486759.ooo.test podman[110716]: 2025-10-14 08:58:34.286140059 +0000 UTC m=+0.091413585 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.9, com.redhat.component=openstack-nova-compute-container, release=1, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team)
Oct 14 08:58:34 np0005486759.ooo.test sudo[110743]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp29wjm78v/privsep.sock
Oct 14 08:58:34 np0005486759.ooo.test sudo[110743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:34 np0005486759.ooo.test podman[110716]: 2025-10-14 08:58:34.690491134 +0000 UTC m=+0.495764650 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 08:58:34 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:58:35 np0005486759.ooo.test sudo[110743]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:35 np0005486759.ooo.test sudo[110755]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3k0m350g/privsep.sock
Oct 14 08:58:35 np0005486759.ooo.test sudo[110755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:36 np0005486759.ooo.test sudo[110755]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:36 np0005486759.ooo.test sudo[110766]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphsrul7mr/privsep.sock
Oct 14 08:58:36 np0005486759.ooo.test sudo[110766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:37 np0005486759.ooo.test sudo[110766]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:58:37 np0005486759.ooo.test podman[110774]: 2025-10-14 08:58:37.142488835 +0000 UTC m=+0.071288688 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, release=1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Oct 14 08:58:37 np0005486759.ooo.test podman[110774]: 2025-10-14 08:58:37.155416177 +0000 UTC m=+0.084216050 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 08:58:37 np0005486759.ooo.test podman[110774]: unhealthy
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:58:37 np0005486759.ooo.test podman[110772]: 2025-10-14 08:58:37.196543667 +0000 UTC m=+0.131814211 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, tcib_managed=true, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:58:37 np0005486759.ooo.test podman[110773]: 2025-10-14 08:58:37.263745137 +0000 UTC m=+0.191036304 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:58:37 np0005486759.ooo.test podman[110772]: 2025-10-14 08:58:37.277970369 +0000 UTC m=+0.213240923 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.9, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, release=1, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20250721.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:58:37 np0005486759.ooo.test podman[110773]: 2025-10-14 08:58:37.300655584 +0000 UTC m=+0.227946721 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, 
name=rhosp17/openstack-ceilometer-compute, version=17.1.9, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible)
Oct 14 08:58:37 np0005486759.ooo.test podman[110773]: unhealthy
Oct 14 08:58:37 np0005486759.ooo.test sudo[110838]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp81hf46w2/privsep.sock
Oct 14 08:58:37 np0005486759.ooo.test sudo[110838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:37 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:58:37 np0005486759.ooo.test sudo[110838]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:38 np0005486759.ooo.test sudo[110849]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxfor55op/privsep.sock
Oct 14 08:58:38 np0005486759.ooo.test sudo[110849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:38 np0005486759.ooo.test sudo[110849]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:39 np0005486759.ooo.test sudo[110866]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4pvwb4du/privsep.sock
Oct 14 08:58:39 np0005486759.ooo.test sudo[110866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:39 np0005486759.ooo.test sudo[110866]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:39 np0005486759.ooo.test sudo[110877]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptdnesjkp/privsep.sock
Oct 14 08:58:39 np0005486759.ooo.test sudo[110877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:40 np0005486759.ooo.test sudo[110877]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:40 np0005486759.ooo.test sudo[110888]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3zkvdg5i/privsep.sock
Oct 14 08:58:40 np0005486759.ooo.test sudo[110888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:41 np0005486759.ooo.test sudo[110888]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:41 np0005486759.ooo.test sudo[110899]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3ubxl34c/privsep.sock
Oct 14 08:58:41 np0005486759.ooo.test sudo[110899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:42 np0005486759.ooo.test sudo[110899]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:42 np0005486759.ooo.test sudo[110910]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq0htl_s0/privsep.sock
Oct 14 08:58:42 np0005486759.ooo.test sudo[110910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:43 np0005486759.ooo.test sudo[110910]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:43 np0005486759.ooo.test sudo[110921]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr3sdq9ku/privsep.sock
Oct 14 08:58:43 np0005486759.ooo.test sudo[110921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:58:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:58:43 np0005486759.ooo.test podman[110923]: 2025-10-14 08:58:43.598255989 +0000 UTC m=+0.052368730 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true)
Oct 14 08:58:43 np0005486759.ooo.test systemd[1]: tmp-crun.pgwfGv.mount: Deactivated successfully.
Oct 14 08:58:43 np0005486759.ooo.test podman[110924]: 2025-10-14 08:58:43.671772625 +0000 UTC m=+0.120101907 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1, vcs-type=git, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 08:58:43 np0005486759.ooo.test podman[110924]: 2025-10-14 08:58:43.686380019 +0000 UTC m=+0.134709301 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, container_name=ovn_controller, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 08:58:43 np0005486759.ooo.test podman[110924]: unhealthy
Oct 14 08:58:43 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:43 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 08:58:43 np0005486759.ooo.test podman[110923]: 2025-10-14 08:58:43.73750623 +0000 UTC m=+0.191618941 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=ovn_metadata_agent, release=1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 08:58:43 np0005486759.ooo.test podman[110923]: unhealthy
Oct 14 08:58:43 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:58:43 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 08:58:44 np0005486759.ooo.test sudo[110921]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:44 np0005486759.ooo.test sudo[110976]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgjs6z7ik/privsep.sock
Oct 14 08:58:44 np0005486759.ooo.test sudo[110976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:44 np0005486759.ooo.test sudo[110976]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:45 np0005486759.ooo.test sudo[110988]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfux7vnie/privsep.sock
Oct 14 08:58:45 np0005486759.ooo.test sudo[110988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:45 np0005486759.ooo.test sudo[110988]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:46 np0005486759.ooo.test sudo[110999]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp750_ydhm/privsep.sock
Oct 14 08:58:46 np0005486759.ooo.test sudo[110999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:46 np0005486759.ooo.test sudo[110999]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:46 np0005486759.ooo.test sudo[111010]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq1yzcw42/privsep.sock
Oct 14 08:58:46 np0005486759.ooo.test sudo[111010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:47 np0005486759.ooo.test sudo[111010]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:47 np0005486759.ooo.test sudo[111021]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkhutr0ki/privsep.sock
Oct 14 08:58:47 np0005486759.ooo.test sudo[111021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:48 np0005486759.ooo.test sudo[111021]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:48 np0005486759.ooo.test sudo[111032]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd8nl9guv/privsep.sock
Oct 14 08:58:48 np0005486759.ooo.test sudo[111032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:49 np0005486759.ooo.test sudo[111032]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:49 np0005486759.ooo.test sudo[111043]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpissnyfls/privsep.sock
Oct 14 08:58:49 np0005486759.ooo.test sudo[111043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:50 np0005486759.ooo.test sudo[111043]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:50 np0005486759.ooo.test sudo[111060]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9h3f77vg/privsep.sock
Oct 14 08:58:50 np0005486759.ooo.test sudo[111060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:50 np0005486759.ooo.test sudo[111060]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:51 np0005486759.ooo.test sudo[111071]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplmf6r7u8/privsep.sock
Oct 14 08:58:51 np0005486759.ooo.test sudo[111071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:51 np0005486759.ooo.test sudo[111071]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:51 np0005486759.ooo.test sudo[111082]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1yu_zkzc/privsep.sock
Oct 14 08:58:51 np0005486759.ooo.test sudo[111082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:52 np0005486759.ooo.test sudo[111082]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:52 np0005486759.ooo.test sudo[111093]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa7g9x0g6/privsep.sock
Oct 14 08:58:52 np0005486759.ooo.test sudo[111093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:53 np0005486759.ooo.test sudo[111093]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:53 np0005486759.ooo.test sudo[111104]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppgja86vl/privsep.sock
Oct 14 08:58:53 np0005486759.ooo.test sudo[111104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:54 np0005486759.ooo.test sudo[111104]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:54 np0005486759.ooo.test sudo[111115]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr_ttsqm0/privsep.sock
Oct 14 08:58:54 np0005486759.ooo.test sudo[111115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:55 np0005486759.ooo.test sudo[111115]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:55 np0005486759.ooo.test sudo[111132]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe3wspyh_/privsep.sock
Oct 14 08:58:55 np0005486759.ooo.test sudo[111132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:56 np0005486759.ooo.test sudo[111132]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:56 np0005486759.ooo.test sudo[111143]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp3qlfive/privsep.sock
Oct 14 08:58:56 np0005486759.ooo.test sudo[111143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:56 np0005486759.ooo.test sudo[111143]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:57 np0005486759.ooo.test sudo[111154]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjltpj5s8/privsep.sock
Oct 14 08:58:57 np0005486759.ooo.test sudo[111154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:57 np0005486759.ooo.test sudo[111154]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:58 np0005486759.ooo.test sudo[111165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc0hibvfe/privsep.sock
Oct 14 08:58:58 np0005486759.ooo.test sudo[111165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:58 np0005486759.ooo.test sudo[111165]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:58:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:58:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:58:58 np0005486759.ooo.test systemd[1]: tmp-crun.9LfnM3.mount: Deactivated successfully.
Oct 14 08:58:58 np0005486759.ooo.test podman[111173]: 2025-10-14 08:58:58.807457406 +0000 UTC m=+0.077989877 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 08:58:58 np0005486759.ooo.test podman[111173]: 2025-10-14 08:58:58.816247949 +0000 UTC m=+0.086780430 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack 
osp-17.1, tcib_managed=true, build-date=2025-07-21T13:04:03, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64)
Oct 14 08:58:58 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:58:58 np0005486759.ooo.test podman[111171]: 2025-10-14 08:58:58.910290615 +0000 UTC m=+0.183599572 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, build-date=2025-07-21T13:27:15, container_name=iscsid, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Oct 14 08:58:58 np0005486759.ooo.test podman[111172]: 2025-10-14 08:58:58.872077896 +0000 UTC m=+0.141492001 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step5, tcib_managed=true, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Oct 14 08:58:58 np0005486759.ooo.test podman[111171]: 2025-10-14 08:58:58.948401399 +0000 UTC m=+0.221710296 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, 
com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true)
Oct 14 08:58:58 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:58:58 np0005486759.ooo.test sudo[111238]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnhavkiqk/privsep.sock
Oct 14 08:58:58 np0005486759.ooo.test sudo[111238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:58:59 np0005486759.ooo.test podman[111172]: 2025-10-14 08:58:59.001676526 +0000 UTC m=+0.271090641 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container)
Oct 14 08:58:59 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:58:59 np0005486759.ooo.test sudo[111238]: pam_unix(sudo:session): session closed for user root
Oct 14 08:58:59 np0005486759.ooo.test sudo[111249]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4uj95uwk/privsep.sock
Oct 14 08:58:59 np0005486759.ooo.test sudo[111249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:00 np0005486759.ooo.test sudo[111249]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:00 np0005486759.ooo.test sudo[111266]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt1tzo9me/privsep.sock
Oct 14 08:59:00 np0005486759.ooo.test sudo[111266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:01 np0005486759.ooo.test sudo[111266]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:01 np0005486759.ooo.test sudo[111277]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd3wlcwno/privsep.sock
Oct 14 08:59:01 np0005486759.ooo.test sudo[111277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:02 np0005486759.ooo.test sudo[111277]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:02 np0005486759.ooo.test sudo[111288]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpai79pi55/privsep.sock
Oct 14 08:59:02 np0005486759.ooo.test sudo[111288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:03 np0005486759.ooo.test sudo[111288]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:59:03 np0005486759.ooo.test systemd[1]: tmp-crun.spugTn.mount: Deactivated successfully.
Oct 14 08:59:03 np0005486759.ooo.test podman[111293]: 2025-10-14 08:59:03.217276517 +0000 UTC m=+0.096617495 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, version=17.1.9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 08:59:03 np0005486759.ooo.test sudo[111329]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppwlzju45/privsep.sock
Oct 14 08:59:03 np0005486759.ooo.test sudo[111329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:03 np0005486759.ooo.test podman[111293]: 2025-10-14 08:59:03.397234314 +0000 UTC m=+0.276575232 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Oct 14 08:59:03 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:59:04 np0005486759.ooo.test sudo[111329]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:04 np0005486759.ooo.test sudo[111340]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_9dzf2ds/privsep.sock
Oct 14 08:59:04 np0005486759.ooo.test sudo[111340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:04 np0005486759.ooo.test sudo[111340]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:59:05 np0005486759.ooo.test systemd[1]: tmp-crun.GSVXRi.mount: Deactivated successfully.
Oct 14 08:59:05 np0005486759.ooo.test podman[111345]: 2025-10-14 08:59:05.030215922 +0000 UTC m=+0.084961943 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Oct 14 08:59:05 np0005486759.ooo.test sudo[111373]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeaunrdtq/privsep.sock
Oct 14 08:59:05 np0005486759.ooo.test sudo[111373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:05 np0005486759.ooo.test podman[111345]: 2025-10-14 08:59:05.396449173 +0000 UTC m=+0.451195224 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.9, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1)
Oct 14 08:59:05 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:59:05 np0005486759.ooo.test sudo[111373]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:06 np0005486759.ooo.test sudo[111390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpokuqw7w6/privsep.sock
Oct 14 08:59:06 np0005486759.ooo.test sudo[111390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:06 np0005486759.ooo.test sudo[111390]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:07 np0005486759.ooo.test sudo[111402]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq658rgp2/privsep.sock
Oct 14 08:59:07 np0005486759.ooo.test sudo[111402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 08:59:07 np0005486759.ooo.test recover_tripleo_nova_virtqemud[111414]: 47951
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 08:59:07 np0005486759.ooo.test podman[111406]: 2025-10-14 08:59:07.461128067 +0000 UTC m=+0.079010108 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.9, tcib_managed=true, batch=17.1_20250721.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, distribution-scope=public, release=1, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 08:59:07 np0005486759.ooo.test podman[111406]: 2025-10-14 08:59:07.473215843 +0000 UTC m=+0.091097884 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-ceilometer-compute)
Oct 14 08:59:07 np0005486759.ooo.test podman[111406]: unhealthy
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: tmp-crun.5XoLas.mount: Deactivated successfully.
Oct 14 08:59:07 np0005486759.ooo.test podman[111405]: 2025-10-14 08:59:07.524013043 +0000 UTC m=+0.144549276 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, container_name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, distribution-scope=public, 
version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, io.buildah.version=1.33.12)
Oct 14 08:59:07 np0005486759.ooo.test podman[111407]: 2025-10-14 08:59:07.546605276 +0000 UTC m=+0.165386085 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47)
Oct 14 08:59:07 np0005486759.ooo.test podman[111407]: 2025-10-14 08:59:07.559234159 +0000 UTC m=+0.178014968 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, release=1, batch=17.1_20250721.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc.)
Oct 14 08:59:07 np0005486759.ooo.test podman[111407]: unhealthy
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:59:07 np0005486759.ooo.test podman[111405]: 2025-10-14 08:59:07.608373507 +0000 UTC m=+0.228909790 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, io.buildah.version=1.33.12, maintainer=OpenStack TripleO 
Team, release=1, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond)
Oct 14 08:59:07 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:59:07 np0005486759.ooo.test sudo[111402]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:07 np0005486759.ooo.test sudo[111470]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoc76z0eu/privsep.sock
Oct 14 08:59:07 np0005486759.ooo.test sudo[111470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:08 np0005486759.ooo.test sudo[111470]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:08 np0005486759.ooo.test sudo[111481]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4agokjgk/privsep.sock
Oct 14 08:59:08 np0005486759.ooo.test sudo[111481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:09 np0005486759.ooo.test sudo[111481]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:09 np0005486759.ooo.test sudo[111492]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcf6okdn2/privsep.sock
Oct 14 08:59:09 np0005486759.ooo.test sudo[111492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:10 np0005486759.ooo.test sudo[111492]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:10 np0005486759.ooo.test sudo[111503]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphr7wqo0a/privsep.sock
Oct 14 08:59:10 np0005486759.ooo.test sudo[111503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:11 np0005486759.ooo.test sudo[111503]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:11 np0005486759.ooo.test sudo[111520]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprfdaq4d8/privsep.sock
Oct 14 08:59:11 np0005486759.ooo.test sudo[111520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:12 np0005486759.ooo.test sudo[111520]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:12 np0005486759.ooo.test sudo[111531]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo456q1py/privsep.sock
Oct 14 08:59:12 np0005486759.ooo.test sudo[111531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:13 np0005486759.ooo.test sudo[111531]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:13 np0005486759.ooo.test sudo[111542]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp6ul1g0r/privsep.sock
Oct 14 08:59:13 np0005486759.ooo.test sudo[111542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:13 np0005486759.ooo.test sudo[111542]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:59:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:59:14 np0005486759.ooo.test systemd[1]: tmp-crun.XcSWd0.mount: Deactivated successfully.
Oct 14 08:59:14 np0005486759.ooo.test podman[111548]: 2025-10-14 08:59:14.082008596 +0000 UTC m=+0.093803038 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, release=1, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 08:59:14 np0005486759.ooo.test systemd[1]: tmp-crun.g0jsJU.mount: Deactivated successfully.
Oct 14 08:59:14 np0005486759.ooo.test podman[111549]: 2025-10-14 08:59:14.097840138 +0000 UTC m=+0.103024305 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, release=1, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9)
Oct 14 08:59:14 np0005486759.ooo.test podman[111548]: 2025-10-14 08:59:14.121215995 +0000 UTC m=+0.133010397 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64)
Oct 14 08:59:14 np0005486759.ooo.test podman[111548]: unhealthy
Oct 14 08:59:14 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:14 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 08:59:14 np0005486759.ooo.test podman[111549]: 2025-10-14 08:59:14.137388078 +0000 UTC m=+0.142572215 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, batch=17.1_20250721.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, tcib_managed=true)
Oct 14 08:59:14 np0005486759.ooo.test podman[111549]: unhealthy
Oct 14 08:59:14 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:14 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 08:59:14 np0005486759.ooo.test sudo[111593]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcbr209pl/privsep.sock
Oct 14 08:59:14 np0005486759.ooo.test sudo[111593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:14 np0005486759.ooo.test sudo[111593]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:15 np0005486759.ooo.test sudo[111604]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnl0d1wat/privsep.sock
Oct 14 08:59:15 np0005486759.ooo.test sudo[111604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:15 np0005486759.ooo.test sudo[111604]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:15 np0005486759.ooo.test sudo[111615]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptgjek4i9/privsep.sock
Oct 14 08:59:15 np0005486759.ooo.test sudo[111615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:16 np0005486759.ooo.test sudo[111615]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:16 np0005486759.ooo.test sudo[111628]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp66oobroj/privsep.sock
Oct 14 08:59:16 np0005486759.ooo.test sudo[111628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:17 np0005486759.ooo.test sudo[111628]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:17 np0005486759.ooo.test sudo[111643]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp67vsf1br/privsep.sock
Oct 14 08:59:17 np0005486759.ooo.test sudo[111643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:18 np0005486759.ooo.test sudo[111643]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:18 np0005486759.ooo.test sudo[111654]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp27ti2ibq/privsep.sock
Oct 14 08:59:18 np0005486759.ooo.test sudo[111654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:19 np0005486759.ooo.test sudo[111654]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:19 np0005486759.ooo.test sudo[111665]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppfj_b9ly/privsep.sock
Oct 14 08:59:19 np0005486759.ooo.test sudo[111665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:19 np0005486759.ooo.test sudo[111665]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:20 np0005486759.ooo.test sudo[111676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp38mltrm7/privsep.sock
Oct 14 08:59:20 np0005486759.ooo.test sudo[111676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:20 np0005486759.ooo.test sudo[111676]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:21 np0005486759.ooo.test sudo[111687]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9ulfvuzd/privsep.sock
Oct 14 08:59:21 np0005486759.ooo.test sudo[111687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:21 np0005486759.ooo.test sudo[111687]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:21 np0005486759.ooo.test sudo[111698]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8f2m952y/privsep.sock
Oct 14 08:59:22 np0005486759.ooo.test sudo[111698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:22 np0005486759.ooo.test sudo[111698]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:22 np0005486759.ooo.test sudo[111715]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjlw6ddsm/privsep.sock
Oct 14 08:59:22 np0005486759.ooo.test sudo[111715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:23 np0005486759.ooo.test sudo[111715]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:23 np0005486759.ooo.test sudo[111726]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmyvevpk4/privsep.sock
Oct 14 08:59:23 np0005486759.ooo.test sudo[111726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:24 np0005486759.ooo.test sudo[111726]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:24 np0005486759.ooo.test sudo[111737]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaybuyw91/privsep.sock
Oct 14 08:59:24 np0005486759.ooo.test sudo[111737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:25 np0005486759.ooo.test sudo[111737]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:25 np0005486759.ooo.test sudo[111748]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1svct_br/privsep.sock
Oct 14 08:59:25 np0005486759.ooo.test sudo[111748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:26 np0005486759.ooo.test sudo[111748]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:26 np0005486759.ooo.test sudo[111759]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp63338hw_/privsep.sock
Oct 14 08:59:26 np0005486759.ooo.test sudo[111759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:27 np0005486759.ooo.test sudo[111759]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:27 np0005486759.ooo.test sudo[111770]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph54kaqdc/privsep.sock
Oct 14 08:59:27 np0005486759.ooo.test sudo[111770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:28 np0005486759.ooo.test sudo[111770]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:28 np0005486759.ooo.test sudo[111787]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpej5dee3r/privsep.sock
Oct 14 08:59:28 np0005486759.ooo.test sudo[111787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:28 np0005486759.ooo.test sudo[111787]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:59:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:59:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:59:29 np0005486759.ooo.test podman[111794]: 2025-10-14 08:59:29.107861681 +0000 UTC m=+0.088556125 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, distribution-scope=public, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20250721.1)
Oct 14 08:59:29 np0005486759.ooo.test podman[111794]: 2025-10-14 08:59:29.12229167 +0000 UTC m=+0.102986164 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=2, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:04:03)
Oct 14 08:59:29 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:59:29 np0005486759.ooo.test podman[111793]: 2025-10-14 08:59:29.085535597 +0000 UTC m=+0.074777377 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, container_name=iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Oct 14 08:59:29 np0005486759.ooo.test podman[111821]: 2025-10-14 08:59:29.174763892 +0000 UTC m=+0.085354535 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1)
Oct 14 08:59:29 np0005486759.ooo.test podman[111793]: 2025-10-14 08:59:29.226825121 +0000 UTC m=+0.216066941 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, config_id=tripleo_step3, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:59:29 np0005486759.ooo.test podman[111821]: 2025-10-14 08:59:29.227435 +0000 UTC m=+0.138025633 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.12, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Oct 14 08:59:29 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:59:29 np0005486759.ooo.test sudo[111862]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe4hp1222/privsep.sock
Oct 14 08:59:29 np0005486759.ooo.test sudo[111862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:29 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:59:29 np0005486759.ooo.test sudo[111862]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:30 np0005486759.ooo.test systemd[1]: tmp-crun.3ISnv0.mount: Deactivated successfully.
Oct 14 08:59:30 np0005486759.ooo.test sudo[111873]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuulqdzb3/privsep.sock
Oct 14 08:59:30 np0005486759.ooo.test sudo[111873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:30 np0005486759.ooo.test sudo[111873]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:31 np0005486759.ooo.test sudo[111884]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb_yw04xs/privsep.sock
Oct 14 08:59:31 np0005486759.ooo.test sudo[111884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:31 np0005486759.ooo.test sudo[111884]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:31 np0005486759.ooo.test sudo[111895]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdflo_o82/privsep.sock
Oct 14 08:59:31 np0005486759.ooo.test sudo[111895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:32 np0005486759.ooo.test sudo[111895]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:32 np0005486759.ooo.test sudo[111906]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp475g1a5y/privsep.sock
Oct 14 08:59:32 np0005486759.ooo.test sudo[111906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:33 np0005486759.ooo.test sudo[111906]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 08:59:33 np0005486759.ooo.test systemd[1]: tmp-crun.8EPh2U.mount: Deactivated successfully.
Oct 14 08:59:33 np0005486759.ooo.test podman[111916]: 2025-10-14 08:59:33.545409995 +0000 UTC m=+0.086567284 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.33.12, batch=17.1_20250721.1)
Oct 14 08:59:33 np0005486759.ooo.test sudo[111952]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppzqfbxfq/privsep.sock
Oct 14 08:59:33 np0005486759.ooo.test sudo[111952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:33 np0005486759.ooo.test podman[111916]: 2025-10-14 08:59:33.763444386 +0000 UTC m=+0.304601655 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 08:59:33 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 08:59:34 np0005486759.ooo.test sudo[111952]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:34 np0005486759.ooo.test sudo[111963]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn_i9ng3v/privsep.sock
Oct 14 08:59:34 np0005486759.ooo.test sudo[111963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:35 np0005486759.ooo.test sudo[111963]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:35 np0005486759.ooo.test sudo[111974]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpciesu_4l/privsep.sock
Oct 14 08:59:35 np0005486759.ooo.test sudo[111974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 08:59:35 np0005486759.ooo.test podman[111976]: 2025-10-14 08:59:35.581883962 +0000 UTC m=+0.064551569 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 08:59:35 np0005486759.ooo.test podman[111976]: 2025-10-14 08:59:35.972382728 +0000 UTC m=+0.455050335 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack 
TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, build-date=2025-07-21T14:48:37)
Oct 14 08:59:35 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 08:59:36 np0005486759.ooo.test sudo[111974]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:36 np0005486759.ooo.test sudo[112008]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3upi907c/privsep.sock
Oct 14 08:59:36 np0005486759.ooo.test sudo[112008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:36 np0005486759.ooo.test sudo[112008]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:37 np0005486759.ooo.test sudo[112019]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwe9zqs0l/privsep.sock
Oct 14 08:59:37 np0005486759.ooo.test sudo[112019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:37 np0005486759.ooo.test sudo[112019]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 08:59:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 08:59:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 08:59:37 np0005486759.ooo.test podman[112024]: 2025-10-14 08:59:37.918840364 +0000 UTC m=+0.081462734 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20250721.1, release=1, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:59:37 np0005486759.ooo.test podman[112024]: 2025-10-14 08:59:37.954492323 +0000 UTC m=+0.117114694 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, release=1, io.buildah.version=1.33.12, 
architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 08:59:37 np0005486759.ooo.test systemd[1]: tmp-crun.RXUwBq.mount: Deactivated successfully.
Oct 14 08:59:37 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 08:59:37 np0005486759.ooo.test podman[112026]: 2025-10-14 08:59:37.971192723 +0000 UTC m=+0.130831940 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1, version=17.1.9, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 08:59:38 np0005486759.ooo.test systemd[1]: tmp-crun.vdVqub.mount: Deactivated successfully.
Oct 14 08:59:38 np0005486759.ooo.test podman[112026]: 2025-10-14 08:59:38.017353818 +0000 UTC m=+0.176993065 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, release=1, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute)
Oct 14 08:59:38 np0005486759.ooo.test podman[112026]: unhealthy
Oct 14 08:59:38 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:38 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 08:59:38 np0005486759.ooo.test podman[112027]: 2025-10-14 08:59:38.021046034 +0000 UTC m=+0.180987191 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 08:59:38 np0005486759.ooo.test sudo[112090]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi7c_ts6q/privsep.sock
Oct 14 08:59:38 np0005486759.ooo.test sudo[112090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:38 np0005486759.ooo.test podman[112027]: 2025-10-14 08:59:38.107251045 +0000 UTC m=+0.267192182 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, release=1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47)
Oct 14 08:59:38 np0005486759.ooo.test podman[112027]: unhealthy
Oct 14 08:59:38 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:38 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 08:59:38 np0005486759.ooo.test sudo[112090]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:38 np0005486759.ooo.test sudo[112107]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0ah5deos/privsep.sock
Oct 14 08:59:39 np0005486759.ooo.test sudo[112107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:39 np0005486759.ooo.test sudo[112107]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:39 np0005486759.ooo.test sudo[112118]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp91dzff72/privsep.sock
Oct 14 08:59:39 np0005486759.ooo.test sudo[112118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:40 np0005486759.ooo.test sudo[112118]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:40 np0005486759.ooo.test sudo[112129]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6tbdrgl6/privsep.sock
Oct 14 08:59:40 np0005486759.ooo.test sudo[112129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:41 np0005486759.ooo.test sudo[112129]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:41 np0005486759.ooo.test sudo[112140]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2h40mvul/privsep.sock
Oct 14 08:59:41 np0005486759.ooo.test sudo[112140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:42 np0005486759.ooo.test sudo[112140]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:42 np0005486759.ooo.test sudo[112151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl_qt9hv7/privsep.sock
Oct 14 08:59:42 np0005486759.ooo.test sudo[112151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:43 np0005486759.ooo.test sudo[112151]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:43 np0005486759.ooo.test sudo[112162]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4q751znf/privsep.sock
Oct 14 08:59:43 np0005486759.ooo.test sudo[112162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:44 np0005486759.ooo.test sudo[112162]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:44 np0005486759.ooo.test sudo[112179]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7vcmo6l2/privsep.sock
Oct 14 08:59:44 np0005486759.ooo.test sudo[112179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 08:59:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 08:59:44 np0005486759.ooo.test podman[112181]: 2025-10-14 08:59:44.395148377 +0000 UTC m=+0.083995974 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:59:44 np0005486759.ooo.test podman[112181]: 2025-10-14 08:59:44.410209146 +0000 UTC m=+0.099056763 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1, tcib_managed=true, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 08:59:44 np0005486759.ooo.test podman[112181]: unhealthy
Oct 14 08:59:44 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:44 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 08:59:44 np0005486759.ooo.test podman[112182]: 2025-10-14 08:59:44.445831903 +0000 UTC m=+0.128924811 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-type=git, distribution-scope=public, release=1, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 08:59:44 np0005486759.ooo.test podman[112182]: 2025-10-14 08:59:44.489557443 +0000 UTC m=+0.172650332 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, version=17.1.9, vendor=Red Hat, Inc., distribution-scope=public)
Oct 14 08:59:44 np0005486759.ooo.test podman[112182]: unhealthy
Oct 14 08:59:44 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 08:59:44 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 08:59:44 np0005486759.ooo.test sudo[112179]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:45 np0005486759.ooo.test sudo[112232]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph9pwttbi/privsep.sock
Oct 14 08:59:45 np0005486759.ooo.test sudo[112232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:45 np0005486759.ooo.test sudo[112232]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:46 np0005486759.ooo.test sudo[112243]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsk7_1z7_/privsep.sock
Oct 14 08:59:46 np0005486759.ooo.test sudo[112243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:46 np0005486759.ooo.test sudo[112243]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:46 np0005486759.ooo.test sudo[112254]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg4epaqeh/privsep.sock
Oct 14 08:59:46 np0005486759.ooo.test sudo[112254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:47 np0005486759.ooo.test sudo[112254]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:47 np0005486759.ooo.test sudo[112265]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp2csmp7x/privsep.sock
Oct 14 08:59:47 np0005486759.ooo.test sudo[112265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:48 np0005486759.ooo.test sudo[112265]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:48 np0005486759.ooo.test sudo[112276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdj19mtxa/privsep.sock
Oct 14 08:59:48 np0005486759.ooo.test sudo[112276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:49 np0005486759.ooo.test sudo[112276]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:49 np0005486759.ooo.test sudo[112293]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1fu_7s4f/privsep.sock
Oct 14 08:59:49 np0005486759.ooo.test sudo[112293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:50 np0005486759.ooo.test sudo[112293]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:50 np0005486759.ooo.test sudo[112304]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqq5fgs8z/privsep.sock
Oct 14 08:59:50 np0005486759.ooo.test sudo[112304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:51 np0005486759.ooo.test sudo[112304]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:51 np0005486759.ooo.test sudo[112315]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkf9rldaf/privsep.sock
Oct 14 08:59:51 np0005486759.ooo.test sudo[112315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:52 np0005486759.ooo.test sudo[112315]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:52 np0005486759.ooo.test sudo[112326]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6vqmyezh/privsep.sock
Oct 14 08:59:52 np0005486759.ooo.test sudo[112326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:53 np0005486759.ooo.test sudo[112326]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:53 np0005486759.ooo.test sudo[112337]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxf7rlfta/privsep.sock
Oct 14 08:59:53 np0005486759.ooo.test sudo[112337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:54 np0005486759.ooo.test sudo[112337]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:54 np0005486759.ooo.test sudo[112348]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkr_ifbaw/privsep.sock
Oct 14 08:59:54 np0005486759.ooo.test sudo[112348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:54 np0005486759.ooo.test sudo[112348]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:55 np0005486759.ooo.test sudo[112365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprtlt9y6o/privsep.sock
Oct 14 08:59:55 np0005486759.ooo.test sudo[112365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:55 np0005486759.ooo.test sudo[112365]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:56 np0005486759.ooo.test sudo[112376]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7efvmuhb/privsep.sock
Oct 14 08:59:56 np0005486759.ooo.test sudo[112376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:56 np0005486759.ooo.test sudo[112376]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:56 np0005486759.ooo.test sudo[112387]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpppuyxmn6/privsep.sock
Oct 14 08:59:56 np0005486759.ooo.test sudo[112387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:57 np0005486759.ooo.test sudo[112387]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:57 np0005486759.ooo.test sudo[112398]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphwhkftuf/privsep.sock
Oct 14 08:59:57 np0005486759.ooo.test sudo[112398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:58 np0005486759.ooo.test sudo[112398]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:58 np0005486759.ooo.test sudo[112409]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_pmjehm4/privsep.sock
Oct 14 08:59:58 np0005486759.ooo.test sudo[112409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 08:59:59 np0005486759.ooo.test sudo[112409]: pam_unix(sudo:session): session closed for user root
Oct 14 08:59:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 08:59:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 08:59:59 np0005486759.ooo.test podman[112414]: 2025-10-14 08:59:59.334613465 +0000 UTC m=+0.066791968 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, build-date=2025-07-21T14:48:37, container_name=nova_compute, io.openshift.expose-services=, config_id=tripleo_step5, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12)
Oct 14 08:59:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 08:59:59 np0005486759.ooo.test podman[112416]: 2025-10-14 08:59:59.388852872 +0000 UTC m=+0.117669221 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=2, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git)
Oct 14 08:59:59 np0005486759.ooo.test podman[112416]: 2025-10-14 08:59:59.423017375 +0000 UTC m=+0.151833714 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12)
Oct 14 08:59:59 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 08:59:59 np0005486759.ooo.test podman[112444]: 2025-10-14 08:59:59.439650253 +0000 UTC m=+0.082313092 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, release=1, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 08:59:59 np0005486759.ooo.test podman[112444]: 2025-10-14 08:59:59.449390765 +0000 UTC m=+0.092053454 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, release=1, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20250721.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.openshift.expose-services=)
Oct 14 08:59:59 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 08:59:59 np0005486759.ooo.test podman[112414]: 2025-10-14 08:59:59.46562999 +0000 UTC m=+0.197808543 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, batch=17.1_20250721.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, build-date=2025-07-21T14:48:37, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Oct 14 08:59:59 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 08:59:59 np0005486759.ooo.test sudo[112484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa9kzqg4w/privsep.sock
Oct 14 08:59:59 np0005486759.ooo.test sudo[112484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:00 np0005486759.ooo.test sudo[112484]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:00 np0005486759.ooo.test sudo[112501]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwjhq1dii/privsep.sock
Oct 14 09:00:00 np0005486759.ooo.test sudo[112501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:01 np0005486759.ooo.test sudo[112501]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:01 np0005486759.ooo.test CROND[112508]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Oct 14 09:00:01 np0005486759.ooo.test sudo[112516]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdifbf324/privsep.sock
Oct 14 09:00:01 np0005486759.ooo.test sudo[112516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:01 np0005486759.ooo.test sudo[112516]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:02 np0005486759.ooo.test sudo[112527]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw2eogccm/privsep.sock
Oct 14 09:00:02 np0005486759.ooo.test sudo[112527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:02 np0005486759.ooo.test sudo[112527]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:03 np0005486759.ooo.test sudo[112538]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps3lnbfkb/privsep.sock
Oct 14 09:00:03 np0005486759.ooo.test sudo[112538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:03 np0005486759.ooo.test sudo[112538]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:04 np0005486759.ooo.test sudo[112549]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyhwyfdp5/privsep.sock
Oct 14 09:00:04 np0005486759.ooo.test sudo[112549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:00:04 np0005486759.ooo.test podman[112551]: 2025-10-14 09:00:04.114102404 +0000 UTC m=+0.080881476 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Oct 14 09:00:04 np0005486759.ooo.test podman[112551]: 2025-10-14 09:00:04.301331637 +0000 UTC m=+0.268110769 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 09:00:04 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:00:04 np0005486759.ooo.test sudo[112549]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:04 np0005486759.ooo.test sudo[112590]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7d46109_/privsep.sock
Oct 14 09:00:04 np0005486759.ooo.test sudo[112590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:05 np0005486759.ooo.test sudo[112590]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:05 np0005486759.ooo.test sudo[112607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbixbplya/privsep.sock
Oct 14 09:00:05 np0005486759.ooo.test sudo[112607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:00:06 np0005486759.ooo.test podman[112610]: 2025-10-14 09:00:06.444784571 +0000 UTC m=+0.076042706 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37)
Oct 14 09:00:06 np0005486759.ooo.test sudo[112607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:06 np0005486759.ooo.test sudo[112641]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp243wjlzb/privsep.sock
Oct 14 09:00:06 np0005486759.ooo.test sudo[112641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:06 np0005486759.ooo.test podman[112610]: 2025-10-14 09:00:06.847388333 +0000 UTC m=+0.478646438 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20250721.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Oct 14 09:00:06 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:00:07 np0005486759.ooo.test sudo[112641]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:07 np0005486759.ooo.test sudo[112652]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjsptz0fc/privsep.sock
Oct 14 09:00:07 np0005486759.ooo.test sudo[112652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:08 np0005486759.ooo.test sudo[112652]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:00:08 np0005486759.ooo.test podman[112660]: 2025-10-14 09:00:08.338129737 +0000 UTC m=+0.081210697 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 09:00:08 np0005486759.ooo.test podman[112659]: 2025-10-14 09:00:08.319742654 +0000 UTC m=+0.068387137 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, release=1, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, tcib_managed=true, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 09:00:08 np0005486759.ooo.test podman[112656]: 2025-10-14 09:00:08.376763898 +0000 UTC m=+0.128125606 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, release=1, vendor=Red Hat, Inc.)
Oct 14 09:00:08 np0005486759.ooo.test podman[112656]: 2025-10-14 09:00:08.383699744 +0000 UTC m=+0.135061402 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, batch=17.1_20250721.1, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, vcs-type=git)
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:00:08 np0005486759.ooo.test podman[112660]: 2025-10-14 09:00:08.397770541 +0000 UTC m=+0.140851501 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Oct 14 09:00:08 np0005486759.ooo.test podman[112660]: unhealthy
Oct 14 09:00:08 np0005486759.ooo.test podman[112659]: 2025-10-14 09:00:08.405401689 +0000 UTC m=+0.154046182 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.33.12, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:00:08 np0005486759.ooo.test podman[112659]: unhealthy
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:08 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:00:08 np0005486759.ooo.test sudo[112719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpifdur19t/privsep.sock
Oct 14 09:00:08 np0005486759.ooo.test sudo[112719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:09 np0005486759.ooo.test sudo[112719]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:09 np0005486759.ooo.test sudo[112730]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf1qcxpur/privsep.sock
Oct 14 09:00:09 np0005486759.ooo.test sudo[112730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:10 np0005486759.ooo.test sudo[112730]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:10 np0005486759.ooo.test sudo[112741]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpli7nkj50/privsep.sock
Oct 14 09:00:10 np0005486759.ooo.test sudo[112741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:11 np0005486759.ooo.test sudo[112741]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:11 np0005486759.ooo.test sudo[112758]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd0f9yjow/privsep.sock
Oct 14 09:00:11 np0005486759.ooo.test sudo[112758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:11 np0005486759.ooo.test sudo[112758]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:12 np0005486759.ooo.test sudo[112769]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvd3zyale/privsep.sock
Oct 14 09:00:12 np0005486759.ooo.test sudo[112769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:12 np0005486759.ooo.test sudo[112769]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:12 np0005486759.ooo.test sudo[112780]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptq766qbf/privsep.sock
Oct 14 09:00:12 np0005486759.ooo.test sudo[112780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:13 np0005486759.ooo.test sudo[112780]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:13 np0005486759.ooo.test sudo[112791]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0sdlb34n/privsep.sock
Oct 14 09:00:13 np0005486759.ooo.test sudo[112791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:14 np0005486759.ooo.test sudo[112791]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:14 np0005486759.ooo.test sudo[112802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgh64vxd0/privsep.sock
Oct 14 09:00:14 np0005486759.ooo.test sudo[112802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:00:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:00:14 np0005486759.ooo.test systemd[1]: tmp-crun.Wj339q.mount: Deactivated successfully.
Oct 14 09:00:14 np0005486759.ooo.test podman[112804]: 2025-10-14 09:00:14.719421224 +0000 UTC m=+0.081570198 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 14 09:00:14 np0005486759.ooo.test podman[112804]: 2025-10-14 09:00:14.761317487 +0000 UTC m=+0.123466471 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, version=17.1.9, release=1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team)
Oct 14 09:00:14 np0005486759.ooo.test podman[112804]: unhealthy
Oct 14 09:00:14 np0005486759.ooo.test podman[112805]: 2025-10-14 09:00:14.774494247 +0000 UTC m=+0.134276927 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Oct 14 09:00:14 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:14 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:00:14 np0005486759.ooo.test podman[112805]: 2025-10-14 09:00:14.813383796 +0000 UTC m=+0.173166466 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:00:14 np0005486759.ooo.test podman[112805]: unhealthy
Oct 14 09:00:14 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:14 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:00:15 np0005486759.ooo.test sudo[112802]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:15 np0005486759.ooo.test sudo[112850]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0ssrtfta/privsep.sock
Oct 14 09:00:15 np0005486759.ooo.test sudo[112850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:15 np0005486759.ooo.test systemd[1]: tmp-crun.lHVnoR.mount: Deactivated successfully.
Oct 14 09:00:16 np0005486759.ooo.test sudo[112850]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:16 np0005486759.ooo.test sudo[112867]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbu856fmc/privsep.sock
Oct 14 09:00:16 np0005486759.ooo.test sudo[112867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:16 np0005486759.ooo.test sudo[112867]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:17 np0005486759.ooo.test sudo[112878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp7dodc48/privsep.sock
Oct 14 09:00:17 np0005486759.ooo.test sudo[112878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:17 np0005486759.ooo.test sudo[112878]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:18 np0005486759.ooo.test sudo[112889]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdg63iq4f/privsep.sock
Oct 14 09:00:18 np0005486759.ooo.test sudo[112889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:18 np0005486759.ooo.test sudo[112889]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:19 np0005486759.ooo.test sudo[112900]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc100z5pw/privsep.sock
Oct 14 09:00:19 np0005486759.ooo.test sudo[112900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:19 np0005486759.ooo.test sudo[112900]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:19 np0005486759.ooo.test sudo[112911]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphgtxk4ma/privsep.sock
Oct 14 09:00:19 np0005486759.ooo.test sudo[112911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:20 np0005486759.ooo.test sudo[112911]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:20 np0005486759.ooo.test sudo[112922]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpql1ytc3s/privsep.sock
Oct 14 09:00:20 np0005486759.ooo.test sudo[112922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:21 np0005486759.ooo.test sudo[112922]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:21 np0005486759.ooo.test sudo[112936]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf5ajd0vn/privsep.sock
Oct 14 09:00:21 np0005486759.ooo.test sudo[112936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:22 np0005486759.ooo.test sudo[112936]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:22 np0005486759.ooo.test sudo[112950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxh5wpqex/privsep.sock
Oct 14 09:00:22 np0005486759.ooo.test sudo[112950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:23 np0005486759.ooo.test sudo[112950]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:23 np0005486759.ooo.test sudo[112961]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpniihxvrt/privsep.sock
Oct 14 09:00:23 np0005486759.ooo.test sudo[112961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:24 np0005486759.ooo.test sudo[112961]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:24 np0005486759.ooo.test sudo[112972]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp15qp5d9f/privsep.sock
Oct 14 09:00:24 np0005486759.ooo.test sudo[112972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:25 np0005486759.ooo.test sudo[112972]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:25 np0005486759.ooo.test sudo[112983]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt24dc1av/privsep.sock
Oct 14 09:00:25 np0005486759.ooo.test sudo[112983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:25 np0005486759.ooo.test sudo[112983]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:26 np0005486759.ooo.test sudo[112994]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkmwv1_gn/privsep.sock
Oct 14 09:00:26 np0005486759.ooo.test sudo[112994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:26 np0005486759.ooo.test sudo[112994]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:27 np0005486759.ooo.test sudo[113010]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbnipzmsr/privsep.sock
Oct 14 09:00:27 np0005486759.ooo.test sudo[113010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:27 np0005486759.ooo.test sudo[113010]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:27 np0005486759.ooo.test sudo[113022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpagbp9usm/privsep.sock
Oct 14 09:00:27 np0005486759.ooo.test sudo[113022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:28 np0005486759.ooo.test sudo[113022]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:28 np0005486759.ooo.test sudo[113033]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpde187h47/privsep.sock
Oct 14 09:00:28 np0005486759.ooo.test sudo[113033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:29 np0005486759.ooo.test sudo[113033]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:29 np0005486759.ooo.test sudo[113044]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp2bsnqym/privsep.sock
Oct 14 09:00:29 np0005486759.ooo.test sudo[113044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: tmp-crun.KvHCds.mount: Deactivated successfully.
Oct 14 09:00:29 np0005486759.ooo.test podman[113047]: 2025-10-14 09:00:29.594015675 +0000 UTC m=+0.073200069 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true)
Oct 14 09:00:29 np0005486759.ooo.test podman[113047]: 2025-10-14 09:00:29.625544575 +0000 UTC m=+0.104728969 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step5, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: tmp-crun.lY1yld.mount: Deactivated successfully.
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:00:29 np0005486759.ooo.test podman[113046]: 2025-10-14 09:00:29.646015512 +0000 UTC m=+0.126294949 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, 
description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, release=1, io.openshift.expose-services=)
Oct 14 09:00:29 np0005486759.ooo.test podman[113046]: 2025-10-14 09:00:29.682343552 +0000 UTC m=+0.162623019 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-iscsid, container_name=iscsid, batch=17.1_20250721.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., 
io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, maintainer=OpenStack TripleO Team)
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:00:29 np0005486759.ooo.test podman[113048]: 2025-10-14 09:00:29.695690617 +0000 UTC m=+0.171349630 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=2)
Oct 14 09:00:29 np0005486759.ooo.test podman[113048]: 2025-10-14 09:00:29.732375988 +0000 UTC m=+0.208034971 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true)
Oct 14 09:00:29 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:00:30 np0005486759.ooo.test sudo[113044]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:30 np0005486759.ooo.test sudo[113117]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprvdcda01/privsep.sock
Oct 14 09:00:30 np0005486759.ooo.test sudo[113117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:30 np0005486759.ooo.test sudo[113117]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:31 np0005486759.ooo.test sudo[113128]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp71sttfm9/privsep.sock
Oct 14 09:00:31 np0005486759.ooo.test sudo[113128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:31 np0005486759.ooo.test sudo[113128]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:32 np0005486759.ooo.test sudo[113139]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprv9xw8k8/privsep.sock
Oct 14 09:00:32 np0005486759.ooo.test sudo[113139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:32 np0005486759.ooo.test sudo[113139]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:32 np0005486759.ooo.test sudo[113156]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl1fi0it3/privsep.sock
Oct 14 09:00:32 np0005486759.ooo.test sudo[113156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:33 np0005486759.ooo.test sudo[113156]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:33 np0005486759.ooo.test sudo[113167]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8sdx65be/privsep.sock
Oct 14 09:00:33 np0005486759.ooo.test sudo[113167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:34 np0005486759.ooo.test sudo[113167]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:00:34 np0005486759.ooo.test systemd[1]: tmp-crun.S1GTFw.mount: Deactivated successfully.
Oct 14 09:00:34 np0005486759.ooo.test podman[113173]: 2025-10-14 09:00:34.426259334 +0000 UTC m=+0.070946588 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., container_name=metrics_qdr, version=17.1.9, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20250721.1, 
build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible)
Oct 14 09:00:34 np0005486759.ooo.test sudo[113206]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq_9i_cj8/privsep.sock
Oct 14 09:00:34 np0005486759.ooo.test sudo[113206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:34 np0005486759.ooo.test podman[113173]: 2025-10-14 09:00:34.648361152 +0000 UTC m=+0.293048436 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, name=rhosp17/openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., container_name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, 
architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible)
Oct 14 09:00:34 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:00:35 np0005486759.ooo.test sudo[113206]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:35 np0005486759.ooo.test sudo[113217]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpps_ubo79/privsep.sock
Oct 14 09:00:35 np0005486759.ooo.test sudo[113217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:35 np0005486759.ooo.test sudo[113217]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:36 np0005486759.ooo.test sudo[113228]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr4t9x2h1/privsep.sock
Oct 14 09:00:36 np0005486759.ooo.test sudo[113228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:36 np0005486759.ooo.test sudo[113228]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:00:37 np0005486759.ooo.test systemd[1]: tmp-crun.3lqLsd.mount: Deactivated successfully.
Oct 14 09:00:37 np0005486759.ooo.test podman[113233]: 2025-10-14 09:00:37.016906207 +0000 UTC m=+0.098356140 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, 
architecture=x86_64, build-date=2025-07-21T14:48:37, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Oct 14 09:00:37 np0005486759.ooo.test sudo[113260]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu_ofp9_j/privsep.sock
Oct 14 09:00:37 np0005486759.ooo.test sudo[113260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:37 np0005486759.ooo.test podman[113233]: 2025-10-14 09:00:37.396189763 +0000 UTC m=+0.477639676 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1)
Oct 14 09:00:37 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:00:37 np0005486759.ooo.test sudo[113260]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:38 np0005486759.ooo.test sudo[113278]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgpwxauab/privsep.sock
Oct 14 09:00:38 np0005486759.ooo.test sudo[113278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:38 np0005486759.ooo.test sudo[113278]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:00:38 np0005486759.ooo.test podman[113285]: 2025-10-14 09:00:38.820012425 +0000 UTC m=+0.071262618 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, release=1, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, tcib_managed=true, version=17.1.9, build-date=2025-07-21T14:45:33)
Oct 14 09:00:38 np0005486759.ooo.test podman[113284]: 2025-10-14 09:00:38.795199203 +0000 UTC m=+0.054732582 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=logrotate_crond, name=rhosp17/openstack-cron, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true)
Oct 14 09:00:38 np0005486759.ooo.test podman[113291]: 2025-10-14 09:00:38.853309151 +0000 UTC m=+0.104648626 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:00:38 np0005486759.ooo.test podman[113291]: 2025-10-14 09:00:38.866055227 +0000 UTC m=+0.117394722 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vcs-type=git, build-date=2025-07-21T15:29:47, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, distribution-scope=public, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 09:00:38 np0005486759.ooo.test podman[113291]: unhealthy
Oct 14 09:00:38 np0005486759.ooo.test podman[113284]: 2025-10-14 09:00:38.873383905 +0000 UTC m=+0.132917294 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=)
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:00:38 np0005486759.ooo.test podman[113285]: 2025-10-14 09:00:38.905503814 +0000 UTC m=+0.156753997 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, 
container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1)
Oct 14 09:00:38 np0005486759.ooo.test podman[113285]: unhealthy
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:38 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:00:38 np0005486759.ooo.test sudo[113344]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprifzd8mz/privsep.sock
Oct 14 09:00:38 np0005486759.ooo.test sudo[113344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:39 np0005486759.ooo.test sudo[113344]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:39 np0005486759.ooo.test sudo[113355]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz2edfpms/privsep.sock
Oct 14 09:00:39 np0005486759.ooo.test sudo[113355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:40 np0005486759.ooo.test sudo[113355]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:40 np0005486759.ooo.test sudo[113366]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo2c10mbj/privsep.sock
Oct 14 09:00:40 np0005486759.ooo.test sudo[113366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:41 np0005486759.ooo.test sudo[113366]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:41 np0005486759.ooo.test sudo[113377]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3fal5zeq/privsep.sock
Oct 14 09:00:41 np0005486759.ooo.test sudo[113377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:42 np0005486759.ooo.test sudo[113377]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:42 np0005486759.ooo.test sudo[113388]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp27ca0cvz/privsep.sock
Oct 14 09:00:42 np0005486759.ooo.test sudo[113388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:43 np0005486759.ooo.test sudo[113388]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:43 np0005486759.ooo.test sudo[113405]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv8n0iike/privsep.sock
Oct 14 09:00:43 np0005486759.ooo.test sudo[113405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:43 np0005486759.ooo.test sudo[113405]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:44 np0005486759.ooo.test sudo[113416]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpssfvnekp/privsep.sock
Oct 14 09:00:44 np0005486759.ooo.test sudo[113416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:44 np0005486759.ooo.test sudo[113416]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:00:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:00:44 np0005486759.ooo.test systemd[1]: tmp-crun.ijPCQE.mount: Deactivated successfully.
Oct 14 09:00:44 np0005486759.ooo.test podman[113422]: 2025-10-14 09:00:44.917216066 +0000 UTC m=+0.078870293 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 09:00:44 np0005486759.ooo.test podman[113423]: 2025-10-14 09:00:44.955672813 +0000 UTC m=+0.112613393 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, container_name=ovn_controller, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:00:44 np0005486759.ooo.test podman[113422]: 2025-10-14 09:00:44.962070671 +0000 UTC m=+0.123724898 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:00:44 np0005486759.ooo.test podman[113422]: unhealthy
Oct 14 09:00:44 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:44 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:00:45 np0005486759.ooo.test podman[113423]: 2025-10-14 09:00:45.019191639 +0000 UTC m=+0.176132259 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1, container_name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 09:00:45 np0005486759.ooo.test podman[113423]: unhealthy
Oct 14 09:00:45 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:00:45 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:00:45 np0005486759.ooo.test sudo[113465]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp42xy0hus/privsep.sock
Oct 14 09:00:45 np0005486759.ooo.test sudo[113465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:45 np0005486759.ooo.test sudo[113465]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:45 np0005486759.ooo.test sudo[113476]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa_r5v66t/privsep.sock
Oct 14 09:00:45 np0005486759.ooo.test sudo[113476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:46 np0005486759.ooo.test sudo[113476]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:46 np0005486759.ooo.test sudo[113487]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzvjinrtb/privsep.sock
Oct 14 09:00:46 np0005486759.ooo.test sudo[113487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:47 np0005486759.ooo.test sudo[113487]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:47 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:00:47 np0005486759.ooo.test recover_tripleo_nova_virtqemud[113494]: 47951
Oct 14 09:00:47 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:00:47 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:00:47 np0005486759.ooo.test sudo[113500]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxo0x7xrd/privsep.sock
Oct 14 09:00:47 np0005486759.ooo.test sudo[113500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:48 np0005486759.ooo.test sudo[113500]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:48 np0005486759.ooo.test sudo[113514]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7y9wgihm/privsep.sock
Oct 14 09:00:48 np0005486759.ooo.test sudo[113514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:49 np0005486759.ooo.test sudo[113514]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:49 np0005486759.ooo.test sudo[113528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpljbibq0_/privsep.sock
Oct 14 09:00:49 np0005486759.ooo.test sudo[113528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:49 np0005486759.ooo.test sudo[113528]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:50 np0005486759.ooo.test sudo[113539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqdy9kc_f/privsep.sock
Oct 14 09:00:50 np0005486759.ooo.test sudo[113539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:50 np0005486759.ooo.test sudo[113539]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:51 np0005486759.ooo.test sudo[113550]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplogc9sbs/privsep.sock
Oct 14 09:00:51 np0005486759.ooo.test sudo[113550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:51 np0005486759.ooo.test sudo[113550]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:51 np0005486759.ooo.test sudo[113561]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsq5n7gpv/privsep.sock
Oct 14 09:00:51 np0005486759.ooo.test sudo[113561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:52 np0005486759.ooo.test sudo[113561]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:52 np0005486759.ooo.test sudo[113572]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpti4ee5vo/privsep.sock
Oct 14 09:00:52 np0005486759.ooo.test sudo[113572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:53 np0005486759.ooo.test sudo[113572]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:53 np0005486759.ooo.test sudo[113583]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv8ip6qy2/privsep.sock
Oct 14 09:00:53 np0005486759.ooo.test sudo[113583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:54 np0005486759.ooo.test sudo[113583]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:54 np0005486759.ooo.test sudo[113600]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpifb4ztzg/privsep.sock
Oct 14 09:00:54 np0005486759.ooo.test sudo[113600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:55 np0005486759.ooo.test sudo[113600]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:55 np0005486759.ooo.test sudo[113611]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyngqsgen/privsep.sock
Oct 14 09:00:55 np0005486759.ooo.test sudo[113611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:56 np0005486759.ooo.test sudo[113611]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:56 np0005486759.ooo.test sudo[113622]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3brv9ecg/privsep.sock
Oct 14 09:00:56 np0005486759.ooo.test sudo[113622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:56 np0005486759.ooo.test sudo[113622]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:57 np0005486759.ooo.test sudo[113633]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps7j7jihd/privsep.sock
Oct 14 09:00:57 np0005486759.ooo.test sudo[113633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:57 np0005486759.ooo.test sudo[113633]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:58 np0005486759.ooo.test sudo[113646]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzlo97l62/privsep.sock
Oct 14 09:00:58 np0005486759.ooo.test sudo[113646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:58 np0005486759.ooo.test sudo[113646]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:58 np0005486759.ooo.test sudo[113657]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe0zu85n3/privsep.sock
Oct 14 09:00:58 np0005486759.ooo.test sudo[113657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:59 np0005486759.ooo.test CROND[112507]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Oct 14 09:00:59 np0005486759.ooo.test sudo[113657]: pam_unix(sudo:session): session closed for user root
Oct 14 09:00:59 np0005486759.ooo.test sudo[113674]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_5u1d79b/privsep.sock
Oct 14 09:00:59 np0005486759.ooo.test sudo[113674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:00:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:00:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:00:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:00:59 np0005486759.ooo.test systemd[1]: tmp-crun.Idjscp.mount: Deactivated successfully.
Oct 14 09:00:59 np0005486759.ooo.test podman[113677]: 2025-10-14 09:00:59.910742637 +0000 UTC m=+0.090629750 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 09:00:59 np0005486759.ooo.test podman[113678]: 2025-10-14 09:00:59.939522082 +0000 UTC m=+0.117061692 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=2, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:00:59 np0005486759.ooo.test podman[113678]: 2025-10-14 09:00:59.973603182 +0000 UTC m=+0.151142792 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, managed_by=tripleo_ansible, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9)
Oct 14 09:00:59 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:00:59 np0005486759.ooo.test podman[113677]: 2025-10-14 09:00:59.995151882 +0000 UTC m=+0.175039015 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.9, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Oct 14 09:01:00 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:01:00 np0005486759.ooo.test podman[113676]: 2025-10-14 09:01:00.045197129 +0000 UTC m=+0.224329439 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, release=1, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 09:01:00 np0005486759.ooo.test podman[113676]: 2025-10-14 09:01:00.056368436 +0000 UTC m=+0.235500706 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-07-21T13:27:15, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, architecture=x86_64, release=1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:01:00 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:01:00 np0005486759.ooo.test sudo[113674]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:00 np0005486759.ooo.test sudo[113752]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgccokrql/privsep.sock
Oct 14 09:01:00 np0005486759.ooo.test sudo[113752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:01 np0005486759.ooo.test sudo[113752]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:01 np0005486759.ooo.test CROND[113759]: (root) CMD (run-parts /etc/cron.hourly)
Oct 14 09:01:01 np0005486759.ooo.test run-parts[113762]: (/etc/cron.hourly) starting 0anacron
Oct 14 09:01:01 np0005486759.ooo.test anacron[113770]: Anacron started on 2025-10-14
Oct 14 09:01:01 np0005486759.ooo.test anacron[113770]: Will run job `cron.daily' in 46 min.
Oct 14 09:01:01 np0005486759.ooo.test anacron[113770]: Will run job `cron.weekly' in 66 min.
Oct 14 09:01:01 np0005486759.ooo.test anacron[113770]: Will run job `cron.monthly' in 86 min.
Oct 14 09:01:01 np0005486759.ooo.test anacron[113770]: Jobs will be executed sequentially
Oct 14 09:01:01 np0005486759.ooo.test run-parts[113772]: (/etc/cron.hourly) finished 0anacron
Oct 14 09:01:01 np0005486759.ooo.test CROND[113758]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 14 09:01:01 np0005486759.ooo.test sudo[113778]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw6cbakum/privsep.sock
Oct 14 09:01:01 np0005486759.ooo.test sudo[113778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:01 np0005486759.ooo.test CROND[113782]: (root) CMD (run-parts /etc/cron.hourly)
Oct 14 09:01:01 np0005486759.ooo.test run-parts[113785]: (/etc/cron.hourly) starting 0anacron
Oct 14 09:01:01 np0005486759.ooo.test run-parts[113791]: (/etc/cron.hourly) finished 0anacron
Oct 14 09:01:01 np0005486759.ooo.test CROND[113781]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 14 09:01:02 np0005486759.ooo.test sudo[113778]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:02 np0005486759.ooo.test sudo[113800]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp362cdvd/privsep.sock
Oct 14 09:01:02 np0005486759.ooo.test sudo[113800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:03 np0005486759.ooo.test sudo[113800]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:03 np0005486759.ooo.test sudo[113811]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsy90e27e/privsep.sock
Oct 14 09:01:03 np0005486759.ooo.test sudo[113811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:03 np0005486759.ooo.test sudo[113811]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:04 np0005486759.ooo.test sudo[113822]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2si3fpsn/privsep.sock
Oct 14 09:01:04 np0005486759.ooo.test sudo[113822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:04 np0005486759.ooo.test sudo[113822]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:01:04 np0005486759.ooo.test podman[113830]: 2025-10-14 09:01:04.904096877 +0000 UTC m=+0.076066667 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12)
Oct 14 09:01:05 np0005486759.ooo.test sudo[113867]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9fg4joba/privsep.sock
Oct 14 09:01:05 np0005486759.ooo.test sudo[113867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:05 np0005486759.ooo.test podman[113830]: 2025-10-14 09:01:05.099203526 +0000 UTC m=+0.271173306 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., 
config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, architecture=x86_64, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 09:01:05 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:01:05 np0005486759.ooo.test sudo[113867]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:05 np0005486759.ooo.test sudo[113878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpel1atge4/privsep.sock
Oct 14 09:01:05 np0005486759.ooo.test sudo[113878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:06 np0005486759.ooo.test sudo[113878]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:06 np0005486759.ooo.test sudo[113889]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplu0cmso0/privsep.sock
Oct 14 09:01:06 np0005486759.ooo.test sudo[113889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:07 np0005486759.ooo.test sudo[113889]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:01:07 np0005486759.ooo.test systemd[1]: tmp-crun.gAumv0.mount: Deactivated successfully.
Oct 14 09:01:07 np0005486759.ooo.test podman[113894]: 2025-10-14 09:01:07.553403624 +0000 UTC m=+0.084343374 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:01:07 np0005486759.ooo.test sudo[113923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp594fshj4/privsep.sock
Oct 14 09:01:07 np0005486759.ooo.test sudo[113923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:07 np0005486759.ooo.test podman[113894]: 2025-10-14 09:01:07.931196635 +0000 UTC m=+0.462136325 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, 
name=rhosp17/openstack-nova-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4)
Oct 14 09:01:07 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:01:08 np0005486759.ooo.test sudo[113923]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:08 np0005486759.ooo.test sudo[113934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3yjcqtc8/privsep.sock
Oct 14 09:01:08 np0005486759.ooo.test sudo[113934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:09 np0005486759.ooo.test sudo[113934]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:01:09 np0005486759.ooo.test podman[113941]: 2025-10-14 09:01:09.423441795 +0000 UTC m=+0.088183033 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, release=1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 09:01:09 np0005486759.ooo.test podman[113941]: 2025-10-14 09:01:09.462287613 +0000 UTC m=+0.127028921 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, release=1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, 
container_name=ceilometer_agent_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=)
Oct 14 09:01:09 np0005486759.ooo.test podman[113941]: unhealthy
Oct 14 09:01:09 np0005486759.ooo.test podman[113940]: 2025-10-14 09:01:09.475638298 +0000 UTC m=+0.143707530 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-07-21T13:07:52, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, vcs-type=git, version=17.1.9, config_id=tripleo_step4, release=1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1)
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:01:09 np0005486759.ooo.test podman[113940]: 2025-10-14 09:01:09.51136197 +0000 UTC m=+0.179431172 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-07-21T13:07:52, release=1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: tmp-crun.M8M9GR.mount: Deactivated successfully.
Oct 14 09:01:09 np0005486759.ooo.test podman[113942]: 2025-10-14 09:01:09.528545974 +0000 UTC m=+0.191460566 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, tcib_managed=true)
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:01:09 np0005486759.ooo.test podman[113942]: 2025-10-14 09:01:09.592547405 +0000 UTC m=+0.255461937 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, version=17.1.9, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 09:01:09 np0005486759.ooo.test podman[113942]: unhealthy
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:09 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:01:09 np0005486759.ooo.test sudo[114008]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0n7gs68z/privsep.sock
Oct 14 09:01:09 np0005486759.ooo.test sudo[114008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:10 np0005486759.ooo.test sudo[114008]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:10 np0005486759.ooo.test sudo[114025]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpihnbc_ky/privsep.sock
Oct 14 09:01:10 np0005486759.ooo.test sudo[114025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:11 np0005486759.ooo.test sudo[114025]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:11 np0005486759.ooo.test sudo[114036]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpldcec_pl/privsep.sock
Oct 14 09:01:11 np0005486759.ooo.test sudo[114036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:12 np0005486759.ooo.test sudo[114036]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:12 np0005486759.ooo.test sudo[114047]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx3grs10r/privsep.sock
Oct 14 09:01:12 np0005486759.ooo.test sudo[114047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:13 np0005486759.ooo.test sudo[114047]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:13 np0005486759.ooo.test sudo[114058]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6bq94irr/privsep.sock
Oct 14 09:01:13 np0005486759.ooo.test sudo[114058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:13 np0005486759.ooo.test sudo[114058]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:14 np0005486759.ooo.test sudo[114069]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp10i35m60/privsep.sock
Oct 14 09:01:14 np0005486759.ooo.test sudo[114069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:14 np0005486759.ooo.test sudo[114069]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:15 np0005486759.ooo.test sudo[114080]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdxjcv03m/privsep.sock
Oct 14 09:01:15 np0005486759.ooo.test sudo[114080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:01:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:01:15 np0005486759.ooo.test systemd[1]: tmp-crun.nnP5pI.mount: Deactivated successfully.
Oct 14 09:01:15 np0005486759.ooo.test podman[114083]: 2025-10-14 09:01:15.127088806 +0000 UTC m=+0.079222675 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc.)
Oct 14 09:01:15 np0005486759.ooo.test podman[114083]: 2025-10-14 09:01:15.17350387 +0000 UTC m=+0.125637759 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20250721.1, release=1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.9, build-date=2025-07-21T13:28:44)
Oct 14 09:01:15 np0005486759.ooo.test podman[114083]: unhealthy
Oct 14 09:01:15 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:15 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:01:15 np0005486759.ooo.test podman[114081]: 2025-10-14 09:01:15.174391077 +0000 UTC m=+0.128015433 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, vcs-type=git, release=1)
Oct 14 09:01:15 np0005486759.ooo.test podman[114081]: 2025-10-14 09:01:15.257402749 +0000 UTC m=+0.211027115 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, vcs-type=git, architecture=x86_64)
Oct 14 09:01:15 np0005486759.ooo.test podman[114081]: unhealthy
Oct 14 09:01:15 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:15 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:01:15 np0005486759.ooo.test sudo[114080]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:15 np0005486759.ooo.test sudo[114137]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpytv9w9ie/privsep.sock
Oct 14 09:01:15 np0005486759.ooo.test sudo[114137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:16 np0005486759.ooo.test sudo[114137]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:16 np0005486759.ooo.test sudo[114148]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpamsnwmfs/privsep.sock
Oct 14 09:01:16 np0005486759.ooo.test sudo[114148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:17 np0005486759.ooo.test sudo[114148]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:17 np0005486759.ooo.test sudo[114159]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyf2n2i7y/privsep.sock
Oct 14 09:01:17 np0005486759.ooo.test sudo[114159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:18 np0005486759.ooo.test sudo[114159]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:18 np0005486759.ooo.test sudo[114170]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpai_l8jl4/privsep.sock
Oct 14 09:01:18 np0005486759.ooo.test sudo[114170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:19 np0005486759.ooo.test sudo[114170]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:19 np0005486759.ooo.test sudo[114181]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7h3_7xxr/privsep.sock
Oct 14 09:01:19 np0005486759.ooo.test sudo[114181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:19 np0005486759.ooo.test sudo[114181]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:20 np0005486759.ooo.test sudo[114192]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptkwzw7_7/privsep.sock
Oct 14 09:01:20 np0005486759.ooo.test sudo[114192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:20 np0005486759.ooo.test sudo[114192]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:21 np0005486759.ooo.test sudo[114206]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_ch6gdsq/privsep.sock
Oct 14 09:01:21 np0005486759.ooo.test sudo[114206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:21 np0005486759.ooo.test sudo[114206]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:22 np0005486759.ooo.test sudo[114220]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpps4p2r__/privsep.sock
Oct 14 09:01:22 np0005486759.ooo.test sudo[114220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:22 np0005486759.ooo.test sudo[114220]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:22 np0005486759.ooo.test sudo[114231]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkpcxtg60/privsep.sock
Oct 14 09:01:22 np0005486759.ooo.test sudo[114231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:23 np0005486759.ooo.test sudo[114231]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:23 np0005486759.ooo.test sudo[114242]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvmbst68v/privsep.sock
Oct 14 09:01:23 np0005486759.ooo.test sudo[114242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:24 np0005486759.ooo.test sudo[114242]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:24 np0005486759.ooo.test sudo[114253]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2vtmyruu/privsep.sock
Oct 14 09:01:24 np0005486759.ooo.test sudo[114253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:25 np0005486759.ooo.test sudo[114253]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:25 np0005486759.ooo.test sudo[114264]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxvf32ri9/privsep.sock
Oct 14 09:01:25 np0005486759.ooo.test sudo[114264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:25 np0005486759.ooo.test sudo[114264]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:26 np0005486759.ooo.test sudo[114275]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpysoo59ao/privsep.sock
Oct 14 09:01:26 np0005486759.ooo.test sudo[114275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:26 np0005486759.ooo.test sudo[114275]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:27 np0005486759.ooo.test sudo[114292]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqqnk4_03/privsep.sock
Oct 14 09:01:27 np0005486759.ooo.test sudo[114292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:27 np0005486759.ooo.test sudo[114292]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:27 np0005486759.ooo.test sudo[114303]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3u_k3pk9/privsep.sock
Oct 14 09:01:27 np0005486759.ooo.test sudo[114303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:28 np0005486759.ooo.test sudo[114303]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:28 np0005486759.ooo.test sudo[114314]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptlk1t1hf/privsep.sock
Oct 14 09:01:28 np0005486759.ooo.test sudo[114314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:29 np0005486759.ooo.test sudo[114314]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:29 np0005486759.ooo.test sudo[114325]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2mt2ammu/privsep.sock
Oct 14 09:01:29 np0005486759.ooo.test sudo[114325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:30 np0005486759.ooo.test sudo[114325]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:01:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:01:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:01:30 np0005486759.ooo.test systemd[1]: tmp-crun.OzIvqB.mount: Deactivated successfully.
Oct 14 09:01:30 np0005486759.ooo.test podman[114332]: 2025-10-14 09:01:30.318721977 +0000 UTC m=+0.112990705 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:01:30 np0005486759.ooo.test podman[114329]: 2025-10-14 09:01:30.271347653 +0000 UTC m=+0.068413728 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, architecture=x86_64, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 09:01:30 np0005486759.ooo.test podman[114333]: 2025-10-14 09:01:30.295848766 +0000 UTC m=+0.084922932 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, name=rhosp17/openstack-collectd, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, architecture=x86_64, release=2, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:01:30 np0005486759.ooo.test podman[114329]: 2025-10-14 09:01:30.354266003 +0000 UTC m=+0.151332098 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, container_name=iscsid, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, maintainer=OpenStack TripleO Team)
Oct 14 09:01:30 np0005486759.ooo.test podman[114332]: 2025-10-14 09:01:30.362093856 +0000 UTC m=+0.156362554 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute)
Oct 14 09:01:30 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:01:30 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:01:30 np0005486759.ooo.test podman[114333]: 2025-10-14 09:01:30.379434165 +0000 UTC m=+0.168508341 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=2, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, version=17.1.9, container_name=collectd, batch=17.1_20250721.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container)
Oct 14 09:01:30 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:01:30 np0005486759.ooo.test sudo[114397]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1qdu6nsu/privsep.sock
Oct 14 09:01:30 np0005486759.ooo.test sudo[114397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:31 np0005486759.ooo.test sudo[114397]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:31 np0005486759.ooo.test systemd[1]: tmp-crun.2drV1M.mount: Deactivated successfully.
Oct 14 09:01:31 np0005486759.ooo.test sudo[114408]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiqoacz41/privsep.sock
Oct 14 09:01:31 np0005486759.ooo.test sudo[114408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:31 np0005486759.ooo.test sudo[114408]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:32 np0005486759.ooo.test sudo[114425]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpytn564kw/privsep.sock
Oct 14 09:01:32 np0005486759.ooo.test sudo[114425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:32 np0005486759.ooo.test sudo[114425]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:33 np0005486759.ooo.test sudo[114436]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppqjac7b8/privsep.sock
Oct 14 09:01:33 np0005486759.ooo.test sudo[114436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:33 np0005486759.ooo.test sudo[114436]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:33 np0005486759.ooo.test sudo[114447]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_wr_2576/privsep.sock
Oct 14 09:01:33 np0005486759.ooo.test sudo[114447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:34 np0005486759.ooo.test sudo[114447]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:34 np0005486759.ooo.test sudo[114458]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgp2f4hlm/privsep.sock
Oct 14 09:01:34 np0005486759.ooo.test sudo[114458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:01:35 np0005486759.ooo.test systemd[1]: tmp-crun.HQ1WqS.mount: Deactivated successfully.
Oct 14 09:01:35 np0005486759.ooo.test podman[114461]: 2025-10-14 09:01:35.472431963 +0000 UTC m=+0.099903413 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, container_name=metrics_qdr, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, architecture=x86_64, vcs-type=git, config_id=tripleo_step1)
Oct 14 09:01:35 np0005486759.ooo.test sudo[114458]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:35 np0005486759.ooo.test podman[114461]: 2025-10-14 09:01:35.63868448 +0000 UTC m=+0.266155910 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-type=git, tcib_managed=true, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, distribution-scope=public, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 09:01:35 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:01:35 np0005486759.ooo.test sudo[114497]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpobmv5jh9/privsep.sock
Oct 14 09:01:35 np0005486759.ooo.test sudo[114497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:36 np0005486759.ooo.test sudo[114497]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:36 np0005486759.ooo.test sudo[114508]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3pwja0to/privsep.sock
Oct 14 09:01:36 np0005486759.ooo.test sudo[114508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:37 np0005486759.ooo.test sudo[114508]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:37 np0005486759.ooo.test sudo[114525]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqm5on3xv/privsep.sock
Oct 14 09:01:37 np0005486759.ooo.test sudo[114525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:38 np0005486759.ooo.test sudo[114525]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:01:38 np0005486759.ooo.test systemd[1]: tmp-crun.T7EBWn.mount: Deactivated successfully.
Oct 14 09:01:38 np0005486759.ooo.test podman[114529]: 2025-10-14 09:01:38.19505267 +0000 UTC m=+0.094124633 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.33.12, version=17.1.9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=)
Oct 14 09:01:38 np0005486759.ooo.test sudo[114559]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ks5p8r3/privsep.sock
Oct 14 09:01:38 np0005486759.ooo.test sudo[114559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:38 np0005486759.ooo.test podman[114529]: 2025-10-14 09:01:38.601644263 +0000 UTC m=+0.500716206 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, tcib_managed=true, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:01:38 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:01:38 np0005486759.ooo.test sudo[114559]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:39 np0005486759.ooo.test sudo[114571]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdry4yvm1/privsep.sock
Oct 14 09:01:39 np0005486759.ooo.test sudo[114571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:39 np0005486759.ooo.test sudo[114571]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:01:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:01:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:01:39 np0005486759.ooo.test podman[114575]: 2025-10-14 09:01:39.892746695 +0000 UTC m=+0.078575058 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, name=rhosp17/openstack-cron, architecture=x86_64)
Oct 14 09:01:39 np0005486759.ooo.test podman[114575]: 2025-10-14 09:01:39.926836837 +0000 UTC m=+0.112665220 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public)
Oct 14 09:01:39 np0005486759.ooo.test podman[114578]: 2025-10-14 09:01:39.935398733 +0000 UTC m=+0.119279786 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, release=1)
Oct 14 09:01:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:01:39 np0005486759.ooo.test podman[114578]: 2025-10-14 09:01:39.9423736 +0000 UTC m=+0.126254633 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 09:01:39 np0005486759.ooo.test podman[114578]: unhealthy
Oct 14 09:01:39 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:39 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:01:39 np0005486759.ooo.test podman[114579]: 2025-10-14 09:01:39.877263002 +0000 UTC m=+0.062716404 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Oct 14 09:01:40 np0005486759.ooo.test podman[114579]: 2025-10-14 09:01:40.00527814 +0000 UTC m=+0.190731532 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4)
Oct 14 09:01:40 np0005486759.ooo.test podman[114579]: unhealthy
Oct 14 09:01:40 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:40 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:01:40 np0005486759.ooo.test sudo[114639]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp57t87lrd/privsep.sock
Oct 14 09:01:40 np0005486759.ooo.test sudo[114639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:40 np0005486759.ooo.test sudo[114639]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:40 np0005486759.ooo.test sudo[114650]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmirhi2cn/privsep.sock
Oct 14 09:01:40 np0005486759.ooo.test sudo[114650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:41 np0005486759.ooo.test sudo[114650]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:41 np0005486759.ooo.test sudo[114661]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprtq0phin/privsep.sock
Oct 14 09:01:41 np0005486759.ooo.test sudo[114661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:42 np0005486759.ooo.test sudo[114661]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:42 np0005486759.ooo.test sudo[114677]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp57q9_66h/privsep.sock
Oct 14 09:01:42 np0005486759.ooo.test sudo[114677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:43 np0005486759.ooo.test sudo[114677]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:43 np0005486759.ooo.test sudo[114689]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqqt1_zws/privsep.sock
Oct 14 09:01:43 np0005486759.ooo.test sudo[114689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:44 np0005486759.ooo.test sudo[114689]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:44 np0005486759.ooo.test sudo[114700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvshhsymc/privsep.sock
Oct 14 09:01:44 np0005486759.ooo.test sudo[114700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:45 np0005486759.ooo.test sudo[114700]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:45 np0005486759.ooo.test sudo[114711]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyc6zdc2_/privsep.sock
Oct 14 09:01:45 np0005486759.ooo.test sudo[114711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: tmp-crun.bx5je5.mount: Deactivated successfully.
Oct 14 09:01:45 np0005486759.ooo.test podman[114713]: 2025-10-14 09:01:45.364161685 +0000 UTC m=+0.072618763 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, release=1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.9, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:01:45 np0005486759.ooo.test podman[114713]: 2025-10-14 09:01:45.374412923 +0000 UTC m=+0.082870021 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, batch=17.1_20250721.1, vcs-type=git, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:01:45 np0005486759.ooo.test podman[114713]: unhealthy
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: tmp-crun.AQdFML.mount: Deactivated successfully.
Oct 14 09:01:45 np0005486759.ooo.test podman[114714]: 2025-10-14 09:01:45.413428919 +0000 UTC m=+0.116776908 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Oct 14 09:01:45 np0005486759.ooo.test podman[114714]: 2025-10-14 09:01:45.425124354 +0000 UTC m=+0.128472363 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.9, managed_by=tripleo_ansible, release=1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:01:45 np0005486759.ooo.test podman[114714]: unhealthy
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:01:45 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:01:45 np0005486759.ooo.test sudo[114711]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:46 np0005486759.ooo.test sudo[114757]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7nva918i/privsep.sock
Oct 14 09:01:46 np0005486759.ooo.test sudo[114757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:46 np0005486759.ooo.test sudo[114757]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:46 np0005486759.ooo.test sudo[114768]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg16orky_/privsep.sock
Oct 14 09:01:46 np0005486759.ooo.test sudo[114768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:47 np0005486759.ooo.test sudo[114768]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:47 np0005486759.ooo.test sudo[114779]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp30qtqigz/privsep.sock
Oct 14 09:01:47 np0005486759.ooo.test sudo[114779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:48 np0005486759.ooo.test sudo[114779]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:48 np0005486759.ooo.test sudo[114796]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd_2ah_ik/privsep.sock
Oct 14 09:01:48 np0005486759.ooo.test sudo[114796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:49 np0005486759.ooo.test sudo[114796]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:49 np0005486759.ooo.test sudo[114807]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppmvewlw8/privsep.sock
Oct 14 09:01:49 np0005486759.ooo.test sudo[114807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:50 np0005486759.ooo.test sudo[114807]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:50 np0005486759.ooo.test sudo[114818]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7k6qc80a/privsep.sock
Oct 14 09:01:50 np0005486759.ooo.test sudo[114818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:50 np0005486759.ooo.test sudo[114818]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:51 np0005486759.ooo.test sudo[114829]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpga1zv14c/privsep.sock
Oct 14 09:01:51 np0005486759.ooo.test sudo[114829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:51 np0005486759.ooo.test sudo[114829]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:52 np0005486759.ooo.test sudo[114840]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp030xyrxu/privsep.sock
Oct 14 09:01:52 np0005486759.ooo.test sudo[114840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:52 np0005486759.ooo.test sudo[114840]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:52 np0005486759.ooo.test sudo[114851]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp86wkhbw3/privsep.sock
Oct 14 09:01:52 np0005486759.ooo.test sudo[114851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:53 np0005486759.ooo.test sudo[114851]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:53 np0005486759.ooo.test sudo[114868]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjxukhuem/privsep.sock
Oct 14 09:01:53 np0005486759.ooo.test sudo[114868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:54 np0005486759.ooo.test sudo[114868]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:54 np0005486759.ooo.test sudo[114879]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa_lzegei/privsep.sock
Oct 14 09:01:54 np0005486759.ooo.test sudo[114879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:55 np0005486759.ooo.test sudo[114879]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:55 np0005486759.ooo.test sudo[114890]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0r54588i/privsep.sock
Oct 14 09:01:55 np0005486759.ooo.test sudo[114890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:56 np0005486759.ooo.test sudo[114890]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:56 np0005486759.ooo.test sudo[114901]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd3xbxy3w/privsep.sock
Oct 14 09:01:56 np0005486759.ooo.test sudo[114901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:57 np0005486759.ooo.test sudo[114901]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:57 np0005486759.ooo.test sudo[114912]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnq42scdf/privsep.sock
Oct 14 09:01:57 np0005486759.ooo.test sudo[114912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:57 np0005486759.ooo.test sudo[114912]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:58 np0005486759.ooo.test sudo[114923]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvn5211st/privsep.sock
Oct 14 09:01:58 np0005486759.ooo.test sudo[114923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:58 np0005486759.ooo.test sudo[114923]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:59 np0005486759.ooo.test sudo[114940]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp19p4tggv/privsep.sock
Oct 14 09:01:59 np0005486759.ooo.test sudo[114940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:01:59 np0005486759.ooo.test sudo[114940]: pam_unix(sudo:session): session closed for user root
Oct 14 09:01:59 np0005486759.ooo.test sudo[114951]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0w7kv8oo/privsep.sock
Oct 14 09:01:59 np0005486759.ooo.test sudo[114951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:00 np0005486759.ooo.test sudo[114951]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:02:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:02:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:02:00 np0005486759.ooo.test podman[114958]: 2025-10-14 09:02:00.60803727 +0000 UTC m=+0.072112526 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, build-date=2025-07-21T14:48:37)
Oct 14 09:02:00 np0005486759.ooo.test podman[114956]: 2025-10-14 09:02:00.672133607 +0000 UTC m=+0.136294036 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.9, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container)
Oct 14 09:02:00 np0005486759.ooo.test podman[114956]: 2025-10-14 09:02:00.684693368 +0000 UTC m=+0.148853797 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.buildah.version=1.33.12, vcs-type=git, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid)
Oct 14 09:02:00 np0005486759.ooo.test podman[114959]: 2025-10-14 09:02:00.633765782 +0000 UTC m=+0.091159000 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, io.buildah.version=1.33.12, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, tcib_managed=true, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 09:02:00 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:02:00 np0005486759.ooo.test podman[114958]: 2025-10-14 09:02:00.743775458 +0000 UTC m=+0.207850704 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, container_name=nova_compute, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git)
Oct 14 09:02:00 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:02:00 np0005486759.ooo.test podman[114959]: 2025-10-14 09:02:00.768982993 +0000 UTC m=+0.226376191 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, container_name=collectd, name=rhosp17/openstack-collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, io.openshift.expose-services=)
Oct 14 09:02:00 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:02:00 np0005486759.ooo.test sudo[115026]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqkvt6ot8/privsep.sock
Oct 14 09:02:00 np0005486759.ooo.test sudo[115026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:01 np0005486759.ooo.test sudo[115026]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:01 np0005486759.ooo.test sudo[115037]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpazorfuqc/privsep.sock
Oct 14 09:02:01 np0005486759.ooo.test sudo[115037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:02 np0005486759.ooo.test sudo[115037]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:02 np0005486759.ooo.test sudo[115048]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbtyileyg/privsep.sock
Oct 14 09:02:02 np0005486759.ooo.test sudo[115048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:03 np0005486759.ooo.test sudo[115048]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:03 np0005486759.ooo.test sudo[115059]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe97bjks7/privsep.sock
Oct 14 09:02:03 np0005486759.ooo.test sudo[115059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:04 np0005486759.ooo.test sudo[115059]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:04 np0005486759.ooo.test sudo[115076]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplx71_ouh/privsep.sock
Oct 14 09:02:04 np0005486759.ooo.test sudo[115076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:05 np0005486759.ooo.test sudo[115076]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:05 np0005486759.ooo.test sudo[115087]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps0t9lgwo/privsep.sock
Oct 14 09:02:05 np0005486759.ooo.test sudo[115087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:05 np0005486759.ooo.test sudo[115087]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:02:05 np0005486759.ooo.test podman[115092]: 2025-10-14 09:02:05.991014395 +0000 UTC m=+0.067393480 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, architecture=x86_64, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 09:02:06 np0005486759.ooo.test sudo[115127]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0y85uau1/privsep.sock
Oct 14 09:02:06 np0005486759.ooo.test sudo[115127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:06 np0005486759.ooo.test podman[115092]: 2025-10-14 09:02:06.241249209 +0000 UTC m=+0.317628294 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, 
container_name=metrics_qdr, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, config_id=tripleo_step1)
Oct 14 09:02:06 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:02:06 np0005486759.ooo.test sudo[115127]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:06 np0005486759.ooo.test sudo[115138]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzh1ib41h/privsep.sock
Oct 14 09:02:06 np0005486759.ooo.test sudo[115138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:07 np0005486759.ooo.test sudo[115138]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:07 np0005486759.ooo.test sudo[115149]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpasqd2__1/privsep.sock
Oct 14 09:02:07 np0005486759.ooo.test sudo[115149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:08 np0005486759.ooo.test sudo[115149]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:08 np0005486759.ooo.test sudo[115160]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl20yspd3/privsep.sock
Oct 14 09:02:08 np0005486759.ooo.test sudo[115160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:02:08 np0005486759.ooo.test podman[115162]: 2025-10-14 09:02:08.821041067 +0000 UTC m=+0.092379658 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, 
vcs-type=git, io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 09:02:09 np0005486759.ooo.test podman[115162]: 2025-10-14 09:02:09.192733744 +0000 UTC m=+0.464072285 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Oct 14 09:02:09 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:02:09 np0005486759.ooo.test sudo[115160]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:09 np0005486759.ooo.test sudo[115199]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuua9ynou/privsep.sock
Oct 14 09:02:09 np0005486759.ooo.test sudo[115199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:10 np0005486759.ooo.test sudo[115199]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: tmp-crun.eMFAKe.mount: Deactivated successfully.
Oct 14 09:02:10 np0005486759.ooo.test podman[115205]: 2025-10-14 09:02:10.271441221 +0000 UTC m=+0.076556565 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, tcib_managed=true, container_name=logrotate_crond, release=1, name=rhosp17/openstack-cron, distribution-scope=public, managed_by=tripleo_ansible)
Oct 14 09:02:10 np0005486759.ooo.test podman[115207]: 2025-10-14 09:02:10.329251601 +0000 UTC m=+0.128327077 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, managed_by=tripleo_ansible)
Oct 14 09:02:10 np0005486759.ooo.test podman[115207]: 2025-10-14 09:02:10.34012142 +0000 UTC m=+0.139196906 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 09:02:10 np0005486759.ooo.test podman[115207]: unhealthy
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:02:10 np0005486759.ooo.test podman[115205]: 2025-10-14 09:02:10.352624709 +0000 UTC m=+0.157740013 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, batch=17.1_20250721.1, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, tcib_managed=true, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, config_id=tripleo_step4, release=1)
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:02:10 np0005486759.ooo.test podman[115208]: 2025-10-14 09:02:10.307320429 +0000 UTC m=+0.102141003 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, tcib_managed=true, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47)
Oct 14 09:02:10 np0005486759.ooo.test podman[115208]: 2025-10-14 09:02:10.440489616 +0000 UTC m=+0.235310190 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, release=1, tcib_managed=true, 
batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 09:02:10 np0005486759.ooo.test podman[115208]: unhealthy
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:10 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:02:10 np0005486759.ooo.test sudo[115267]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpssy461ob/privsep.sock
Oct 14 09:02:10 np0005486759.ooo.test sudo[115267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:11 np0005486759.ooo.test sudo[115267]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:11 np0005486759.ooo.test sudo[115278]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr0ewihoj/privsep.sock
Oct 14 09:02:11 np0005486759.ooo.test sudo[115278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:11 np0005486759.ooo.test sudo[115278]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:12 np0005486759.ooo.test sudo[115289]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3vz8_vuy/privsep.sock
Oct 14 09:02:12 np0005486759.ooo.test sudo[115289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:12 np0005486759.ooo.test sudo[115289]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:13 np0005486759.ooo.test sudo[115300]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpecw5tak4/privsep.sock
Oct 14 09:02:13 np0005486759.ooo.test sudo[115300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:13 np0005486759.ooo.test sudo[115300]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:13 np0005486759.ooo.test sudo[115311]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4zqcben3/privsep.sock
Oct 14 09:02:13 np0005486759.ooo.test sudo[115311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:14 np0005486759.ooo.test sudo[115311]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:14 np0005486759.ooo.test sudo[115324]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk_m1y5ou/privsep.sock
Oct 14 09:02:14 np0005486759.ooo.test sudo[115324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:15 np0005486759.ooo.test sudo[115324]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:15 np0005486759.ooo.test sudo[115339]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt0x5vbxa/privsep.sock
Oct 14 09:02:15 np0005486759.ooo.test sudo[115339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:02:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:02:15 np0005486759.ooo.test podman[115341]: 2025-10-14 09:02:15.662278261 +0000 UTC m=+0.041995389 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=1, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4)
Oct 14 09:02:15 np0005486759.ooo.test podman[115342]: 2025-10-14 09:02:15.68986203 +0000 UTC m=+0.063660354 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.9, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:02:15 np0005486759.ooo.test podman[115341]: 2025-10-14 09:02:15.725323335 +0000 UTC m=+0.105040453 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent)
Oct 14 09:02:15 np0005486759.ooo.test podman[115341]: unhealthy
Oct 14 09:02:15 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:15 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:02:15 np0005486759.ooo.test podman[115342]: 2025-10-14 09:02:15.778072218 +0000 UTC m=+0.151870602 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1, io.openshift.expose-services=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 09:02:15 np0005486759.ooo.test podman[115342]: unhealthy
Oct 14 09:02:15 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:15 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:02:16 np0005486759.ooo.test sudo[115339]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:16 np0005486759.ooo.test sudo[115389]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo5lrlhl5/privsep.sock
Oct 14 09:02:16 np0005486759.ooo.test sudo[115389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:17 np0005486759.ooo.test sudo[115389]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:17 np0005486759.ooo.test sudo[115400]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2_br_ixn/privsep.sock
Oct 14 09:02:17 np0005486759.ooo.test sudo[115400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:17 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:02:17 np0005486759.ooo.test recover_tripleo_nova_virtqemud[115403]: 47951
Oct 14 09:02:17 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:02:17 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:02:17 np0005486759.ooo.test sudo[115400]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:18 np0005486759.ooo.test sudo[115413]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgotcvhys/privsep.sock
Oct 14 09:02:18 np0005486759.ooo.test sudo[115413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:18 np0005486759.ooo.test sudo[115413]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:19 np0005486759.ooo.test sudo[115424]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_33s3f6v/privsep.sock
Oct 14 09:02:19 np0005486759.ooo.test sudo[115424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:19 np0005486759.ooo.test sudo[115424]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:19 np0005486759.ooo.test sudo[115435]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp48afptiq/privsep.sock
Oct 14 09:02:19 np0005486759.ooo.test sudo[115435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:20 np0005486759.ooo.test sudo[115435]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:20 np0005486759.ooo.test sudo[115452]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp375qfuog/privsep.sock
Oct 14 09:02:20 np0005486759.ooo.test sudo[115452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:21 np0005486759.ooo.test sudo[115452]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:21 np0005486759.ooo.test sudo[115463]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp46cv9qw9/privsep.sock
Oct 14 09:02:21 np0005486759.ooo.test sudo[115463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:22 np0005486759.ooo.test sudo[115463]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:22 np0005486759.ooo.test sudo[115474]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpso3e3xtd/privsep.sock
Oct 14 09:02:22 np0005486759.ooo.test sudo[115474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:23 np0005486759.ooo.test sudo[115474]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:23 np0005486759.ooo.test sudo[115485]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj41foiiz/privsep.sock
Oct 14 09:02:23 np0005486759.ooo.test sudo[115485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:23 np0005486759.ooo.test sudo[115485]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:24 np0005486759.ooo.test sudo[115496]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp0gr3gcl/privsep.sock
Oct 14 09:02:24 np0005486759.ooo.test sudo[115496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:24 np0005486759.ooo.test sudo[115496]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:24 np0005486759.ooo.test sudo[115507]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqoe2ncbb/privsep.sock
Oct 14 09:02:24 np0005486759.ooo.test sudo[115507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:25 np0005486759.ooo.test sudo[115507]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:25 np0005486759.ooo.test sudo[115523]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5o2hmbqg/privsep.sock
Oct 14 09:02:25 np0005486759.ooo.test sudo[115523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:26 np0005486759.ooo.test sudo[115523]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:26 np0005486759.ooo.test sudo[115535]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpze6yqptg/privsep.sock
Oct 14 09:02:26 np0005486759.ooo.test sudo[115535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:27 np0005486759.ooo.test sudo[115535]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:27 np0005486759.ooo.test sudo[115546]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0t2466cx/privsep.sock
Oct 14 09:02:27 np0005486759.ooo.test sudo[115546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:28 np0005486759.ooo.test sudo[115546]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:28 np0005486759.ooo.test sudo[115557]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0zdz90kh/privsep.sock
Oct 14 09:02:28 np0005486759.ooo.test sudo[115557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:29 np0005486759.ooo.test sudo[115557]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:29 np0005486759.ooo.test sudo[115568]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsap6_vgk/privsep.sock
Oct 14 09:02:29 np0005486759.ooo.test sudo[115568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:29 np0005486759.ooo.test sudo[115568]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:30 np0005486759.ooo.test sudo[115579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptelu8gcd/privsep.sock
Oct 14 09:02:30 np0005486759.ooo.test sudo[115579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:30 np0005486759.ooo.test sudo[115579]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:02:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:02:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:02:30 np0005486759.ooo.test podman[115584]: 2025-10-14 09:02:30.899067997 +0000 UTC m=+0.082511330 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.buildah.version=1.33.12)
Oct 14 09:02:30 np0005486759.ooo.test podman[115584]: 2025-10-14 09:02:30.910375779 +0000 UTC m=+0.093819122 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, version=17.1.9, release=1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public)
Oct 14 09:02:30 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:02:31 np0005486759.ooo.test podman[115587]: 2025-10-14 09:02:31.006556085 +0000 UTC m=+0.183420144 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, batch=17.1_20250721.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, release=2, name=rhosp17/openstack-collectd, container_name=collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:02:31 np0005486759.ooo.test podman[115587]: 2025-10-14 09:02:31.014417589 +0000 UTC m=+0.191281638 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-07-21T13:04:03, version=17.1.9, batch=17.1_20250721.1, release=2, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3)
Oct 14 09:02:31 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:02:31 np0005486759.ooo.test podman[115586]: 2025-10-14 09:02:31.105782845 +0000 UTC m=+0.283679596 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.expose-services=, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, version=17.1.9, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, vcs-type=git, name=rhosp17/openstack-nova-compute)
Oct 14 09:02:31 np0005486759.ooo.test sudo[115641]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp00c69x63/privsep.sock
Oct 14 09:02:31 np0005486759.ooo.test sudo[115641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:31 np0005486759.ooo.test podman[115586]: 2025-10-14 09:02:31.136206213 +0000 UTC m=+0.314102914 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, release=1, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, maintainer=OpenStack TripleO Team)
Oct 14 09:02:31 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:02:31 np0005486759.ooo.test sudo[115641]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:31 np0005486759.ooo.test sudo[115670]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpic6kxkab/privsep.sock
Oct 14 09:02:31 np0005486759.ooo.test sudo[115670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:32 np0005486759.ooo.test sudo[115670]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:32 np0005486759.ooo.test sudo[115681]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzw7xhx01/privsep.sock
Oct 14 09:02:32 np0005486759.ooo.test sudo[115681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:33 np0005486759.ooo.test sudo[115681]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:33 np0005486759.ooo.test sudo[115692]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphyh6ydl_/privsep.sock
Oct 14 09:02:33 np0005486759.ooo.test sudo[115692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:34 np0005486759.ooo.test sudo[115692]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:34 np0005486759.ooo.test sudo[115703]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc552j521/privsep.sock
Oct 14 09:02:34 np0005486759.ooo.test sudo[115703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:35 np0005486759.ooo.test sudo[115703]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:35 np0005486759.ooo.test sudo[115714]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjwrwl8_b/privsep.sock
Oct 14 09:02:35 np0005486759.ooo.test sudo[115714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:35 np0005486759.ooo.test sudo[115714]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:36 np0005486759.ooo.test sudo[115725]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfsheqww5/privsep.sock
Oct 14 09:02:36 np0005486759.ooo.test sudo[115725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:02:36 np0005486759.ooo.test systemd[1]: tmp-crun.ikv6HY.mount: Deactivated successfully.
Oct 14 09:02:36 np0005486759.ooo.test podman[115728]: 2025-10-14 09:02:36.512535051 +0000 UTC m=+0.134314565 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 09:02:36 np0005486759.ooo.test podman[115728]: 2025-10-14 09:02:36.715205353 +0000 UTC m=+0.336984827 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, container_name=metrics_qdr, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, architecture=x86_64, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Oct 14 09:02:36 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:02:36 np0005486759.ooo.test sudo[115725]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:37 np0005486759.ooo.test sudo[115771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpscppvz63/privsep.sock
Oct 14 09:02:37 np0005486759.ooo.test sudo[115771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:37 np0005486759.ooo.test sudo[115771]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:37 np0005486759.ooo.test sudo[115782]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6gzmqnhf/privsep.sock
Oct 14 09:02:37 np0005486759.ooo.test sudo[115782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:38 np0005486759.ooo.test sudo[115782]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:38 np0005486759.ooo.test sudo[115793]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzt3c3fqq/privsep.sock
Oct 14 09:02:38 np0005486759.ooo.test sudo[115793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:02:39 np0005486759.ooo.test systemd[1]: tmp-crun.cNTaHp.mount: Deactivated successfully.
Oct 14 09:02:39 np0005486759.ooo.test podman[115796]: 2025-10-14 09:02:39.448940876 +0000 UTC m=+0.086496895 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team)
Oct 14 09:02:39 np0005486759.ooo.test sudo[115793]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:39 np0005486759.ooo.test sudo[115827]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_kjloi8y/privsep.sock
Oct 14 09:02:39 np0005486759.ooo.test sudo[115827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:39 np0005486759.ooo.test podman[115796]: 2025-10-14 09:02:39.847431547 +0000 UTC m=+0.484987516 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_id=tripleo_step4)
Oct 14 09:02:39 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:02:40 np0005486759.ooo.test sudo[115827]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:40 np0005486759.ooo.test sudo[115838]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkqjwl178/privsep.sock
Oct 14 09:02:40 np0005486759.ooo.test sudo[115838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: tmp-crun.o3zCBl.mount: Deactivated successfully.
Oct 14 09:02:40 np0005486759.ooo.test podman[115842]: 2025-10-14 09:02:40.659705575 +0000 UTC m=+0.083098688 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, 
io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1)
Oct 14 09:02:40 np0005486759.ooo.test podman[115842]: 2025-10-14 09:02:40.701341682 +0000 UTC m=+0.124734795 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.33.12, 
managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:02:40 np0005486759.ooo.test podman[115842]: unhealthy
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:02:40 np0005486759.ooo.test podman[115840]: 2025-10-14 09:02:40.702772957 +0000 UTC m=+0.132611801 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1)
Oct 14 09:02:40 np0005486759.ooo.test podman[115840]: 2025-10-14 09:02:40.783721539 +0000 UTC m=+0.213560353 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container)
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:02:40 np0005486759.ooo.test podman[115841]: 2025-10-14 09:02:40.756523981 +0000 UTC m=+0.182650059 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9)
Oct 14 09:02:40 np0005486759.ooo.test podman[115841]: 2025-10-14 09:02:40.839287269 +0000 UTC m=+0.265413367 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20250721.1, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute)
Oct 14 09:02:40 np0005486759.ooo.test podman[115841]: unhealthy
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:40 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:02:41 np0005486759.ooo.test sudo[115838]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:41 np0005486759.ooo.test sudo[115903]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp41g0qvbe/privsep.sock
Oct 14 09:02:41 np0005486759.ooo.test sudo[115903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:41 np0005486759.ooo.test sudo[115903]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:42 np0005486759.ooo.test sudo[115920]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeedij61c/privsep.sock
Oct 14 09:02:42 np0005486759.ooo.test sudo[115920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:42 np0005486759.ooo.test sudo[115920]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:43 np0005486759.ooo.test sudo[115931]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6mg8wrfs/privsep.sock
Oct 14 09:02:43 np0005486759.ooo.test sudo[115931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:43 np0005486759.ooo.test sudo[115931]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:44 np0005486759.ooo.test sudo[115942]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn7ypl7qh/privsep.sock
Oct 14 09:02:44 np0005486759.ooo.test sudo[115942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:44 np0005486759.ooo.test sudo[115942]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:44 np0005486759.ooo.test sudo[115953]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkssjijuq/privsep.sock
Oct 14 09:02:44 np0005486759.ooo.test sudo[115953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:45 np0005486759.ooo.test sudo[115953]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:45 np0005486759.ooo.test sudo[115964]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsoazpi4v/privsep.sock
Oct 14 09:02:45 np0005486759.ooo.test sudo[115964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:02:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:02:45 np0005486759.ooo.test podman[115965]: 2025-10-14 09:02:45.892527484 +0000 UTC m=+0.063455167 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 09:02:45 np0005486759.ooo.test podman[115965]: 2025-10-14 09:02:45.903086913 +0000 UTC m=+0.074014596 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, tcib_managed=true, container_name=ovn_metadata_agent, version=17.1.9)
Oct 14 09:02:45 np0005486759.ooo.test podman[115965]: unhealthy
Oct 14 09:02:45 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:45 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:02:45 np0005486759.ooo.test podman[115967]: 2025-10-14 09:02:45.940949943 +0000 UTC m=+0.109284425 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, release=1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Oct 14 09:02:45 np0005486759.ooo.test podman[115967]: 2025-10-14 09:02:45.949323893 +0000 UTC m=+0.117644785 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, container_name=ovn_controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1)
Oct 14 09:02:45 np0005486759.ooo.test podman[115967]: unhealthy
Oct 14 09:02:45 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:02:45 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:02:46 np0005486759.ooo.test sudo[115964]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:46 np0005486759.ooo.test sudo[116013]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl11ri36l/privsep.sock
Oct 14 09:02:46 np0005486759.ooo.test sudo[116013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:47 np0005486759.ooo.test sudo[116013]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:47 np0005486759.ooo.test sudo[116030]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpekywxtzb/privsep.sock
Oct 14 09:02:47 np0005486759.ooo.test sudo[116030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:48 np0005486759.ooo.test sudo[116030]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:48 np0005486759.ooo.test sudo[116041]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplxxy88rb/privsep.sock
Oct 14 09:02:48 np0005486759.ooo.test sudo[116041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:48 np0005486759.ooo.test sudo[116041]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:49 np0005486759.ooo.test sudo[116052]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq5a903nc/privsep.sock
Oct 14 09:02:49 np0005486759.ooo.test sudo[116052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:49 np0005486759.ooo.test sudo[116052]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:50 np0005486759.ooo.test sudo[116063]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9j2pk9z6/privsep.sock
Oct 14 09:02:50 np0005486759.ooo.test sudo[116063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:50 np0005486759.ooo.test sudo[116063]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:50 np0005486759.ooo.test sudo[116074]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdw9pgzfi/privsep.sock
Oct 14 09:02:50 np0005486759.ooo.test sudo[116074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:51 np0005486759.ooo.test sudo[116074]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:51 np0005486759.ooo.test sudo[116085]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzf_5knbf/privsep.sock
Oct 14 09:02:51 np0005486759.ooo.test sudo[116085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:52 np0005486759.ooo.test sudo[116085]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:52 np0005486759.ooo.test sudo[116099]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaol5td7f/privsep.sock
Oct 14 09:02:52 np0005486759.ooo.test sudo[116099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:53 np0005486759.ooo.test sudo[116099]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:53 np0005486759.ooo.test sudo[116113]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzruykoxf/privsep.sock
Oct 14 09:02:53 np0005486759.ooo.test sudo[116113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:54 np0005486759.ooo.test sudo[116113]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:54 np0005486759.ooo.test sudo[116124]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0dqtzf1y/privsep.sock
Oct 14 09:02:54 np0005486759.ooo.test sudo[116124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:54 np0005486759.ooo.test sudo[116124]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:55 np0005486759.ooo.test sudo[116135]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppkjm6kor/privsep.sock
Oct 14 09:02:55 np0005486759.ooo.test sudo[116135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:55 np0005486759.ooo.test sudo[116135]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:56 np0005486759.ooo.test sudo[116146]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0mcbtxt2/privsep.sock
Oct 14 09:02:56 np0005486759.ooo.test sudo[116146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:56 np0005486759.ooo.test sudo[116146]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:56 np0005486759.ooo.test sudo[116157]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0lqe9359/privsep.sock
Oct 14 09:02:56 np0005486759.ooo.test sudo[116157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:57 np0005486759.ooo.test sudo[116157]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:57 np0005486759.ooo.test sudo[116168]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmped6v1tru/privsep.sock
Oct 14 09:02:57 np0005486759.ooo.test sudo[116168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:58 np0005486759.ooo.test sudo[116168]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:58 np0005486759.ooo.test sudo[116185]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp34zcdeei/privsep.sock
Oct 14 09:02:58 np0005486759.ooo.test sudo[116185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:02:59 np0005486759.ooo.test sudo[116185]: pam_unix(sudo:session): session closed for user root
Oct 14 09:02:59 np0005486759.ooo.test sudo[116196]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5d5fg_1y/privsep.sock
Oct 14 09:02:59 np0005486759.ooo.test sudo[116196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:00 np0005486759.ooo.test sudo[116196]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:00 np0005486759.ooo.test sudo[116207]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcqhdp_gz/privsep.sock
Oct 14 09:03:00 np0005486759.ooo.test sudo[116207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:01 np0005486759.ooo.test sudo[116207]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:03:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:03:01 np0005486759.ooo.test podman[116214]: 2025-10-14 09:03:01.13571546 +0000 UTC m=+0.072294003 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=2, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, architecture=x86_64, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, io.openshift.expose-services=)
Oct 14 09:03:01 np0005486759.ooo.test podman[116214]: 2025-10-14 09:03:01.146792585 +0000 UTC m=+0.083371078 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.9, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=2, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, build-date=2025-07-21T13:04:03)
Oct 14 09:03:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:03:01 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:03:01 np0005486759.ooo.test systemd[1]: tmp-crun.6WjSAy.mount: Deactivated successfully.
Oct 14 09:03:01 np0005486759.ooo.test podman[116211]: 2025-10-14 09:03:01.198425193 +0000 UTC m=+0.136106101 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, com.redhat.component=openstack-iscsid-container, release=1, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., container_name=iscsid, managed_by=tripleo_ansible)
Oct 14 09:03:01 np0005486759.ooo.test podman[116211]: 2025-10-14 09:03:01.206426612 +0000 UTC m=+0.144107510 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, distribution-scope=public, config_id=tripleo_step3)
Oct 14 09:03:01 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:03:01 np0005486759.ooo.test podman[116245]: 2025-10-14 09:03:01.25932178 +0000 UTC m=+0.072003444 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, version=17.1.9)
Oct 14 09:03:01 np0005486759.ooo.test podman[116245]: 2025-10-14 09:03:01.314332222 +0000 UTC m=+0.127013866 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc.)
Oct 14 09:03:01 np0005486759.ooo.test sudo[116282]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6fjwd8ij/privsep.sock
Oct 14 09:03:01 np0005486759.ooo.test sudo[116282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:01 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:03:01 np0005486759.ooo.test sudo[116282]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:02 np0005486759.ooo.test sudo[116293]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp748ynjkx/privsep.sock
Oct 14 09:03:02 np0005486759.ooo.test sudo[116293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:02 np0005486759.ooo.test sudo[116293]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:03 np0005486759.ooo.test sudo[116304]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5qyr8ips/privsep.sock
Oct 14 09:03:03 np0005486759.ooo.test sudo[116304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:03 np0005486759.ooo.test sudo[116304]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:03 np0005486759.ooo.test sudo[116321]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvp7fs32n/privsep.sock
Oct 14 09:03:03 np0005486759.ooo.test sudo[116321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:04 np0005486759.ooo.test sudo[116321]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:04 np0005486759.ooo.test sudo[116332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1vyeh3z6/privsep.sock
Oct 14 09:03:04 np0005486759.ooo.test sudo[116332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:05 np0005486759.ooo.test sudo[116332]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:05 np0005486759.ooo.test sudo[116343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnx0mzzba/privsep.sock
Oct 14 09:03:05 np0005486759.ooo.test sudo[116343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:06 np0005486759.ooo.test sudo[116343]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:06 np0005486759.ooo.test sudo[116354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_xzkgsm2/privsep.sock
Oct 14 09:03:06 np0005486759.ooo.test sudo[116354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:07 np0005486759.ooo.test sudo[116354]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:03:07 np0005486759.ooo.test podman[116360]: 2025-10-14 09:03:07.245374888 +0000 UTC m=+0.081986575 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, version=17.1.9, build-date=2025-07-21T13:07:59, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Oct 14 09:03:07 np0005486759.ooo.test sudo[116392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6crt4zjn/privsep.sock
Oct 14 09:03:07 np0005486759.ooo.test sudo[116392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:07 np0005486759.ooo.test podman[116360]: 2025-10-14 09:03:07.46637031 +0000 UTC m=+0.302981987 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, version=17.1.9, architecture=x86_64, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, config_id=tripleo_step1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:03:07 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:03:07 np0005486759.ooo.test sudo[116392]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:08 np0005486759.ooo.test sudo[116404]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsk3eypvl/privsep.sock
Oct 14 09:03:08 np0005486759.ooo.test sudo[116404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:08 np0005486759.ooo.test sudo[116404]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:09 np0005486759.ooo.test sudo[116421]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg5fzneb2/privsep.sock
Oct 14 09:03:09 np0005486759.ooo.test sudo[116421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:09 np0005486759.ooo.test sudo[116421]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:10 np0005486759.ooo.test sudo[116432]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsdvmp4ol/privsep.sock
Oct 14 09:03:10 np0005486759.ooo.test sudo[116432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:03:10 np0005486759.ooo.test systemd[1]: tmp-crun.9m4ANp.mount: Deactivated successfully.
Oct 14 09:03:10 np0005486759.ooo.test podman[116434]: 2025-10-14 09:03:10.1717453 +0000 UTC m=+0.086368711 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc.)
Oct 14 09:03:10 np0005486759.ooo.test podman[116434]: 2025-10-14 09:03:10.483238752 +0000 UTC m=+0.397862153 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, release=1, architecture=x86_64, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 09:03:10 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:03:10 np0005486759.ooo.test sudo[116432]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:10 np0005486759.ooo.test sudo[116467]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0xo1g_74/privsep.sock
Oct 14 09:03:10 np0005486759.ooo.test sudo[116467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:03:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:03:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:03:10 np0005486759.ooo.test podman[116470]: 2025-10-14 09:03:10.990626434 +0000 UTC m=+0.062738504 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, tcib_managed=true, 
vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 09:03:11 np0005486759.ooo.test podman[116471]: 2025-10-14 09:03:11.007702457 +0000 UTC m=+0.073983926 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, tcib_managed=true, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.9, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4)
Oct 14 09:03:11 np0005486759.ooo.test podman[116471]: 2025-10-14 09:03:11.025378747 +0000 UTC m=+0.091660286 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, version=17.1.9, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:03:11 np0005486759.ooo.test podman[116471]: unhealthy
Oct 14 09:03:11 np0005486759.ooo.test podman[116470]: 2025-10-14 09:03:11.035466091 +0000 UTC m=+0.107577741 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, version=17.1.9, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 09:03:11 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:11 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:03:11 np0005486759.ooo.test podman[116470]: unhealthy
Oct 14 09:03:11 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:11 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:03:11 np0005486759.ooo.test podman[116469]: 2025-10-14 09:03:11.115481113 +0000 UTC m=+0.187138599 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, distribution-scope=public, release=1, vcs-type=git, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:03:11 np0005486759.ooo.test podman[116469]: 2025-10-14 09:03:11.15100954 +0000 UTC m=+0.222667016 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, managed_by=tripleo_ansible, container_name=logrotate_crond, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, batch=17.1_20250721.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:52)
Oct 14 09:03:11 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:03:11 np0005486759.ooo.test sudo[116467]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:11 np0005486759.ooo.test sudo[116537]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcnnr71ai/privsep.sock
Oct 14 09:03:11 np0005486759.ooo.test sudo[116537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:12 np0005486759.ooo.test sudo[116537]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:12 np0005486759.ooo.test sudo[116548]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7v9kkrr9/privsep.sock
Oct 14 09:03:12 np0005486759.ooo.test sudo[116548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:13 np0005486759.ooo.test sudo[116548]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:13 np0005486759.ooo.test sudo[116559]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpguf6jcx6/privsep.sock
Oct 14 09:03:13 np0005486759.ooo.test sudo[116559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:14 np0005486759.ooo.test sudo[116559]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:14 np0005486759.ooo.test sudo[116575]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo36ws7j5/privsep.sock
Oct 14 09:03:14 np0005486759.ooo.test sudo[116575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:14 np0005486759.ooo.test sudo[116575]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:15 np0005486759.ooo.test sudo[116587]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8t2y5tnl/privsep.sock
Oct 14 09:03:15 np0005486759.ooo.test sudo[116587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:15 np0005486759.ooo.test sudo[116587]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:16 np0005486759.ooo.test sudo[116598]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9ceo_wfh/privsep.sock
Oct 14 09:03:16 np0005486759.ooo.test sudo[116598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:03:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:03:16 np0005486759.ooo.test podman[116601]: 2025-10-14 09:03:16.109096402 +0000 UTC m=+0.059844316 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git)
Oct 14 09:03:16 np0005486759.ooo.test podman[116601]: 2025-10-14 09:03:16.119301769 +0000 UTC m=+0.070049613 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, release=1, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44)
Oct 14 09:03:16 np0005486759.ooo.test podman[116601]: unhealthy
Oct 14 09:03:16 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:16 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:03:16 np0005486759.ooo.test systemd[1]: tmp-crun.P0tBdv.mount: Deactivated successfully.
Oct 14 09:03:16 np0005486759.ooo.test podman[116600]: 2025-10-14 09:03:16.183789388 +0000 UTC m=+0.134322235 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-07-21T16:28:53, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 09:03:16 np0005486759.ooo.test podman[116600]: 2025-10-14 09:03:16.221852553 +0000 UTC m=+0.172385400 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1, batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 09:03:16 np0005486759.ooo.test podman[116600]: unhealthy
Oct 14 09:03:16 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:16 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:03:16 np0005486759.ooo.test sudo[116598]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:16 np0005486759.ooo.test sudo[116649]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3mhuy96v/privsep.sock
Oct 14 09:03:16 np0005486759.ooo.test sudo[116649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:17 np0005486759.ooo.test sudo[116649]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:17 np0005486759.ooo.test sudo[116660]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcfxjfuxs/privsep.sock
Oct 14 09:03:17 np0005486759.ooo.test sudo[116660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:18 np0005486759.ooo.test sudo[116660]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:18 np0005486759.ooo.test sudo[116671]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgc9zay4w/privsep.sock
Oct 14 09:03:18 np0005486759.ooo.test sudo[116671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:19 np0005486759.ooo.test sudo[116671]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:19 np0005486759.ooo.test sudo[116684]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppzhat0hk/privsep.sock
Oct 14 09:03:19 np0005486759.ooo.test sudo[116684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:20 np0005486759.ooo.test sudo[116684]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:20 np0005486759.ooo.test sudo[116699]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpih596smo/privsep.sock
Oct 14 09:03:20 np0005486759.ooo.test sudo[116699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:21 np0005486759.ooo.test sudo[116699]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:21 np0005486759.ooo.test sudo[116710]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphj5bikwl/privsep.sock
Oct 14 09:03:21 np0005486759.ooo.test sudo[116710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:21 np0005486759.ooo.test sudo[116710]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:22 np0005486759.ooo.test sudo[116721]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqi606t3t/privsep.sock
Oct 14 09:03:22 np0005486759.ooo.test sudo[116721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:22 np0005486759.ooo.test sudo[116721]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:22 np0005486759.ooo.test sudo[116732]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1c90pwhf/privsep.sock
Oct 14 09:03:22 np0005486759.ooo.test sudo[116732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:23 np0005486759.ooo.test sudo[116732]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:23 np0005486759.ooo.test sudo[116743]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppdq9l4tk/privsep.sock
Oct 14 09:03:23 np0005486759.ooo.test sudo[116743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:24 np0005486759.ooo.test sudo[116743]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:24 np0005486759.ooo.test sudo[116754]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptkw5b8u1/privsep.sock
Oct 14 09:03:24 np0005486759.ooo.test sudo[116754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:25 np0005486759.ooo.test sudo[116754]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:25 np0005486759.ooo.test sudo[116771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnaiy3l6e/privsep.sock
Oct 14 09:03:25 np0005486759.ooo.test sudo[116771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:26 np0005486759.ooo.test sudo[116771]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:26 np0005486759.ooo.test sudo[116782]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1r349q46/privsep.sock
Oct 14 09:03:26 np0005486759.ooo.test sudo[116782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:27 np0005486759.ooo.test sudo[116782]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:27 np0005486759.ooo.test sudo[116793]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl7cf7csq/privsep.sock
Oct 14 09:03:27 np0005486759.ooo.test sudo[116793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:27 np0005486759.ooo.test sudo[116793]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:28 np0005486759.ooo.test sudo[116804]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdh6jw6p6/privsep.sock
Oct 14 09:03:28 np0005486759.ooo.test sudo[116804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:28 np0005486759.ooo.test sudo[116804]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:28 np0005486759.ooo.test sudo[116815]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzgm9tws7/privsep.sock
Oct 14 09:03:28 np0005486759.ooo.test sudo[116815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:29 np0005486759.ooo.test sudo[116815]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:29 np0005486759.ooo.test sudo[116826]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpll9jjc9c/privsep.sock
Oct 14 09:03:29 np0005486759.ooo.test sudo[116826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:30 np0005486759.ooo.test sudo[116826]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:30 np0005486759.ooo.test sudo[116843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvjlejgti/privsep.sock
Oct 14 09:03:30 np0005486759.ooo.test sudo[116843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:31 np0005486759.ooo.test sudo[116843]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:03:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:03:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:03:31 np0005486759.ooo.test podman[116850]: 2025-10-14 09:03:31.387927917 +0000 UTC m=+0.080230020 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-type=git, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Oct 14 09:03:31 np0005486759.ooo.test podman[116850]: 2025-10-14 09:03:31.422039159 +0000 UTC m=+0.114341272 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, release=2, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, 
com.redhat.component=openstack-collectd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public)
Oct 14 09:03:31 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:03:31 np0005486759.ooo.test systemd[1]: tmp-crun.6WW3ys.mount: Deactivated successfully.
Oct 14 09:03:31 np0005486759.ooo.test podman[116870]: 2025-10-14 09:03:31.476632399 +0000 UTC m=+0.103164534 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, build-date=2025-07-21T14:48:37, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step5, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Oct 14 09:03:31 np0005486759.ooo.test podman[116870]: 2025-10-14 09:03:31.503367422 +0000 UTC m=+0.129899577 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, name=rhosp17/openstack-nova-compute, version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, release=1, config_id=tripleo_step5, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:03:31 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:03:31 np0005486759.ooo.test podman[116848]: 2025-10-14 09:03:31.431497874 +0000 UTC m=+0.127345777 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Oct 14 09:03:31 np0005486759.ooo.test sudo[116913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp61mtg2px/privsep.sock
Oct 14 09:03:31 np0005486759.ooo.test sudo[116913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:31 np0005486759.ooo.test podman[116848]: 2025-10-14 09:03:31.561564935 +0000 UTC m=+0.257412878 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, architecture=x86_64, release=1, version=17.1.9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, vcs-type=git)
Oct 14 09:03:31 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:03:32 np0005486759.ooo.test sudo[116913]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:32 np0005486759.ooo.test sudo[116924]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjlvgr0x9/privsep.sock
Oct 14 09:03:32 np0005486759.ooo.test sudo[116924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:33 np0005486759.ooo.test sudo[116924]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:33 np0005486759.ooo.test sudo[116935]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt9qqm150/privsep.sock
Oct 14 09:03:33 np0005486759.ooo.test sudo[116935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:33 np0005486759.ooo.test sudo[116935]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:34 np0005486759.ooo.test sudo[116946]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp42v06960/privsep.sock
Oct 14 09:03:34 np0005486759.ooo.test sudo[116946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:34 np0005486759.ooo.test sudo[116946]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:34 np0005486759.ooo.test sudo[116957]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr798q5pc/privsep.sock
Oct 14 09:03:34 np0005486759.ooo.test sudo[116957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:35 np0005486759.ooo.test sudo[116957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:35 np0005486759.ooo.test sudo[116970]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl_iy6kwx/privsep.sock
Oct 14 09:03:35 np0005486759.ooo.test sudo[116970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:36 np0005486759.ooo.test sudo[116970]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:36 np0005486759.ooo.test sudo[116985]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd72n_f1c/privsep.sock
Oct 14 09:03:36 np0005486759.ooo.test sudo[116985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:37 np0005486759.ooo.test sudo[116985]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:37 np0005486759.ooo.test sudo[116996]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp01pbtfcb/privsep.sock
Oct 14 09:03:37 np0005486759.ooo.test sudo[116996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:37 np0005486759.ooo.test sudo[116996]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:03:38 np0005486759.ooo.test systemd[1]: tmp-crun.rk5VnS.mount: Deactivated successfully.
Oct 14 09:03:38 np0005486759.ooo.test podman[117001]: 2025-10-14 09:03:38.014636008 +0000 UTC m=+0.083457650 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 09:03:38 np0005486759.ooo.test podman[117001]: 2025-10-14 09:03:38.177405698 +0000 UTC m=+0.246227370 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64)
Oct 14 09:03:38 np0005486759.ooo.test sudo[117036]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprb2y_z89/privsep.sock
Oct 14 09:03:38 np0005486759.ooo.test sudo[117036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:38 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:03:38 np0005486759.ooo.test sudo[117036]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:39 np0005486759.ooo.test sudo[117047]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyvjafglu/privsep.sock
Oct 14 09:03:39 np0005486759.ooo.test sudo[117047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:39 np0005486759.ooo.test sudo[117047]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:40 np0005486759.ooo.test sudo[117058]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc3p8ktg0/privsep.sock
Oct 14 09:03:40 np0005486759.ooo.test sudo[117058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:40 np0005486759.ooo.test sudo[117058]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:03:40 np0005486759.ooo.test podman[117062]: 2025-10-14 09:03:40.634621608 +0000 UTC m=+0.054000382 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, release=1)
Oct 14 09:03:40 np0005486759.ooo.test sudo[117091]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6mq_pmci/privsep.sock
Oct 14 09:03:40 np0005486759.ooo.test sudo[117091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:40 np0005486759.ooo.test podman[117062]: 2025-10-14 09:03:40.976941701 +0000 UTC m=+0.396320445 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, name=rhosp17/openstack-nova-compute, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 09:03:40 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:03:41 np0005486759.ooo.test sudo[117091]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:41 np0005486759.ooo.test podman[117102]: 2025-10-14 09:03:41.450397927 +0000 UTC m=+0.073976556 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, name=rhosp17/openstack-ceilometer-compute, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4)
Oct 14 09:03:41 np0005486759.ooo.test podman[117103]: 2025-10-14 09:03:41.481012249 +0000 UTC m=+0.104484305 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:03:41 np0005486759.ooo.test podman[117103]: 2025-10-14 09:03:41.494125458 +0000 UTC m=+0.117597554 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, version=17.1.9, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:03:41 np0005486759.ooo.test podman[117103]: unhealthy
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:03:41 np0005486759.ooo.test podman[117102]: 2025-10-14 09:03:41.516082192 +0000 UTC m=+0.139660811 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.9, batch=17.1_20250721.1, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute)
Oct 14 09:03:41 np0005486759.ooo.test podman[117102]: unhealthy
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:03:41 np0005486759.ooo.test podman[117100]: 2025-10-14 09:03:41.43157933 +0000 UTC m=+0.059381450 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, release=1)
Oct 14 09:03:41 np0005486759.ooo.test podman[117100]: 2025-10-14 09:03:41.564682965 +0000 UTC m=+0.192485335 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-cron, version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, vcs-type=git, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, architecture=x86_64)
Oct 14 09:03:41 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:03:41 np0005486759.ooo.test sudo[117168]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp729wkipc/privsep.sock
Oct 14 09:03:41 np0005486759.ooo.test sudo[117168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:42 np0005486759.ooo.test sudo[117168]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:42 np0005486759.ooo.test sudo[117179]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgevt5jii/privsep.sock
Oct 14 09:03:42 np0005486759.ooo.test sudo[117179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:43 np0005486759.ooo.test sudo[117179]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:43 np0005486759.ooo.test sudo[117190]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp26zjb_2p/privsep.sock
Oct 14 09:03:43 np0005486759.ooo.test sudo[117190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:43 np0005486759.ooo.test sudo[117190]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:44 np0005486759.ooo.test sudo[117201]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp46zx2lup/privsep.sock
Oct 14 09:03:44 np0005486759.ooo.test sudo[117201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:44 np0005486759.ooo.test sudo[117201]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:45 np0005486759.ooo.test sudo[117212]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphn77avsg/privsep.sock
Oct 14 09:03:45 np0005486759.ooo.test sudo[117212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:45 np0005486759.ooo.test sudo[117212]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:45 np0005486759.ooo.test sudo[117223]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpigdc136p/privsep.sock
Oct 14 09:03:45 np0005486759.ooo.test sudo[117223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:03:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:03:46 np0005486759.ooo.test systemd[1]: tmp-crun.a184tB.mount: Deactivated successfully.
Oct 14 09:03:46 np0005486759.ooo.test podman[117227]: 2025-10-14 09:03:46.46925434 +0000 UTC m=+0.095891927 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, release=1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 09:03:46 np0005486759.ooo.test podman[117227]: 2025-10-14 09:03:46.506891183 +0000 UTC m=+0.133528750 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, distribution-scope=public)
Oct 14 09:03:46 np0005486759.ooo.test podman[117227]: unhealthy
Oct 14 09:03:46 np0005486759.ooo.test podman[117226]: 2025-10-14 09:03:46.518150274 +0000 UTC m=+0.145265296 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:03:46 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:46 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:03:46 np0005486759.ooo.test podman[117226]: 2025-10-14 09:03:46.531481948 +0000 UTC m=+0.158597010 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1)
Oct 14 09:03:46 np0005486759.ooo.test podman[117226]: unhealthy
Oct 14 09:03:46 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:03:46 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:03:46 np0005486759.ooo.test sudo[117223]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:46 np0005486759.ooo.test sudo[117280]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqt_u6qoo/privsep.sock
Oct 14 09:03:46 np0005486759.ooo.test sudo[117280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:47 np0005486759.ooo.test sudo[117280]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:47 np0005486759.ooo.test sudo[117291]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaxc4abuu/privsep.sock
Oct 14 09:03:47 np0005486759.ooo.test sudo[117291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:48 np0005486759.ooo.test sudo[117291]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:48 np0005486759.ooo.test sudo[117302]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6hiycdfa/privsep.sock
Oct 14 09:03:48 np0005486759.ooo.test sudo[117302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:49 np0005486759.ooo.test sudo[117302]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:49 np0005486759.ooo.test sudo[117313]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkhu2al7e/privsep.sock
Oct 14 09:03:49 np0005486759.ooo.test sudo[117313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:50 np0005486759.ooo.test sudo[117313]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:50 np0005486759.ooo.test sudo[117324]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo5hfta1q/privsep.sock
Oct 14 09:03:50 np0005486759.ooo.test sudo[117324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:50 np0005486759.ooo.test sudo[117324]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:51 np0005486759.ooo.test sudo[117335]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqrdjp80t/privsep.sock
Oct 14 09:03:51 np0005486759.ooo.test sudo[117335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:51 np0005486759.ooo.test sudo[117335]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:51 np0005486759.ooo.test sudo[117348]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnu7m53pq/privsep.sock
Oct 14 09:03:51 np0005486759.ooo.test sudo[117348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:52 np0005486759.ooo.test sudo[117348]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:52 np0005486759.ooo.test sudo[117363]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjst66_q8/privsep.sock
Oct 14 09:03:52 np0005486759.ooo.test sudo[117363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:53 np0005486759.ooo.test sudo[117363]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:53 np0005486759.ooo.test sudo[117374]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0snzd59z/privsep.sock
Oct 14 09:03:53 np0005486759.ooo.test sudo[117374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:54 np0005486759.ooo.test sudo[117374]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:54 np0005486759.ooo.test sudo[117385]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0564gehd/privsep.sock
Oct 14 09:03:54 np0005486759.ooo.test sudo[117385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:55 np0005486759.ooo.test sudo[117385]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:55 np0005486759.ooo.test sudo[117396]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptszw2hkw/privsep.sock
Oct 14 09:03:55 np0005486759.ooo.test sudo[117396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:55 np0005486759.ooo.test sudo[117396]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:56 np0005486759.ooo.test sudo[117407]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0nifgbr7/privsep.sock
Oct 14 09:03:56 np0005486759.ooo.test sudo[117407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:56 np0005486759.ooo.test sudo[117407]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:57 np0005486759.ooo.test sudo[117418]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpko1ujno0/privsep.sock
Oct 14 09:03:57 np0005486759.ooo.test sudo[117418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:57 np0005486759.ooo.test sudo[117418]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:57 np0005486759.ooo.test sudo[117435]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq3o1_m4f/privsep.sock
Oct 14 09:03:57 np0005486759.ooo.test sudo[117435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:58 np0005486759.ooo.test sudo[117435]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:58 np0005486759.ooo.test sudo[117446]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2ds5j4vn/privsep.sock
Oct 14 09:03:58 np0005486759.ooo.test sudo[117446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:59 np0005486759.ooo.test sudo[117446]: pam_unix(sudo:session): session closed for user root
Oct 14 09:03:59 np0005486759.ooo.test sudo[117457]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6gtc6fch/privsep.sock
Oct 14 09:03:59 np0005486759.ooo.test sudo[117457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:03:59 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:03:59 np0005486759.ooo.test recover_tripleo_nova_virtqemud[117460]: 47951
Oct 14 09:03:59 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:03:59 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:04:00 np0005486759.ooo.test sudo[117457]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:00 np0005486759.ooo.test sudo[117470]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt8z_3f68/privsep.sock
Oct 14 09:04:00 np0005486759.ooo.test sudo[117470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:01 np0005486759.ooo.test sudo[117470]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:01 np0005486759.ooo.test sudo[117481]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4yu11889/privsep.sock
Oct 14 09:04:01 np0005486759.ooo.test sudo[117481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:01 np0005486759.ooo.test sudo[117481]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:04:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:04:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:04:01 np0005486759.ooo.test systemd[1]: tmp-crun.jJKlLq.mount: Deactivated successfully.
Oct 14 09:04:01 np0005486759.ooo.test podman[117488]: 2025-10-14 09:04:01.935568805 +0000 UTC m=+0.101129581 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:04:02 np0005486759.ooo.test podman[117489]: 2025-10-14 09:04:02.016100113 +0000 UTC m=+0.176733255 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, name=rhosp17/openstack-collectd)
Oct 14 09:04:02 np0005486759.ooo.test podman[117488]: 2025-10-14 09:04:02.01954556 +0000 UTC m=+0.185106346 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, release=1, version=17.1.9, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public)
Oct 14 09:04:02 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:04:02 np0005486759.ooo.test podman[117489]: 2025-10-14 09:04:02.031113721 +0000 UTC m=+0.191746873 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=2, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.9, com.redhat.component=openstack-collectd-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Oct 14 09:04:02 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:04:02 np0005486759.ooo.test podman[117487]: 2025-10-14 09:04:01.989231456 +0000 UTC m=+0.155943347 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-07-21T13:27:15, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, version=17.1.9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid)
Oct 14 09:04:02 np0005486759.ooo.test sudo[117557]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpitqkh_ia/privsep.sock
Oct 14 09:04:02 np0005486759.ooo.test sudo[117557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:02 np0005486759.ooo.test podman[117487]: 2025-10-14 09:04:02.074351998 +0000 UTC m=+0.241063919 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2025-07-21T13:27:15, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 09:04:02 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:04:02 np0005486759.ooo.test sudo[117557]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:02 np0005486759.ooo.test sudo[117573]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6ik6c0ez/privsep.sock
Oct 14 09:04:02 np0005486759.ooo.test sudo[117573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:03 np0005486759.ooo.test sudo[117573]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:03 np0005486759.ooo.test sudo[117585]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6w9bfo0a/privsep.sock
Oct 14 09:04:03 np0005486759.ooo.test sudo[117585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:04 np0005486759.ooo.test sudo[117585]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:04 np0005486759.ooo.test sudo[117596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppsc54d1u/privsep.sock
Oct 14 09:04:04 np0005486759.ooo.test sudo[117596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:05 np0005486759.ooo.test sudo[117596]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:05 np0005486759.ooo.test sudo[117607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp6qa90b9/privsep.sock
Oct 14 09:04:05 np0005486759.ooo.test sudo[117607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:06 np0005486759.ooo.test sudo[117607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:06 np0005486759.ooo.test sudo[117618]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl4j5ctgx/privsep.sock
Oct 14 09:04:06 np0005486759.ooo.test sudo[117618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:06 np0005486759.ooo.test sudo[117618]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:07 np0005486759.ooo.test sudo[117629]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfrvi2y8j/privsep.sock
Oct 14 09:04:07 np0005486759.ooo.test sudo[117629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:07 np0005486759.ooo.test sudo[117629]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:08 np0005486759.ooo.test sudo[117642]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi3d4kvvu/privsep.sock
Oct 14 09:04:08 np0005486759.ooo.test sudo[117642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:04:08 np0005486759.ooo.test podman[117648]: 2025-10-14 09:04:08.442533237 +0000 UTC m=+0.067472152 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, release=1, vendor=Red Hat, Inc., batch=17.1_20250721.1)
Oct 14 09:04:08 np0005486759.ooo.test podman[117648]: 2025-10-14 09:04:08.606448872 +0000 UTC m=+0.231387737 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, batch=17.1_20250721.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git)
Oct 14 09:04:08 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:04:08 np0005486759.ooo.test sudo[117642]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:09 np0005486759.ooo.test sudo[117687]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcnhuh5up/privsep.sock
Oct 14 09:04:09 np0005486759.ooo.test sudo[117687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:09 np0005486759.ooo.test sudo[117687]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:09 np0005486759.ooo.test sudo[117698]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp82mnzv5x/privsep.sock
Oct 14 09:04:09 np0005486759.ooo.test sudo[117698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:10 np0005486759.ooo.test sudo[117698]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:10 np0005486759.ooo.test sudo[117709]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe5h43i_z/privsep.sock
Oct 14 09:04:10 np0005486759.ooo.test sudo[117709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:04:11 np0005486759.ooo.test systemd[1]: tmp-crun.ZQ4hfI.mount: Deactivated successfully.
Oct 14 09:04:11 np0005486759.ooo.test podman[117712]: 2025-10-14 09:04:11.439223719 +0000 UTC m=+0.075687148 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37)
Oct 14 09:04:11 np0005486759.ooo.test sudo[117709]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:11 np0005486759.ooo.test sudo[117742]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd796e4mq/privsep.sock
Oct 14 09:04:11 np0005486759.ooo.test sudo[117742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:04:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:04:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:04:11 np0005486759.ooo.test podman[117744]: 2025-10-14 09:04:11.856140895 +0000 UTC m=+0.084371189 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible)
Oct 14 09:04:11 np0005486759.ooo.test podman[117712]: 2025-10-14 09:04:11.881110762 +0000 UTC m=+0.517574171 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, container_name=nova_migration_target, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:04:11 np0005486759.ooo.test podman[117744]: 2025-10-14 09:04:11.887471911 +0000 UTC m=+0.115702195 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64)
Oct 14 09:04:11 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:04:11 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:04:11 np0005486759.ooo.test podman[117746]: 2025-10-14 09:04:11.961376453 +0000 UTC m=+0.186031076 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, 
batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.9, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:04:11 np0005486759.ooo.test podman[117745]: 2025-10-14 09:04:11.833446298 +0000 UTC m=+0.061159386 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible)
Oct 14 09:04:12 np0005486759.ooo.test podman[117746]: 2025-10-14 09:04:12.005321192 +0000 UTC m=+0.229975775 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, 
batch=17.1_20250721.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:04:12 np0005486759.ooo.test podman[117746]: unhealthy
Oct 14 09:04:12 np0005486759.ooo.test podman[117745]: 2025-10-14 09:04:12.013764265 +0000 UTC m=+0.241477413 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3)
Oct 14 09:04:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:04:12 np0005486759.ooo.test podman[117745]: unhealthy
Oct 14 09:04:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:04:12 np0005486759.ooo.test sudo[117742]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:12 np0005486759.ooo.test sudo[117808]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgnr_7t5c/privsep.sock
Oct 14 09:04:12 np0005486759.ooo.test sudo[117808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:13 np0005486759.ooo.test sudo[117808]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:13 np0005486759.ooo.test sudo[117821]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprqq5wqv8/privsep.sock
Oct 14 09:04:13 np0005486759.ooo.test sudo[117821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:14 np0005486759.ooo.test sudo[117821]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:14 np0005486759.ooo.test sudo[117836]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkqnakvit/privsep.sock
Oct 14 09:04:14 np0005486759.ooo.test sudo[117836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:14 np0005486759.ooo.test sudo[117836]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:15 np0005486759.ooo.test sudo[117847]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppctfo6jt/privsep.sock
Oct 14 09:04:15 np0005486759.ooo.test sudo[117847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:15 np0005486759.ooo.test sudo[117847]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:16 np0005486759.ooo.test sudo[117858]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphh6ytdob/privsep.sock
Oct 14 09:04:16 np0005486759.ooo.test sudo[117858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:16 np0005486759.ooo.test sudo[117858]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:04:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:04:16 np0005486759.ooo.test podman[117865]: 2025-10-14 09:04:16.790337083 +0000 UTC m=+0.059143323 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-07-21T13:28:44)
Oct 14 09:04:16 np0005486759.ooo.test systemd[1]: tmp-crun.f6XVSl.mount: Deactivated successfully.
Oct 14 09:04:16 np0005486759.ooo.test podman[117862]: 2025-10-14 09:04:16.847217074 +0000 UTC m=+0.117410458 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.33.12)
Oct 14 09:04:16 np0005486759.ooo.test podman[117862]: 2025-10-14 09:04:16.863217363 +0000 UTC m=+0.133410797 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, release=1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:04:16 np0005486759.ooo.test podman[117862]: unhealthy
Oct 14 09:04:16 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:16 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:04:16 np0005486759.ooo.test podman[117865]: 2025-10-14 09:04:16.880128909 +0000 UTC m=+0.148935149 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:28:44, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.33.12, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:04:16 np0005486759.ooo.test podman[117865]: unhealthy
Oct 14 09:04:16 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:16 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:04:16 np0005486759.ooo.test sudo[117907]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj7z7z8wl/privsep.sock
Oct 14 09:04:16 np0005486759.ooo.test sudo[117907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:17 np0005486759.ooo.test sudo[117907]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:17 np0005486759.ooo.test sudo[117918]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps48y27o6/privsep.sock
Oct 14 09:04:17 np0005486759.ooo.test sudo[117918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:18 np0005486759.ooo.test sudo[117918]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:18 np0005486759.ooo.test sudo[117929]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkgbytz_e/privsep.sock
Oct 14 09:04:18 np0005486759.ooo.test sudo[117929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:19 np0005486759.ooo.test sudo[117929]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:19 np0005486759.ooo.test sudo[117946]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuv3om2q8/privsep.sock
Oct 14 09:04:19 np0005486759.ooo.test sudo[117946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:20 np0005486759.ooo.test sudo[117946]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:20 np0005486759.ooo.test sudo[117957]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe9rjyp34/privsep.sock
Oct 14 09:04:20 np0005486759.ooo.test sudo[117957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:21 np0005486759.ooo.test sudo[117957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:21 np0005486759.ooo.test sudo[117968]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaum4ezjp/privsep.sock
Oct 14 09:04:21 np0005486759.ooo.test sudo[117968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:21 np0005486759.ooo.test sudo[117968]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:22 np0005486759.ooo.test sudo[117979]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9cbguf1p/privsep.sock
Oct 14 09:04:22 np0005486759.ooo.test sudo[117979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:22 np0005486759.ooo.test sudo[117979]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:23 np0005486759.ooo.test sudo[117990]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps2fjztjl/privsep.sock
Oct 14 09:04:23 np0005486759.ooo.test sudo[117990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:23 np0005486759.ooo.test sudo[117990]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:23 np0005486759.ooo.test sudo[118001]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmj5mqg03/privsep.sock
Oct 14 09:04:23 np0005486759.ooo.test sudo[118001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:24 np0005486759.ooo.test sudo[118001]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:24 np0005486759.ooo.test sudo[118018]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp40flvg4z/privsep.sock
Oct 14 09:04:24 np0005486759.ooo.test sudo[118018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:25 np0005486759.ooo.test sudo[118018]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:25 np0005486759.ooo.test sudo[118029]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5zt1s29v/privsep.sock
Oct 14 09:04:25 np0005486759.ooo.test sudo[118029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:26 np0005486759.ooo.test sudo[118029]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:26 np0005486759.ooo.test sudo[118040]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgeu_hjx3/privsep.sock
Oct 14 09:04:26 np0005486759.ooo.test sudo[118040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:27 np0005486759.ooo.test sudo[118040]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:27 np0005486759.ooo.test sudo[118051]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz_ajkodi/privsep.sock
Oct 14 09:04:27 np0005486759.ooo.test sudo[118051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:27 np0005486759.ooo.test sudo[118051]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:28 np0005486759.ooo.test sudo[118062]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqk0c3tbr/privsep.sock
Oct 14 09:04:28 np0005486759.ooo.test sudo[118062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:28 np0005486759.ooo.test sudo[118062]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:28 np0005486759.ooo.test sudo[118073]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwr7s91ty/privsep.sock
Oct 14 09:04:28 np0005486759.ooo.test sudo[118073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:29 np0005486759.ooo.test sudo[118073]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:29 np0005486759.ooo.test sudo[118089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm23sl61c/privsep.sock
Oct 14 09:04:29 np0005486759.ooo.test sudo[118089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:30 np0005486759.ooo.test sudo[118089]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:30 np0005486759.ooo.test sudo[118101]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwlh0yqrn/privsep.sock
Oct 14 09:04:30 np0005486759.ooo.test sudo[118101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:31 np0005486759.ooo.test sudo[118101]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:31 np0005486759.ooo.test sudo[118112]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbznw6cq9/privsep.sock
Oct 14 09:04:31 np0005486759.ooo.test sudo[118112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:32 np0005486759.ooo.test sudo[118112]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:04:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:04:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:04:32 np0005486759.ooo.test systemd[1]: tmp-crun.TUu4Ug.mount: Deactivated successfully.
Oct 14 09:04:32 np0005486759.ooo.test podman[118116]: 2025-10-14 09:04:32.343198563 +0000 UTC m=+0.068469553 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, vcs-type=git, version=17.1.9, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.openshift.expose-services=, release=1, batch=17.1_20250721.1)
Oct 14 09:04:32 np0005486759.ooo.test podman[118119]: 2025-10-14 09:04:32.407153255 +0000 UTC m=+0.130923027 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, architecture=x86_64, config_id=tripleo_step5, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 09:04:32 np0005486759.ooo.test podman[118116]: 2025-10-14 09:04:32.428063197 +0000 UTC m=+0.153334187 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, release=1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack 
osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 09:04:32 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:04:32 np0005486759.ooo.test podman[118119]: 2025-10-14 09:04:32.458203096 +0000 UTC m=+0.181972808 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, release=1, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37)
Oct 14 09:04:32 np0005486759.ooo.test podman[118120]: 2025-10-14 09:04:32.414891057 +0000 UTC m=+0.134514021 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack 
TripleO Team, release=2, batch=17.1_20250721.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=collectd, vendor=Red Hat, Inc., version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container)
Oct 14 09:04:32 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:04:32 np0005486759.ooo.test podman[118120]: 2025-10-14 09:04:32.493097263 +0000 UTC m=+0.212720227 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, architecture=x86_64, container_name=collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.9, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, vcs-type=git, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:04:32 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:04:32 np0005486759.ooo.test sudo[118184]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp20x3_8nb/privsep.sock
Oct 14 09:04:32 np0005486759.ooo.test sudo[118184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:33 np0005486759.ooo.test sudo[118184]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:33 np0005486759.ooo.test sudo[118195]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpolpdhzaj/privsep.sock
Oct 14 09:04:33 np0005486759.ooo.test sudo[118195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:33 np0005486759.ooo.test sudo[118195]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:34 np0005486759.ooo.test sudo[118206]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx4i29b92/privsep.sock
Oct 14 09:04:34 np0005486759.ooo.test sudo[118206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:34 np0005486759.ooo.test sudo[118206]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:35 np0005486759.ooo.test sudo[118220]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5rgvjywx/privsep.sock
Oct 14 09:04:35 np0005486759.ooo.test sudo[118220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:35 np0005486759.ooo.test sudo[118220]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:35 np0005486759.ooo.test sudo[118234]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp23bg3zh5/privsep.sock
Oct 14 09:04:35 np0005486759.ooo.test sudo[118234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:36 np0005486759.ooo.test sudo[118234]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:36 np0005486759.ooo.test sudo[118245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9yz77_gg/privsep.sock
Oct 14 09:04:36 np0005486759.ooo.test sudo[118245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:37 np0005486759.ooo.test sudo[118245]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:37 np0005486759.ooo.test sudo[118256]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcr5s1q5d/privsep.sock
Oct 14 09:04:37 np0005486759.ooo.test sudo[118256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:38 np0005486759.ooo.test sudo[118256]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:38 np0005486759.ooo.test sudo[118267]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp68qxgkxt/privsep.sock
Oct 14 09:04:38 np0005486759.ooo.test sudo[118267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:39 np0005486759.ooo.test sudo[118267]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:04:39 np0005486759.ooo.test podman[118271]: 2025-10-14 09:04:39.125792521 +0000 UTC m=+0.053518458 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, 
name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=1, tcib_managed=true)
Oct 14 09:04:39 np0005486759.ooo.test podman[118271]: 2025-10-14 09:04:39.348412764 +0000 UTC m=+0.276138701 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.9)
Oct 14 09:04:39 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:04:39 np0005486759.ooo.test sudo[118308]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbyx3fy4w/privsep.sock
Oct 14 09:04:39 np0005486759.ooo.test sudo[118308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:39 np0005486759.ooo.test sudo[118308]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:40 np0005486759.ooo.test sudo[118319]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqik58mff/privsep.sock
Oct 14 09:04:40 np0005486759.ooo.test sudo[118319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:40 np0005486759.ooo.test sudo[118319]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:41 np0005486759.ooo.test sudo[118336]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpktn9dgya/privsep.sock
Oct 14 09:04:41 np0005486759.ooo.test sudo[118336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:41 np0005486759.ooo.test sudo[118336]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:41 np0005486759.ooo.test sudo[118347]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa3rzsgpf/privsep.sock
Oct 14 09:04:41 np0005486759.ooo.test sudo[118347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:04:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: tmp-crun.0HqA8X.mount: Deactivated successfully.
Oct 14 09:04:42 np0005486759.ooo.test podman[118350]: 2025-10-14 09:04:42.109575002 +0000 UTC m=+0.131752115 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, version=17.1.9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1)
Oct 14 09:04:42 np0005486759.ooo.test podman[118349]: 2025-10-14 09:04:42.076226223 +0000 UTC m=+0.102099141 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-cron-container, 
config_id=tripleo_step4, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=)
Oct 14 09:04:42 np0005486759.ooo.test podman[118377]: 2025-10-14 09:04:42.184056941 +0000 UTC m=+0.099600143 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, release=1, version=17.1.9, 
architecture=x86_64, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git)
Oct 14 09:04:42 np0005486759.ooo.test podman[118377]: 2025-10-14 09:04:42.195155937 +0000 UTC m=+0.110699069 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:04:42 np0005486759.ooo.test podman[118377]: unhealthy
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:04:42 np0005486759.ooo.test podman[118349]: 2025-10-14 09:04:42.208448721 +0000 UTC m=+0.234321609 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20250721.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, architecture=x86_64, release=1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 
17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, version=17.1.9)
Oct 14 09:04:42 np0005486759.ooo.test podman[118374]: 2025-10-14 09:04:42.165082771 +0000 UTC m=+0.083995207 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, release=1, build-date=2025-07-21T14:45:33, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=)
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:04:42 np0005486759.ooo.test podman[118374]: 2025-10-14 09:04:42.245290678 +0000 UTC m=+0.164203124 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, 
com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:04:42 np0005486759.ooo.test podman[118374]: unhealthy
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:04:42 np0005486759.ooo.test podman[118350]: 2025-10-14 09:04:42.463280728 +0000 UTC m=+0.485457731 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, name=rhosp17/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, distribution-scope=public, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container)
Oct 14 09:04:42 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:04:42 np0005486759.ooo.test sudo[118347]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:42 np0005486759.ooo.test sudo[118439]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppgdeuqjs/privsep.sock
Oct 14 09:04:42 np0005486759.ooo.test sudo[118439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:43 np0005486759.ooo.test systemd[1]: tmp-crun.k0k5CG.mount: Deactivated successfully.
Oct 14 09:04:43 np0005486759.ooo.test sudo[118439]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:43 np0005486759.ooo.test sudo[118450]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8mnvh3xh/privsep.sock
Oct 14 09:04:43 np0005486759.ooo.test sudo[118450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:44 np0005486759.ooo.test sudo[118450]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:44 np0005486759.ooo.test sudo[118461]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphka99kud/privsep.sock
Oct 14 09:04:44 np0005486759.ooo.test sudo[118461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:45 np0005486759.ooo.test sudo[118461]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:45 np0005486759.ooo.test sudo[118472]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz4vi3akh/privsep.sock
Oct 14 09:04:45 np0005486759.ooo.test sudo[118472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:46 np0005486759.ooo.test sudo[118472]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:46 np0005486759.ooo.test sudo[118489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp193cc8a/privsep.sock
Oct 14 09:04:46 np0005486759.ooo.test sudo[118489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:46 np0005486759.ooo.test sudo[118489]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:04:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:04:47 np0005486759.ooo.test systemd[1]: tmp-crun.VDLF1A.mount: Deactivated successfully.
Oct 14 09:04:47 np0005486759.ooo.test podman[118493]: 2025-10-14 09:04:47.042667205 +0000 UTC m=+0.056649136 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:04:47 np0005486759.ooo.test podman[118493]: 2025-10-14 09:04:47.05827482 +0000 UTC m=+0.072256791 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, version=17.1.9, distribution-scope=public, build-date=2025-07-21T16:28:53, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:04:47 np0005486759.ooo.test podman[118493]: unhealthy
Oct 14 09:04:47 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:47 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:04:47 np0005486759.ooo.test podman[118496]: 2025-10-14 09:04:47.103250541 +0000 UTC m=+0.113088083 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, version=17.1.9, container_name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4)
Oct 14 09:04:47 np0005486759.ooo.test podman[118496]: 2025-10-14 09:04:47.116281607 +0000 UTC m=+0.126119179 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible)
Oct 14 09:04:47 np0005486759.ooo.test podman[118496]: unhealthy
Oct 14 09:04:47 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:04:47 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:04:47 np0005486759.ooo.test sudo[118539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxekh27fr/privsep.sock
Oct 14 09:04:47 np0005486759.ooo.test sudo[118539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:47 np0005486759.ooo.test sudo[118539]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:48 np0005486759.ooo.test sudo[118550]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqpo659ur/privsep.sock
Oct 14 09:04:48 np0005486759.ooo.test sudo[118550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:48 np0005486759.ooo.test sudo[118550]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:48 np0005486759.ooo.test sudo[118561]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvyyjtn14/privsep.sock
Oct 14 09:04:48 np0005486759.ooo.test sudo[118561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:49 np0005486759.ooo.test sudo[118561]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:49 np0005486759.ooo.test sudo[118572]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_gz39fjw/privsep.sock
Oct 14 09:04:49 np0005486759.ooo.test sudo[118572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:50 np0005486759.ooo.test sudo[118572]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:50 np0005486759.ooo.test sudo[118583]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcemik236/privsep.sock
Oct 14 09:04:50 np0005486759.ooo.test sudo[118583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:51 np0005486759.ooo.test sudo[118583]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:51 np0005486759.ooo.test sudo[118600]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmwx2g0b_/privsep.sock
Oct 14 09:04:51 np0005486759.ooo.test sudo[118600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:52 np0005486759.ooo.test sudo[118600]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:52 np0005486759.ooo.test sudo[118611]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv3858v_c/privsep.sock
Oct 14 09:04:52 np0005486759.ooo.test sudo[118611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:53 np0005486759.ooo.test sudo[118611]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:53 np0005486759.ooo.test sudo[118622]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa4fp569j/privsep.sock
Oct 14 09:04:53 np0005486759.ooo.test sudo[118622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:53 np0005486759.ooo.test sudo[118622]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:54 np0005486759.ooo.test sudo[118633]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_s_c605_/privsep.sock
Oct 14 09:04:54 np0005486759.ooo.test sudo[118633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:54 np0005486759.ooo.test sudo[118633]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:54 np0005486759.ooo.test sudo[118644]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmcyzt4up/privsep.sock
Oct 14 09:04:54 np0005486759.ooo.test sudo[118644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:55 np0005486759.ooo.test sudo[118644]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:55 np0005486759.ooo.test sudo[118655]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp94qv4gj5/privsep.sock
Oct 14 09:04:55 np0005486759.ooo.test sudo[118655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:56 np0005486759.ooo.test sudo[118655]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:56 np0005486759.ooo.test sudo[118669]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphzbcb9wp/privsep.sock
Oct 14 09:04:56 np0005486759.ooo.test sudo[118669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:57 np0005486759.ooo.test sudo[118669]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:57 np0005486759.ooo.test sudo[118683]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpflkfs_dh/privsep.sock
Oct 14 09:04:57 np0005486759.ooo.test sudo[118683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:58 np0005486759.ooo.test sudo[118683]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:58 np0005486759.ooo.test sudo[118694]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8u6b43jk/privsep.sock
Oct 14 09:04:58 np0005486759.ooo.test sudo[118694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:04:59 np0005486759.ooo.test sudo[118694]: pam_unix(sudo:session): session closed for user root
Oct 14 09:04:59 np0005486759.ooo.test sudo[118705]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyzbtsesh/privsep.sock
Oct 14 09:04:59 np0005486759.ooo.test sudo[118705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:00 np0005486759.ooo.test sudo[118705]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:00 np0005486759.ooo.test sudo[118716]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpke5ejy57/privsep.sock
Oct 14 09:05:00 np0005486759.ooo.test sudo[118716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:00 np0005486759.ooo.test sudo[118716]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:01 np0005486759.ooo.test sudo[118727]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgycigvb1/privsep.sock
Oct 14 09:05:01 np0005486759.ooo.test sudo[118727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:01 np0005486759.ooo.test sudo[118727]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:02 np0005486759.ooo.test sudo[118740]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy4do38dx/privsep.sock
Oct 14 09:05:02 np0005486759.ooo.test sudo[118740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:02 np0005486759.ooo.test sudo[118740]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:05:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:05:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:05:02 np0005486759.ooo.test podman[118750]: 2025-10-14 09:05:02.718818976 +0000 UTC m=+0.068155674 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-iscsid, version=17.1.9, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, build-date=2025-07-21T13:27:15, container_name=iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1)
Oct 14 09:05:02 np0005486759.ooo.test systemd[1]: tmp-crun.q8uUEp.mount: Deactivated successfully.
Oct 14 09:05:02 np0005486759.ooo.test podman[118752]: 2025-10-14 09:05:02.733241555 +0000 UTC m=+0.073745748 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=2, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, distribution-scope=public, batch=17.1_20250721.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12)
Oct 14 09:05:02 np0005486759.ooo.test podman[118750]: 2025-10-14 09:05:02.760556535 +0000 UTC m=+0.109893233 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9)
Oct 14 09:05:02 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:05:02 np0005486759.ooo.test podman[118751]: 2025-10-14 09:05:02.770922728 +0000 UTC m=+0.116881581 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, release=1, config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Oct 14 09:05:02 np0005486759.ooo.test podman[118752]: 2025-10-14 09:05:02.788624589 +0000 UTC m=+0.129128772 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, release=2)
Oct 14 09:05:02 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:05:02 np0005486759.ooo.test podman[118751]: 2025-10-14 09:05:02.818319695 +0000 UTC m=+0.164278538 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, build-date=2025-07-21T14:48:37, version=17.1.9, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git)
Oct 14 09:05:02 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:05:02 np0005486759.ooo.test sudo[118819]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmcoi3rpa/privsep.sock
Oct 14 09:05:02 np0005486759.ooo.test sudo[118819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:03 np0005486759.ooo.test sudo[118819]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:03 np0005486759.ooo.test sudo[118830]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnglh4332/privsep.sock
Oct 14 09:05:03 np0005486759.ooo.test sudo[118830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:04 np0005486759.ooo.test sudo[118830]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:04 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:05:04 np0005486759.ooo.test recover_tripleo_nova_virtqemud[118837]: 47951
Oct 14 09:05:04 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:05:04 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:05:04 np0005486759.ooo.test sudo[118843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjatx5zed/privsep.sock
Oct 14 09:05:04 np0005486759.ooo.test sudo[118843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:05 np0005486759.ooo.test sudo[118843]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:05 np0005486759.ooo.test sudo[118854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprs6pyw49/privsep.sock
Oct 14 09:05:05 np0005486759.ooo.test sudo[118854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:06 np0005486759.ooo.test sudo[118854]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:06 np0005486759.ooo.test sudo[118865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw9ij7m8t/privsep.sock
Oct 14 09:05:06 np0005486759.ooo.test sudo[118865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:06 np0005486759.ooo.test sudo[118865]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:07 np0005486759.ooo.test sudo[118876]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5kfezcfy/privsep.sock
Oct 14 09:05:07 np0005486759.ooo.test sudo[118876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:07 np0005486759.ooo.test sudo[118876]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:07 np0005486759.ooo.test sudo[118893]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8ej4stlp/privsep.sock
Oct 14 09:05:07 np0005486759.ooo.test sudo[118893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:08 np0005486759.ooo.test sudo[118893]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:08 np0005486759.ooo.test sudo[118904]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5f5zvjow/privsep.sock
Oct 14 09:05:08 np0005486759.ooo.test sudo[118904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:09 np0005486759.ooo.test sudo[118904]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:05:09 np0005486759.ooo.test systemd[1]: tmp-crun.Eo6k0R.mount: Deactivated successfully.
Oct 14 09:05:09 np0005486759.ooo.test podman[118909]: 2025-10-14 09:05:09.526632628 +0000 UTC m=+0.074887854 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 09:05:09 np0005486759.ooo.test sudo[118943]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg5wvroe7/privsep.sock
Oct 14 09:05:09 np0005486759.ooo.test sudo[118943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:09 np0005486759.ooo.test podman[118909]: 2025-10-14 09:05:09.728372051 +0000 UTC m=+0.276627247 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, version=17.1.9, vendor=Red Hat, Inc., release=1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Oct 14 09:05:09 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:05:10 np0005486759.ooo.test sudo[118943]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:10 np0005486759.ooo.test sudo[118954]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfmpd46nq/privsep.sock
Oct 14 09:05:10 np0005486759.ooo.test sudo[118954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:11 np0005486759.ooo.test sudo[118954]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:11 np0005486759.ooo.test sudo[118965]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps_sb_6fu/privsep.sock
Oct 14 09:05:11 np0005486759.ooo.test sudo[118965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:12 np0005486759.ooo.test sudo[118965]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:12 np0005486759.ooo.test sudo[118976]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7w8mj00d/privsep.sock
Oct 14 09:05:12 np0005486759.ooo.test sudo[118976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:05:12 np0005486759.ooo.test podman[118978]: 2025-10-14 09:05:12.433450561 +0000 UTC m=+0.087899368 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, 
vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12)
Oct 14 09:05:12 np0005486759.ooo.test podman[118978]: 2025-10-14 09:05:12.464633052 +0000 UTC m=+0.119081839 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, io.buildah.version=1.33.12, batch=17.1_20250721.1, io.openshift.expose-services=, container_name=logrotate_crond, build-date=2025-07-21T13:07:52, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:05:12 np0005486759.ooo.test podman[118980]: 2025-10-14 09:05:12.477358099 +0000 UTC m=+0.125446098 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-07-21T15:29:47, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64)
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:05:12 np0005486759.ooo.test podman[118980]: 2025-10-14 09:05:12.502414419 +0000 UTC m=+0.150502428 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1, architecture=x86_64, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-07-21T15:29:47, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=)
Oct 14 09:05:12 np0005486759.ooo.test podman[118980]: unhealthy
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: tmp-crun.uYTcuk.mount: Deactivated successfully.
Oct 14 09:05:12 np0005486759.ooo.test podman[119024]: 2025-10-14 09:05:12.59267256 +0000 UTC m=+0.093686418 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, 
build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_migration_target)
Oct 14 09:05:12 np0005486759.ooo.test podman[118979]: 2025-10-14 09:05:12.634115651 +0000 UTC m=+0.284043067 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, version=17.1.9, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-07-21T14:45:33, 
com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1)
Oct 14 09:05:12 np0005486759.ooo.test podman[118979]: 2025-10-14 09:05:12.673336983 +0000 UTC m=+0.323264359 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, release=1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, version=17.1.9, io.openshift.expose-services=)
Oct 14 09:05:12 np0005486759.ooo.test podman[118979]: unhealthy
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:05:12 np0005486759.ooo.test sudo[118976]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:12 np0005486759.ooo.test podman[119024]: 2025-10-14 09:05:12.944026534 +0000 UTC m=+0.445040392 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, release=1, vcs-type=git, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:05:12 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:05:13 np0005486759.ooo.test sudo[119073]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmper8jibtn/privsep.sock
Oct 14 09:05:13 np0005486759.ooo.test sudo[119073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:13 np0005486759.ooo.test sudo[119073]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:13 np0005486759.ooo.test sudo[119084]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl6ioe_5h/privsep.sock
Oct 14 09:05:13 np0005486759.ooo.test sudo[119084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:14 np0005486759.ooo.test sudo[119084]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:14 np0005486759.ooo.test sudo[119095]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0dvou2_b/privsep.sock
Oct 14 09:05:14 np0005486759.ooo.test sudo[119095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:15 np0005486759.ooo.test sudo[119095]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:15 np0005486759.ooo.test sudo[119106]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8kkzrp9b/privsep.sock
Oct 14 09:05:15 np0005486759.ooo.test sudo[119106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:16 np0005486759.ooo.test sudo[119106]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:16 np0005486759.ooo.test sudo[119117]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptgcnt4e5/privsep.sock
Oct 14 09:05:16 np0005486759.ooo.test sudo[119117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:17 np0005486759.ooo.test sudo[119117]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:17 np0005486759.ooo.test sudo[119128]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6w204c3c/privsep.sock
Oct 14 09:05:17 np0005486759.ooo.test sudo[119128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:05:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:05:17 np0005486759.ooo.test podman[119131]: 2025-10-14 09:05:17.33527255 +0000 UTC m=+0.063533420 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, vcs-type=git, version=17.1.9)
Oct 14 09:05:17 np0005486759.ooo.test podman[119131]: 2025-10-14 09:05:17.344850318 +0000 UTC m=+0.073111288 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, io.openshift.expose-services=, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, distribution-scope=public, tcib_managed=true)
Oct 14 09:05:17 np0005486759.ooo.test podman[119131]: unhealthy
Oct 14 09:05:17 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:17 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:05:17 np0005486759.ooo.test systemd[1]: tmp-crun.unT8zO.mount: Deactivated successfully.
Oct 14 09:05:17 np0005486759.ooo.test podman[119130]: 2025-10-14 09:05:17.400899154 +0000 UTC m=+0.131325951 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public)
Oct 14 09:05:17 np0005486759.ooo.test podman[119130]: 2025-10-14 09:05:17.409772341 +0000 UTC m=+0.140199158 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1)
Oct 14 09:05:17 np0005486759.ooo.test podman[119130]: unhealthy
Oct 14 09:05:17 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:17 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:05:17 np0005486759.ooo.test sudo[119128]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:18 np0005486759.ooo.test sudo[119179]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmk07rpql/privsep.sock
Oct 14 09:05:18 np0005486759.ooo.test sudo[119179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:18 np0005486759.ooo.test sudo[119179]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:18 np0005486759.ooo.test sudo[119196]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3n49e68q/privsep.sock
Oct 14 09:05:18 np0005486759.ooo.test sudo[119196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:19 np0005486759.ooo.test sudo[119196]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:19 np0005486759.ooo.test sudo[119207]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp62ekw9zh/privsep.sock
Oct 14 09:05:19 np0005486759.ooo.test sudo[119207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:20 np0005486759.ooo.test sudo[119207]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:20 np0005486759.ooo.test sudo[119218]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphwo04kxi/privsep.sock
Oct 14 09:05:20 np0005486759.ooo.test sudo[119218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:21 np0005486759.ooo.test sudo[119218]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:21 np0005486759.ooo.test sudo[119229]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9g55uvll/privsep.sock
Oct 14 09:05:21 np0005486759.ooo.test sudo[119229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:22 np0005486759.ooo.test sudo[119229]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:22 np0005486759.ooo.test sudo[119240]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqieerkm9/privsep.sock
Oct 14 09:05:22 np0005486759.ooo.test sudo[119240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:23 np0005486759.ooo.test sudo[119240]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:23 np0005486759.ooo.test sudo[119251]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1_4pfou1/privsep.sock
Oct 14 09:05:23 np0005486759.ooo.test sudo[119251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:24 np0005486759.ooo.test sudo[119251]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:24 np0005486759.ooo.test sudo[119268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpek3ja0ls/privsep.sock
Oct 14 09:05:24 np0005486759.ooo.test sudo[119268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:24 np0005486759.ooo.test sudo[119268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:25 np0005486759.ooo.test sudo[119279]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3vymb0wj/privsep.sock
Oct 14 09:05:25 np0005486759.ooo.test sudo[119279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:25 np0005486759.ooo.test sudo[119279]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:25 np0005486759.ooo.test sudo[119290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiyrzhven/privsep.sock
Oct 14 09:05:25 np0005486759.ooo.test sudo[119290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:26 np0005486759.ooo.test sudo[119290]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:26 np0005486759.ooo.test sudo[119301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp86_fcyu1/privsep.sock
Oct 14 09:05:26 np0005486759.ooo.test sudo[119301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:27 np0005486759.ooo.test sudo[119301]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:27 np0005486759.ooo.test sudo[119312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiplqbe_9/privsep.sock
Oct 14 09:05:27 np0005486759.ooo.test sudo[119312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:28 np0005486759.ooo.test sudo[119312]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:28 np0005486759.ooo.test sudo[119323]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk7_5gth2/privsep.sock
Oct 14 09:05:28 np0005486759.ooo.test sudo[119323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:29 np0005486759.ooo.test sudo[119323]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:29 np0005486759.ooo.test sudo[119340]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoway3v61/privsep.sock
Oct 14 09:05:29 np0005486759.ooo.test sudo[119340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:30 np0005486759.ooo.test sudo[119340]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:30 np0005486759.ooo.test sudo[119351]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph9e0_91a/privsep.sock
Oct 14 09:05:30 np0005486759.ooo.test sudo[119351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:30 np0005486759.ooo.test sudo[119351]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:31 np0005486759.ooo.test sudo[119362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv54f1t45/privsep.sock
Oct 14 09:05:31 np0005486759.ooo.test sudo[119362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:31 np0005486759.ooo.test sudo[119362]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:32 np0005486759.ooo.test sudo[119373]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpha74vycr/privsep.sock
Oct 14 09:05:32 np0005486759.ooo.test sudo[119373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:32 np0005486759.ooo.test sudo[119373]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:32 np0005486759.ooo.test sudo[119384]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5xt1v24s/privsep.sock
Oct 14 09:05:32 np0005486759.ooo.test sudo[119384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:05:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:05:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:05:33 np0005486759.ooo.test systemd[1]: tmp-crun.K9Hnj6.mount: Deactivated successfully.
Oct 14 09:05:33 np0005486759.ooo.test podman[119387]: 2025-10-14 09:05:33.053129839 +0000 UTC m=+0.069027221 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, container_name=nova_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:05:33 np0005486759.ooo.test podman[119387]: 2025-10-14 09:05:33.075200197 +0000 UTC m=+0.091097599 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, version=17.1.9, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=)
Oct 14 09:05:33 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:05:33 np0005486759.ooo.test podman[119386]: 2025-10-14 09:05:33.038322897 +0000 UTC m=+0.066290045 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, name=rhosp17/openstack-iscsid, release=1, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, container_name=iscsid, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git)
Oct 14 09:05:33 np0005486759.ooo.test podman[119386]: 2025-10-14 09:05:33.121313872 +0000 UTC m=+0.149280940 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=iscsid, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.9, distribution-scope=public)
Oct 14 09:05:33 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:05:33 np0005486759.ooo.test podman[119393]: 2025-10-14 09:05:33.091241185 +0000 UTC m=+0.111306807 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhosp17/openstack-collectd, distribution-scope=public, 
description=Red Hat OpenStack Platform 17.1 collectd, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, batch=17.1_20250721.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Oct 14 09:05:33 np0005486759.ooo.test podman[119393]: 2025-10-14 09:05:33.175338035 +0000 UTC m=+0.195403677 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, tcib_managed=true, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, container_name=collectd, distribution-scope=public, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:05:33 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:05:33 np0005486759.ooo.test sudo[119384]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:33 np0005486759.ooo.test sudo[119456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7yk25j82/privsep.sock
Oct 14 09:05:33 np0005486759.ooo.test sudo[119456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:34 np0005486759.ooo.test sudo[119456]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:34 np0005486759.ooo.test sudo[119473]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplploieka/privsep.sock
Oct 14 09:05:34 np0005486759.ooo.test sudo[119473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:35 np0005486759.ooo.test sudo[119473]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:35 np0005486759.ooo.test sudo[119484]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjkgsorad/privsep.sock
Oct 14 09:05:35 np0005486759.ooo.test sudo[119484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:36 np0005486759.ooo.test sudo[119484]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:36 np0005486759.ooo.test sudo[119495]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmlogw0ur/privsep.sock
Oct 14 09:05:36 np0005486759.ooo.test sudo[119495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:37 np0005486759.ooo.test sudo[119495]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:37 np0005486759.ooo.test sudo[119506]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppmnm281g/privsep.sock
Oct 14 09:05:37 np0005486759.ooo.test sudo[119506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:37 np0005486759.ooo.test sudo[119506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:38 np0005486759.ooo.test sudo[119517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbw_op1tj/privsep.sock
Oct 14 09:05:38 np0005486759.ooo.test sudo[119517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:38 np0005486759.ooo.test sudo[119517]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:38 np0005486759.ooo.test sudo[119528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdp0vz3gq/privsep.sock
Oct 14 09:05:38 np0005486759.ooo.test sudo[119528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:39 np0005486759.ooo.test sudo[119528]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:39 np0005486759.ooo.test sudo[119539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvmobiy1z/privsep.sock
Oct 14 09:05:39 np0005486759.ooo.test sudo[119539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:05:40 np0005486759.ooo.test sudo[119539]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:40 np0005486759.ooo.test systemd[1]: tmp-crun.uHf1oO.mount: Deactivated successfully.
Oct 14 09:05:40 np0005486759.ooo.test podman[119549]: 2025-10-14 09:05:40.453748284 +0000 UTC m=+0.080829998 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
name=rhosp17/openstack-qdrouterd, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, io.openshift.expose-services=)
Oct 14 09:05:40 np0005486759.ooo.test sudo[119585]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0g02q74y/privsep.sock
Oct 14 09:05:40 np0005486759.ooo.test sudo[119585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:40 np0005486759.ooo.test podman[119549]: 2025-10-14 09:05:40.642613976 +0000 UTC m=+0.269695700 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20250721.1, distribution-scope=public)
Oct 14 09:05:40 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:05:41 np0005486759.ooo.test sudo[119585]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:41 np0005486759.ooo.test sudo[119596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4t6pobsz/privsep.sock
Oct 14 09:05:41 np0005486759.ooo.test sudo[119596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:42 np0005486759.ooo.test sudo[119596]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:42 np0005486759.ooo.test sudo[119607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqr8mzws3/privsep.sock
Oct 14 09:05:42 np0005486759.ooo.test sudo[119607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:42 np0005486759.ooo.test sudo[119607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:05:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:05:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:05:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:05:43 np0005486759.ooo.test systemd[1]: tmp-crun.nqf1yD.mount: Deactivated successfully.
Oct 14 09:05:43 np0005486759.ooo.test podman[119615]: 2025-10-14 09:05:43.133047002 +0000 UTC m=+0.137262067 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, version=17.1.9, tcib_managed=true, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, release=1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64)
Oct 14 09:05:43 np0005486759.ooo.test podman[119622]: 2025-10-14 09:05:43.093597803 +0000 UTC m=+0.096599400 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, version=17.1.9, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., 
architecture=x86_64, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Oct 14 09:05:43 np0005486759.ooo.test podman[119611]: 2025-10-14 09:05:43.102879032 +0000 UTC m=+0.116556201 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, architecture=x86_64)
Oct 14 09:05:43 np0005486759.ooo.test podman[119614]: 2025-10-14 09:05:43.05624146 +0000 UTC m=+0.067588017 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, 
description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, distribution-scope=public, release=1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team)
Oct 14 09:05:43 np0005486759.ooo.test podman[119622]: 2025-10-14 09:05:43.17826989 +0000 UTC m=+0.181271417 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, release=1)
Oct 14 09:05:43 np0005486759.ooo.test podman[119622]: unhealthy
Oct 14 09:05:43 np0005486759.ooo.test podman[119611]: 2025-10-14 09:05:43.185377022 +0000 UTC m=+0.199054191 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, release=1, vcs-type=git, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, architecture=x86_64, build-date=2025-07-21T13:07:52, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, config_id=tripleo_step4)
Oct 14 09:05:43 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:43 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:05:43 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:05:43 np0005486759.ooo.test podman[119615]: 2025-10-14 09:05:43.197061805 +0000 UTC m=+0.201276880 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., release=1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, 
com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33)
Oct 14 09:05:43 np0005486759.ooo.test podman[119615]: unhealthy
Oct 14 09:05:43 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:43 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:05:43 np0005486759.ooo.test sudo[119693]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx2i78an1/privsep.sock
Oct 14 09:05:43 np0005486759.ooo.test sudo[119693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:43 np0005486759.ooo.test podman[119614]: 2025-10-14 09:05:43.393320258 +0000 UTC m=+0.404666795 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.buildah.version=1.33.12, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64)
Oct 14 09:05:43 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:05:43 np0005486759.ooo.test sudo[119693]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:44 np0005486759.ooo.test sudo[119704]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp58zd1qoc/privsep.sock
Oct 14 09:05:44 np0005486759.ooo.test sudo[119704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:44 np0005486759.ooo.test systemd[1]: tmp-crun.B3wmy8.mount: Deactivated successfully.
Oct 14 09:05:44 np0005486759.ooo.test sudo[119704]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:44 np0005486759.ooo.test sudo[119715]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_benrmpa/privsep.sock
Oct 14 09:05:44 np0005486759.ooo.test sudo[119715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:45 np0005486759.ooo.test sudo[119715]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:45 np0005486759.ooo.test sudo[119732]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfbbk7tt2/privsep.sock
Oct 14 09:05:45 np0005486759.ooo.test sudo[119732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:46 np0005486759.ooo.test sudo[119732]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:46 np0005486759.ooo.test sudo[119743]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3kdxon36/privsep.sock
Oct 14 09:05:46 np0005486759.ooo.test sudo[119743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:47 np0005486759.ooo.test sudo[119743]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:47 np0005486759.ooo.test sudo[119754]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzylvks6c/privsep.sock
Oct 14 09:05:47 np0005486759.ooo.test sudo[119754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:05:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:05:47 np0005486759.ooo.test systemd[1]: tmp-crun.sLAnje.mount: Deactivated successfully.
Oct 14 09:05:47 np0005486759.ooo.test podman[119757]: 2025-10-14 09:05:47.647199166 +0000 UTC m=+0.054946282 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, distribution-scope=public, batch=17.1_20250721.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, version=17.1.9, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1)
Oct 14 09:05:47 np0005486759.ooo.test podman[119756]: 2025-10-14 09:05:47.681920768 +0000 UTC m=+0.088778836 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, 
vcs-type=git, version=17.1.9, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:05:47 np0005486759.ooo.test podman[119756]: 2025-10-14 09:05:47.719945192 +0000 UTC m=+0.126803260 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, batch=17.1_20250721.1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:05:47 np0005486759.ooo.test podman[119756]: unhealthy
Oct 14 09:05:47 np0005486759.ooo.test podman[119757]: 2025-10-14 09:05:47.727316562 +0000 UTC m=+0.135063768 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller)
Oct 14 09:05:47 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:47 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:05:47 np0005486759.ooo.test podman[119757]: unhealthy
Oct 14 09:05:47 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:05:47 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:05:48 np0005486759.ooo.test sudo[119754]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:48 np0005486759.ooo.test sudo[119804]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpssu9bwfb/privsep.sock
Oct 14 09:05:48 np0005486759.ooo.test sudo[119804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:48 np0005486759.ooo.test systemd[1]: tmp-crun.mT1Jtx.mount: Deactivated successfully.
Oct 14 09:05:49 np0005486759.ooo.test sudo[119804]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:49 np0005486759.ooo.test sudo[119815]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyuw9490h/privsep.sock
Oct 14 09:05:49 np0005486759.ooo.test sudo[119815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:49 np0005486759.ooo.test sudo[119815]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:50 np0005486759.ooo.test sudo[119826]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp6la924p/privsep.sock
Oct 14 09:05:50 np0005486759.ooo.test sudo[119826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:50 np0005486759.ooo.test sudo[119826]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:51 np0005486759.ooo.test sudo[119843]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3pjf_fbt/privsep.sock
Oct 14 09:05:51 np0005486759.ooo.test sudo[119843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:51 np0005486759.ooo.test sudo[119843]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:51 np0005486759.ooo.test sudo[119854]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi7iw7qk_/privsep.sock
Oct 14 09:05:51 np0005486759.ooo.test sudo[119854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:52 np0005486759.ooo.test sudo[119854]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:52 np0005486759.ooo.test sudo[119865]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcdzk5rwh/privsep.sock
Oct 14 09:05:52 np0005486759.ooo.test sudo[119865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:53 np0005486759.ooo.test sudo[119865]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:53 np0005486759.ooo.test sudo[119876]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm5dek9o2/privsep.sock
Oct 14 09:05:53 np0005486759.ooo.test sudo[119876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:54 np0005486759.ooo.test sudo[119876]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:54 np0005486759.ooo.test sudo[119887]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpshfazt0v/privsep.sock
Oct 14 09:05:54 np0005486759.ooo.test sudo[119887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:55 np0005486759.ooo.test sudo[119887]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:55 np0005486759.ooo.test sudo[119898]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyxqjgfan/privsep.sock
Oct 14 09:05:55 np0005486759.ooo.test sudo[119898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:55 np0005486759.ooo.test sudo[119898]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:56 np0005486759.ooo.test sudo[119912]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyfyox80c/privsep.sock
Oct 14 09:05:56 np0005486759.ooo.test sudo[119912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:56 np0005486759.ooo.test sudo[119912]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:57 np0005486759.ooo.test sudo[119926]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoa37oyaj/privsep.sock
Oct 14 09:05:57 np0005486759.ooo.test sudo[119926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:57 np0005486759.ooo.test sudo[119926]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:57 np0005486759.ooo.test sudo[119937]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpntcnwfzl/privsep.sock
Oct 14 09:05:57 np0005486759.ooo.test sudo[119937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:58 np0005486759.ooo.test sudo[119937]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:58 np0005486759.ooo.test sudo[119948]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph5sxrszr/privsep.sock
Oct 14 09:05:58 np0005486759.ooo.test sudo[119948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:05:59 np0005486759.ooo.test sudo[119948]: pam_unix(sudo:session): session closed for user root
Oct 14 09:05:59 np0005486759.ooo.test sudo[119959]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdiqjlzoq/privsep.sock
Oct 14 09:05:59 np0005486759.ooo.test sudo[119959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:00 np0005486759.ooo.test sudo[119959]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:00 np0005486759.ooo.test sudo[119970]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv0ra3nxk/privsep.sock
Oct 14 09:06:00 np0005486759.ooo.test sudo[119970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:00 np0005486759.ooo.test sudo[119970]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:01 np0005486759.ooo.test sudo[119981]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprx0vexf6/privsep.sock
Oct 14 09:06:01 np0005486759.ooo.test sudo[119981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:01 np0005486759.ooo.test sudo[119981]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:02 np0005486759.ooo.test sudo[119998]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnw4_jdj2/privsep.sock
Oct 14 09:06:02 np0005486759.ooo.test sudo[119998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:02 np0005486759.ooo.test sudo[119998]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:02 np0005486759.ooo.test sudo[120009]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6vqzrf0t/privsep.sock
Oct 14 09:06:02 np0005486759.ooo.test sudo[120009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:06:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:06:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:06:03 np0005486759.ooo.test podman[120012]: 2025-10-14 09:06:03.491255076 +0000 UTC m=+0.119661608 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, version=17.1.9, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:06:03 np0005486759.ooo.test sudo[120009]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:03 np0005486759.ooo.test systemd[1]: tmp-crun.A09cWf.mount: Deactivated successfully.
Oct 14 09:06:03 np0005486759.ooo.test podman[120014]: 2025-10-14 09:06:03.507404089 +0000 UTC m=+0.129511794 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.33.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=2, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, 
managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Oct 14 09:06:03 np0005486759.ooo.test podman[120012]: 2025-10-14 09:06:03.52286229 +0000 UTC m=+0.151268842 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, tcib_managed=true, version=17.1.9, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, 
description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.buildah.version=1.33.12)
Oct 14 09:06:03 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:06:03 np0005486759.ooo.test podman[120013]: 2025-10-14 09:06:03.556758637 +0000 UTC m=+0.181003349 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, distribution-scope=public, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:06:03 np0005486759.ooo.test podman[120013]: 2025-10-14 09:06:03.584256173 +0000 UTC m=+0.208500895 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1, io.buildah.version=1.33.12, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, tcib_managed=true, container_name=nova_compute, distribution-scope=public)
Oct 14 09:06:03 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:06:03 np0005486759.ooo.test podman[120014]: 2025-10-14 09:06:03.593884943 +0000 UTC m=+0.215992628 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.buildah.version=1.33.12, release=2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20250721.1, container_name=collectd, version=17.1.9)
Oct 14 09:06:03 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:06:03 np0005486759.ooo.test sudo[120084]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1g0xqjw_/privsep.sock
Oct 14 09:06:03 np0005486759.ooo.test sudo[120084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:04 np0005486759.ooo.test sudo[120084]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:04 np0005486759.ooo.test sudo[120095]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplhd451vr/privsep.sock
Oct 14 09:06:04 np0005486759.ooo.test sudo[120095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:05 np0005486759.ooo.test sudo[120095]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:05 np0005486759.ooo.test sudo[120106]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe2kfeeo7/privsep.sock
Oct 14 09:06:05 np0005486759.ooo.test sudo[120106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:06 np0005486759.ooo.test sudo[120106]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:06 np0005486759.ooo.test sudo[120117]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5v7qoqw1/privsep.sock
Oct 14 09:06:06 np0005486759.ooo.test sudo[120117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:06 np0005486759.ooo.test sudo[120117]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:07 np0005486759.ooo.test sudo[120134]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3egdan3_/privsep.sock
Oct 14 09:06:07 np0005486759.ooo.test sudo[120134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:07 np0005486759.ooo.test sudo[120134]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:08 np0005486759.ooo.test sudo[120145]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe2jy1jvq/privsep.sock
Oct 14 09:06:08 np0005486759.ooo.test sudo[120145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:08 np0005486759.ooo.test sudo[120145]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:08 np0005486759.ooo.test sudo[120156]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp16xbq6bx/privsep.sock
Oct 14 09:06:08 np0005486759.ooo.test sudo[120156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:09 np0005486759.ooo.test sudo[120156]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:09 np0005486759.ooo.test sudo[120167]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpblfm40bc/privsep.sock
Oct 14 09:06:09 np0005486759.ooo.test sudo[120167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:10 np0005486759.ooo.test sudo[120167]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:10 np0005486759.ooo.test sudo[120178]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0xf6h52n/privsep.sock
Oct 14 09:06:10 np0005486759.ooo.test sudo[120178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:11 np0005486759.ooo.test sudo[120178]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:06:11 np0005486759.ooo.test podman[120184]: 2025-10-14 09:06:11.351248189 +0000 UTC m=+0.074402029 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, release=1, build-date=2025-07-21T13:07:59, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, version=17.1.9, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20250721.1, vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 09:06:11 np0005486759.ooo.test sudo[120217]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk9wuyh3l/privsep.sock
Oct 14 09:06:11 np0005486759.ooo.test sudo[120217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:11 np0005486759.ooo.test podman[120184]: 2025-10-14 09:06:11.504686798 +0000 UTC m=+0.227840568 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, version=17.1.9)
Oct 14 09:06:11 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:06:12 np0005486759.ooo.test sudo[120217]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:12 np0005486759.ooo.test sudo[120231]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphyw4oof6/privsep.sock
Oct 14 09:06:12 np0005486759.ooo.test sudo[120231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:12 np0005486759.ooo.test sudo[120231]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:13 np0005486759.ooo.test sudo[120245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkdlhm_04/privsep.sock
Oct 14 09:06:13 np0005486759.ooo.test sudo[120245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: tmp-crun.BZo3qx.mount: Deactivated successfully.
Oct 14 09:06:13 np0005486759.ooo.test podman[120249]: 2025-10-14 09:06:13.470546745 +0000 UTC m=+0.096100704 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, release=1, io.buildah.version=1.33.12, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:06:13 np0005486759.ooo.test podman[120249]: 2025-10-14 09:06:13.511729608 +0000 UTC m=+0.137283577 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Oct 14 09:06:13 np0005486759.ooo.test podman[120249]: unhealthy
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:06:13 np0005486759.ooo.test podman[120250]: 2025-10-14 09:06:13.435948868 +0000 UTC m=+0.062101106 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, batch=17.1_20250721.1, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git)
Oct 14 09:06:13 np0005486759.ooo.test podman[120285]: 2025-10-14 09:06:13.580079186 +0000 UTC m=+0.129000839 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc.)
Oct 14 09:06:13 np0005486759.ooo.test podman[120248]: 2025-10-14 09:06:13.512998838 +0000 UTC m=+0.141296413 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, batch=17.1_20250721.1, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container)
Oct 14 09:06:13 np0005486759.ooo.test podman[120250]: 2025-10-14 09:06:13.620544567 +0000 UTC m=+0.246696815 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, config_id=tripleo_step4, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, release=1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vcs-type=git)
Oct 14 09:06:13 np0005486759.ooo.test podman[120250]: unhealthy
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:06:13 np0005486759.ooo.test podman[120248]: 2025-10-14 09:06:13.644285886 +0000 UTC m=+0.272583481 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20250721.1, tcib_managed=true, build-date=2025-07-21T13:07:52, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64)
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:06:13 np0005486759.ooo.test sudo[120245]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:13 np0005486759.ooo.test sudo[120335]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpps5a2sxs/privsep.sock
Oct 14 09:06:13 np0005486759.ooo.test sudo[120335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:13 np0005486759.ooo.test podman[120285]: 2025-10-14 09:06:13.96547775 +0000 UTC m=+0.514399413 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, release=1, 
io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, container_name=nova_migration_target, batch=17.1_20250721.1, vendor=Red Hat, Inc.)
Oct 14 09:06:13 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:06:14 np0005486759.ooo.test sudo[120335]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:14 np0005486759.ooo.test sudo[120346]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcazte_n2/privsep.sock
Oct 14 09:06:14 np0005486759.ooo.test sudo[120346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:15 np0005486759.ooo.test sudo[120346]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:15 np0005486759.ooo.test sudo[120357]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvom3ukh_/privsep.sock
Oct 14 09:06:15 np0005486759.ooo.test sudo[120357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:16 np0005486759.ooo.test sudo[120357]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:16 np0005486759.ooo.test sudo[120368]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpchwzxkll/privsep.sock
Oct 14 09:06:16 np0005486759.ooo.test sudo[120368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:17 np0005486759.ooo.test sudo[120368]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:17 np0005486759.ooo.test sudo[120379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1hxmmhlm/privsep.sock
Oct 14 09:06:17 np0005486759.ooo.test sudo[120379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:18 np0005486759.ooo.test sudo[120379]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:06:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:06:18 np0005486759.ooo.test podman[120390]: 2025-10-14 09:06:18.219194303 +0000 UTC m=+0.074674696 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-07-21T16:28:53, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:06:18 np0005486759.ooo.test podman[120390]: 2025-10-14 09:06:18.232076044 +0000 UTC m=+0.087556407 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:06:18 np0005486759.ooo.test podman[120390]: unhealthy
Oct 14 09:06:18 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:18 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:06:18 np0005486759.ooo.test podman[120392]: 2025-10-14 09:06:18.266066793 +0000 UTC m=+0.119916296 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, container_name=ovn_controller, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:06:18 np0005486759.ooo.test podman[120392]: 2025-10-14 09:06:18.276850699 +0000 UTC m=+0.130700172 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, io.buildah.version=1.33.12, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, container_name=ovn_controller, architecture=x86_64, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=)
Oct 14 09:06:18 np0005486759.ooo.test podman[120392]: unhealthy
Oct 14 09:06:18 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:18 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:06:18 np0005486759.ooo.test sudo[120434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz0sewpsh/privsep.sock
Oct 14 09:06:18 np0005486759.ooo.test sudo[120434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:18 np0005486759.ooo.test sudo[120434]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:19 np0005486759.ooo.test sudo[120445]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0lq1j5aj/privsep.sock
Oct 14 09:06:19 np0005486759.ooo.test sudo[120445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:19 np0005486759.ooo.test sudo[120445]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:20 np0005486759.ooo.test sudo[120456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgje35ne9/privsep.sock
Oct 14 09:06:20 np0005486759.ooo.test sudo[120456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:20 np0005486759.ooo.test sudo[120456]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:20 np0005486759.ooo.test sudo[120467]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0xwk69yn/privsep.sock
Oct 14 09:06:20 np0005486759.ooo.test sudo[120467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:21 np0005486759.ooo.test sudo[120467]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:21 np0005486759.ooo.test sudo[120478]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpioxm6p9k/privsep.sock
Oct 14 09:06:21 np0005486759.ooo.test sudo[120478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:22 np0005486759.ooo.test sudo[120478]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:22 np0005486759.ooo.test sudo[120489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7bu1cnq0/privsep.sock
Oct 14 09:06:22 np0005486759.ooo.test sudo[120489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:23 np0005486759.ooo.test sudo[120489]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:23 np0005486759.ooo.test sudo[120506]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp08t61b9t/privsep.sock
Oct 14 09:06:23 np0005486759.ooo.test sudo[120506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:24 np0005486759.ooo.test sudo[120506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:24 np0005486759.ooo.test sudo[120517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1yjnmpi2/privsep.sock
Oct 14 09:06:24 np0005486759.ooo.test sudo[120517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:25 np0005486759.ooo.test sudo[120517]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:25 np0005486759.ooo.test sudo[120528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9uzo50tx/privsep.sock
Oct 14 09:06:25 np0005486759.ooo.test sudo[120528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:25 np0005486759.ooo.test sudo[120528]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:26 np0005486759.ooo.test sudo[120539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3t_8ln90/privsep.sock
Oct 14 09:06:26 np0005486759.ooo.test sudo[120539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:26 np0005486759.ooo.test sudo[120539]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:26 np0005486759.ooo.test sudo[120550]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpirybx1mu/privsep.sock
Oct 14 09:06:26 np0005486759.ooo.test sudo[120550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:27 np0005486759.ooo.test sudo[120550]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:27 np0005486759.ooo.test sudo[120561]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxd5lv31i/privsep.sock
Oct 14 09:06:27 np0005486759.ooo.test sudo[120561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:28 np0005486759.ooo.test sudo[120561]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:28 np0005486759.ooo.test sudo[120577]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfrq6rl7e/privsep.sock
Oct 14 09:06:28 np0005486759.ooo.test sudo[120577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:29 np0005486759.ooo.test sudo[120577]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:29 np0005486759.ooo.test sudo[120589]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp62puwrw1/privsep.sock
Oct 14 09:06:29 np0005486759.ooo.test sudo[120589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:30 np0005486759.ooo.test sudo[120589]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:30 np0005486759.ooo.test sudo[120600]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxwy6wekw/privsep.sock
Oct 14 09:06:30 np0005486759.ooo.test sudo[120600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:30 np0005486759.ooo.test sudo[120600]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:31 np0005486759.ooo.test sudo[120611]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp77qc0efj/privsep.sock
Oct 14 09:06:31 np0005486759.ooo.test sudo[120611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:31 np0005486759.ooo.test sudo[120611]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:32 np0005486759.ooo.test sudo[120622]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkskaqrrq/privsep.sock
Oct 14 09:06:32 np0005486759.ooo.test sudo[120622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:32 np0005486759.ooo.test sudo[120622]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:33 np0005486759.ooo.test sudo[120633]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps8p240ag/privsep.sock
Oct 14 09:06:33 np0005486759.ooo.test sudo[120633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:33 np0005486759.ooo.test sudo[120633]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:06:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:06:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:06:33 np0005486759.ooo.test systemd[1]: tmp-crun.aOJqTV.mount: Deactivated successfully.
Oct 14 09:06:33 np0005486759.ooo.test podman[120640]: 2025-10-14 09:06:33.764633962 +0000 UTC m=+0.083335437 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, container_name=nova_compute, vendor=Red Hat, Inc.)
Oct 14 09:06:33 np0005486759.ooo.test podman[120640]: 2025-10-14 09:06:33.785297745 +0000 UTC m=+0.103999250 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:06:33 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:06:33 np0005486759.ooo.test podman[120641]: 2025-10-14 09:06:33.748546371 +0000 UTC m=+0.067046260 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, container_name=collectd, batch=17.1_20250721.1, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, 
io.openshift.expose-services=, release=2, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:06:33 np0005486759.ooo.test podman[120639]: 2025-10-14 09:06:33.845068836 +0000 UTC m=+0.168901481 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, tcib_managed=true, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, version=17.1.9, 
maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:06:33 np0005486759.ooo.test podman[120639]: 2025-10-14 09:06:33.879360185 +0000 UTC m=+0.203192810 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container)
Oct 14 09:06:33 np0005486759.ooo.test sudo[120708]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx4uede0h/privsep.sock
Oct 14 09:06:33 np0005486759.ooo.test sudo[120708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:33 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:06:33 np0005486759.ooo.test podman[120641]: 2025-10-14 09:06:33.932050075 +0000 UTC m=+0.250549984 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, release=2, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, distribution-scope=public, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git)
Oct 14 09:06:33 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:06:34 np0005486759.ooo.test sudo[120708]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:34 np0005486759.ooo.test sudo[120722]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxwqxndyf/privsep.sock
Oct 14 09:06:34 np0005486759.ooo.test sudo[120722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:35 np0005486759.ooo.test sudo[120722]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:35 np0005486759.ooo.test sudo[120733]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpphrc76g2/privsep.sock
Oct 14 09:06:35 np0005486759.ooo.test sudo[120733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:36 np0005486759.ooo.test sudo[120733]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:36 np0005486759.ooo.test sudo[120744]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph0uq2l75/privsep.sock
Oct 14 09:06:36 np0005486759.ooo.test sudo[120744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:37 np0005486759.ooo.test sudo[120744]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:37 np0005486759.ooo.test sudo[120755]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3uoukq43/privsep.sock
Oct 14 09:06:37 np0005486759.ooo.test sudo[120755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:38 np0005486759.ooo.test sudo[120755]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:38 np0005486759.ooo.test sudo[120766]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd8mb0rq1/privsep.sock
Oct 14 09:06:38 np0005486759.ooo.test sudo[120766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:38 np0005486759.ooo.test sudo[120766]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:39 np0005486759.ooo.test sudo[120779]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2obyn_qb/privsep.sock
Oct 14 09:06:39 np0005486759.ooo.test sudo[120779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:39 np0005486759.ooo.test sudo[120779]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:40 np0005486759.ooo.test sudo[120794]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf7yb6g8w/privsep.sock
Oct 14 09:06:40 np0005486759.ooo.test sudo[120794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:40 np0005486759.ooo.test sudo[120794]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:40 np0005486759.ooo.test sudo[120805]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk_k4qt2g/privsep.sock
Oct 14 09:06:40 np0005486759.ooo.test sudo[120805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:41 np0005486759.ooo.test sudo[120805]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:41 np0005486759.ooo.test sudo[120816]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5jyr_175/privsep.sock
Oct 14 09:06:41 np0005486759.ooo.test sudo[120816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:06:41 np0005486759.ooo.test systemd[1]: tmp-crun.6BPPs9.mount: Deactivated successfully.
Oct 14 09:06:41 np0005486759.ooo.test podman[120818]: 2025-10-14 09:06:41.810093712 +0000 UTC m=+0.072161999 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 09:06:41 np0005486759.ooo.test podman[120818]: 2025-10-14 09:06:41.977236917 +0000 UTC m=+0.239305224 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, container_name=metrics_qdr, release=1, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, 
architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 09:06:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:06:42 np0005486759.ooo.test sudo[120816]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:42 np0005486759.ooo.test sudo[120857]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpca2pusco/privsep.sock
Oct 14 09:06:42 np0005486759.ooo.test sudo[120857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:43 np0005486759.ooo.test sudo[120857]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:43 np0005486759.ooo.test sudo[120868]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpufobflj4/privsep.sock
Oct 14 09:06:43 np0005486759.ooo.test sudo[120868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:44 np0005486759.ooo.test sudo[120868]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: tmp-crun.pVr2eB.mount: Deactivated successfully.
Oct 14 09:06:44 np0005486759.ooo.test podman[120875]: 2025-10-14 09:06:44.133495435 +0000 UTC m=+0.079393623 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, version=17.1.9, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Oct 14 09:06:44 np0005486759.ooo.test podman[120882]: 2025-10-14 09:06:44.141781713 +0000 UTC m=+0.086385571 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=)
Oct 14 09:06:44 np0005486759.ooo.test podman[120874]: 2025-10-14 09:06:44.114014208 +0000 UTC m=+0.068624398 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, container_name=logrotate_crond, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 cron, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:06:44 np0005486759.ooo.test podman[120876]: 2025-10-14 09:06:44.172781779 +0000 UTC m=+0.118554614 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, version=17.1.9, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public)
Oct 14 09:06:44 np0005486759.ooo.test podman[120882]: 2025-10-14 09:06:44.182691128 +0000 UTC m=+0.127295006 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.33.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:06:44 np0005486759.ooo.test podman[120882]: unhealthy
Oct 14 09:06:44 np0005486759.ooo.test podman[120874]: 2025-10-14 09:06:44.193633878 +0000 UTC m=+0.148244068 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, batch=17.1_20250721.1)
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:06:44 np0005486759.ooo.test podman[120876]: 2025-10-14 09:06:44.212222087 +0000 UTC m=+0.157994922 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp 
openstack osp-17.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, release=1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, architecture=x86_64)
Oct 14 09:06:44 np0005486759.ooo.test podman[120876]: unhealthy
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:06:44 np0005486759.ooo.test sudo[120957]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo00_wryr/privsep.sock
Oct 14 09:06:44 np0005486759.ooo.test sudo[120957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:44 np0005486759.ooo.test podman[120875]: 2025-10-14 09:06:44.446154993 +0000 UTC m=+0.392053181 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64)
Oct 14 09:06:44 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:06:44 np0005486759.ooo.test sudo[120957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:45 np0005486759.ooo.test sudo[120975]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy9kvy2c3/privsep.sock
Oct 14 09:06:45 np0005486759.ooo.test sudo[120975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:45 np0005486759.ooo.test sudo[120975]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:45 np0005486759.ooo.test sudo[120986]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprwiazrcn/privsep.sock
Oct 14 09:06:45 np0005486759.ooo.test sudo[120986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:46 np0005486759.ooo.test sudo[120986]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:46 np0005486759.ooo.test sudo[120997]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl9hurppi/privsep.sock
Oct 14 09:06:46 np0005486759.ooo.test sudo[120997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:47 np0005486759.ooo.test sudo[120997]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:47 np0005486759.ooo.test sudo[121008]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9pvzjuqn/privsep.sock
Oct 14 09:06:47 np0005486759.ooo.test sudo[121008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:48 np0005486759.ooo.test sudo[121008]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: tmp-crun.errQCk.mount: Deactivated successfully.
Oct 14 09:06:48 np0005486759.ooo.test podman[121015]: 2025-10-14 09:06:48.395685622 +0000 UTC m=+0.068685780 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, version=17.1.9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64)
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: tmp-crun.RVYBR5.mount: Deactivated successfully.
Oct 14 09:06:48 np0005486759.ooo.test podman[121012]: 2025-10-14 09:06:48.471187484 +0000 UTC m=+0.146428432 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9)
Oct 14 09:06:48 np0005486759.ooo.test podman[121015]: 2025-10-14 09:06:48.483691244 +0000 UTC m=+0.156691452 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, distribution-scope=public, vcs-type=git, release=1, batch=17.1_20250721.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vendor=Red Hat, Inc., container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller)
Oct 14 09:06:48 np0005486759.ooo.test podman[121015]: unhealthy
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:06:48 np0005486759.ooo.test podman[121012]: 2025-10-14 09:06:48.509348583 +0000 UTC m=+0.184589511 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, version=17.1.9, release=1, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Oct 14 09:06:48 np0005486759.ooo.test podman[121012]: unhealthy
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:06:48 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:06:48 np0005486759.ooo.test sudo[121059]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwllsx_pf/privsep.sock
Oct 14 09:06:48 np0005486759.ooo.test sudo[121059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:49 np0005486759.ooo.test sudo[121059]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:49 np0005486759.ooo.test sudo[121070]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv6xq8bk_/privsep.sock
Oct 14 09:06:49 np0005486759.ooo.test sudo[121070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:49 np0005486759.ooo.test sudo[121070]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:50 np0005486759.ooo.test sudo[121087]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpptembals/privsep.sock
Oct 14 09:06:50 np0005486759.ooo.test sudo[121087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:50 np0005486759.ooo.test sudo[121087]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:50 np0005486759.ooo.test sudo[121098]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmpao6a5z/privsep.sock
Oct 14 09:06:50 np0005486759.ooo.test sudo[121098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:51 np0005486759.ooo.test sudo[121098]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:51 np0005486759.ooo.test sudo[121109]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp94r014lk/privsep.sock
Oct 14 09:06:51 np0005486759.ooo.test sudo[121109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:52 np0005486759.ooo.test sudo[121109]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:52 np0005486759.ooo.test sudo[121120]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa3xffw_4/privsep.sock
Oct 14 09:06:52 np0005486759.ooo.test sudo[121120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:53 np0005486759.ooo.test sudo[121120]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:53 np0005486759.ooo.test sudo[121131]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5f8z6w9u/privsep.sock
Oct 14 09:06:53 np0005486759.ooo.test sudo[121131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:54 np0005486759.ooo.test sudo[121131]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:54 np0005486759.ooo.test sudo[121142]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxhti4pax/privsep.sock
Oct 14 09:06:54 np0005486759.ooo.test sudo[121142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:55 np0005486759.ooo.test sudo[121142]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:55 np0005486759.ooo.test sudo[121156]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuu9ebl5k/privsep.sock
Oct 14 09:06:55 np0005486759.ooo.test sudo[121156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:55 np0005486759.ooo.test sudo[121156]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:56 np0005486759.ooo.test sudo[121170]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwhale3gq/privsep.sock
Oct 14 09:06:56 np0005486759.ooo.test sudo[121170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:56 np0005486759.ooo.test sudo[121170]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:57 np0005486759.ooo.test sudo[121181]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu21l6ri_/privsep.sock
Oct 14 09:06:57 np0005486759.ooo.test sudo[121181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:57 np0005486759.ooo.test sudo[121181]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:57 np0005486759.ooo.test sudo[121192]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbmf9foc1/privsep.sock
Oct 14 09:06:57 np0005486759.ooo.test sudo[121192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:58 np0005486759.ooo.test sudo[121192]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:58 np0005486759.ooo.test sudo[121203]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplkzg6ij6/privsep.sock
Oct 14 09:06:58 np0005486759.ooo.test sudo[121203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:06:59 np0005486759.ooo.test sudo[121203]: pam_unix(sudo:session): session closed for user root
Oct 14 09:06:59 np0005486759.ooo.test sudo[121214]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7b043sto/privsep.sock
Oct 14 09:06:59 np0005486759.ooo.test sudo[121214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:00 np0005486759.ooo.test sudo[121214]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:00 np0005486759.ooo.test sudo[121225]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy8t6m1hh/privsep.sock
Oct 14 09:07:00 np0005486759.ooo.test sudo[121225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:01 np0005486759.ooo.test sudo[121225]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:01 np0005486759.ooo.test sudo[121242]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ksmq_9d/privsep.sock
Oct 14 09:07:01 np0005486759.ooo.test sudo[121242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:01 np0005486759.ooo.test sudo[121242]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:01 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:07:01 np0005486759.ooo.test recover_tripleo_nova_virtqemud[121249]: 47951
Oct 14 09:07:01 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:07:01 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:07:02 np0005486759.ooo.test sudo[121255]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkg6j5r_0/privsep.sock
Oct 14 09:07:02 np0005486759.ooo.test sudo[121255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:02 np0005486759.ooo.test sudo[121255]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:03 np0005486759.ooo.test sudo[121266]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppq39m781/privsep.sock
Oct 14 09:07:03 np0005486759.ooo.test sudo[121266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:03 np0005486759.ooo.test sudo[121266]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:03 np0005486759.ooo.test sudo[121277]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt7h3ozs9/privsep.sock
Oct 14 09:07:03 np0005486759.ooo.test sudo[121277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:07:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:07:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:07:04 np0005486759.ooo.test podman[121281]: 2025-10-14 09:07:04.097984577 +0000 UTC m=+0.071769377 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, batch=17.1_20250721.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, release=2)
Oct 14 09:07:04 np0005486759.ooo.test podman[121281]: 2025-10-14 09:07:04.109080042 +0000 UTC m=+0.082864832 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=openstack-collectd-container, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, io.openshift.expose-services=)
Oct 14 09:07:04 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:07:04 np0005486759.ooo.test systemd[1]: tmp-crun.D9oaAQ.mount: Deactivated successfully.
Oct 14 09:07:04 np0005486759.ooo.test podman[121280]: 2025-10-14 09:07:04.152579117 +0000 UTC m=+0.127873134 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=)
Oct 14 09:07:04 np0005486759.ooo.test podman[121279]: 2025-10-14 09:07:04.162995671 +0000 UTC m=+0.137056979 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.33.12, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.9, distribution-scope=public, build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1)
Oct 14 09:07:04 np0005486759.ooo.test podman[121279]: 2025-10-14 09:07:04.199347754 +0000 UTC m=+0.173409042 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
io.openshift.expose-services=, build-date=2025-07-21T13:27:15, container_name=iscsid, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12)
Oct 14 09:07:04 np0005486759.ooo.test podman[121280]: 2025-10-14 09:07:04.199608042 +0000 UTC m=+0.174902049 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Oct 14 09:07:04 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:07:04 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:07:04 np0005486759.ooo.test sudo[121277]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:04 np0005486759.ooo.test sudo[121349]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj72cgvqe/privsep.sock
Oct 14 09:07:04 np0005486759.ooo.test sudo[121349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:05 np0005486759.ooo.test sudo[121349]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:05 np0005486759.ooo.test sudo[121360]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpygdfm871/privsep.sock
Oct 14 09:07:05 np0005486759.ooo.test sudo[121360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:06 np0005486759.ooo.test sudo[121360]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:06 np0005486759.ooo.test sudo[121377]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1yez65i_/privsep.sock
Oct 14 09:07:06 np0005486759.ooo.test sudo[121377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:07 np0005486759.ooo.test sudo[121377]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:07 np0005486759.ooo.test sudo[121388]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7qkfnd_2/privsep.sock
Oct 14 09:07:07 np0005486759.ooo.test sudo[121388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:08 np0005486759.ooo.test sudo[121388]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:08 np0005486759.ooo.test sudo[121399]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptcxftvb8/privsep.sock
Oct 14 09:07:08 np0005486759.ooo.test sudo[121399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:09 np0005486759.ooo.test sudo[121399]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:09 np0005486759.ooo.test sudo[121410]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph8o92lj8/privsep.sock
Oct 14 09:07:09 np0005486759.ooo.test sudo[121410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:09 np0005486759.ooo.test sudo[121410]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:10 np0005486759.ooo.test sudo[121421]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn_xgnbhc/privsep.sock
Oct 14 09:07:10 np0005486759.ooo.test sudo[121421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:10 np0005486759.ooo.test sudo[121421]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:10 np0005486759.ooo.test sudo[121432]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdfnhu6rz/privsep.sock
Oct 14 09:07:10 np0005486759.ooo.test sudo[121432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:11 np0005486759.ooo.test sudo[121432]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:11 np0005486759.ooo.test sudo[121449]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnurqdn7f/privsep.sock
Oct 14 09:07:11 np0005486759.ooo.test sudo[121449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:07:12 np0005486759.ooo.test sudo[121449]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:12 np0005486759.ooo.test podman[121453]: 2025-10-14 09:07:12.445127361 +0000 UTC m=+0.072529550 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, tcib_managed=true)
Oct 14 09:07:12 np0005486759.ooo.test sudo[121486]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcfdk30il/privsep.sock
Oct 14 09:07:12 np0005486759.ooo.test sudo[121486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:12 np0005486759.ooo.test podman[121453]: 2025-10-14 09:07:12.668336014 +0000 UTC m=+0.295738163 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd)
Oct 14 09:07:12 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:07:13 np0005486759.ooo.test sudo[121486]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:13 np0005486759.ooo.test sudo[121499]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6x7zc70w/privsep.sock
Oct 14 09:07:13 np0005486759.ooo.test sudo[121499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:14 np0005486759.ooo.test sudo[121499]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:07:14 np0005486759.ooo.test sudo[121540]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp59nj1olk/privsep.sock
Oct 14 09:07:14 np0005486759.ooo.test sudo[121540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: tmp-crun.C3ru9i.mount: Deactivated successfully.
Oct 14 09:07:14 np0005486759.ooo.test podman[121507]: 2025-10-14 09:07:14.455065402 +0000 UTC m=+0.076599607 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, release=1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, version=17.1.9, config_id=tripleo_step4, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, 
build-date=2025-07-21T13:07:52, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:07:14 np0005486759.ooo.test podman[121507]: 2025-10-14 09:07:14.487359237 +0000 UTC m=+0.108893452 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, build-date=2025-07-21T13:07:52, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1)
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:07:14 np0005486759.ooo.test podman[121509]: 2025-10-14 09:07:14.561008282 +0000 UTC m=+0.176067375 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, config_id=tripleo_step4, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9)
Oct 14 09:07:14 np0005486759.ooo.test podman[121508]: 2025-10-14 09:07:14.519102027 +0000 UTC m=+0.137190464 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, release=1, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4)
Oct 14 09:07:14 np0005486759.ooo.test podman[121549]: 2025-10-14 09:07:14.540127961 +0000 UTC m=+0.059280777 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, release=1, config_id=tripleo_step4, container_name=nova_migration_target, build-date=2025-07-21T14:48:37, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, vcs-type=git)
Oct 14 09:07:14 np0005486759.ooo.test podman[121509]: 2025-10-14 09:07:14.597748586 +0000 UTC m=+0.212807699 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 09:07:14 np0005486759.ooo.test podman[121509]: unhealthy
Oct 14 09:07:14 np0005486759.ooo.test podman[121508]: 2025-10-14 09:07:14.604284119 +0000 UTC m=+0.222372536 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1)
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:07:14 np0005486759.ooo.test podman[121508]: unhealthy
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:07:14 np0005486759.ooo.test podman[121549]: 2025-10-14 09:07:14.890638198 +0000 UTC m=+0.409791014 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, container_name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:07:14 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:07:14 np0005486759.ooo.test sudo[121540]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:15 np0005486759.ooo.test sudo[121596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1l2qke9e/privsep.sock
Oct 14 09:07:15 np0005486759.ooo.test sudo[121596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:15 np0005486759.ooo.test systemd[1]: tmp-crun.9MHWCc.mount: Deactivated successfully.
Oct 14 09:07:15 np0005486759.ooo.test sudo[121596]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:16 np0005486759.ooo.test sudo[121607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprtjawlxt/privsep.sock
Oct 14 09:07:16 np0005486759.ooo.test sudo[121607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:16 np0005486759.ooo.test sudo[121607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:16 np0005486759.ooo.test sudo[121623]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyoakpe0q/privsep.sock
Oct 14 09:07:16 np0005486759.ooo.test sudo[121623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:17 np0005486759.ooo.test sudo[121623]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:17 np0005486759.ooo.test sudo[121635]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4lxaaosc/privsep.sock
Oct 14 09:07:17 np0005486759.ooo.test sudo[121635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:18 np0005486759.ooo.test sudo[121635]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:18 np0005486759.ooo.test sudo[121646]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpso0f4b92/privsep.sock
Oct 14 09:07:18 np0005486759.ooo.test sudo[121646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:07:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:07:18 np0005486759.ooo.test podman[121648]: 2025-10-14 09:07:18.724717612 +0000 UTC m=+0.076924148 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, release=1, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.9, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:07:18 np0005486759.ooo.test podman[121648]: 2025-10-14 09:07:18.737327284 +0000 UTC m=+0.089533790 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ovn_metadata_agent, release=1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Oct 14 09:07:18 np0005486759.ooo.test podman[121648]: unhealthy
Oct 14 09:07:18 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:18 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:07:18 np0005486759.ooo.test podman[121649]: 2025-10-14 09:07:18.700049473 +0000 UTC m=+0.053639541 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, container_name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 09:07:18 np0005486759.ooo.test podman[121649]: 2025-10-14 09:07:18.781592353 +0000 UTC m=+0.135182401 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, version=17.1.9, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, release=1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 09:07:18 np0005486759.ooo.test podman[121649]: unhealthy
Oct 14 09:07:18 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:18 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:07:19 np0005486759.ooo.test sudo[121646]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:19 np0005486759.ooo.test sudo[121697]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzdbtgbko/privsep.sock
Oct 14 09:07:19 np0005486759.ooo.test sudo[121697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:20 np0005486759.ooo.test sudo[121697]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:20 np0005486759.ooo.test sudo[121708]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx6vt9pwt/privsep.sock
Oct 14 09:07:20 np0005486759.ooo.test sudo[121708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:20 np0005486759.ooo.test sudo[121708]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:21 np0005486759.ooo.test sudo[121719]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp797kz2_l/privsep.sock
Oct 14 09:07:21 np0005486759.ooo.test sudo[121719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:21 np0005486759.ooo.test sudo[121719]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:22 np0005486759.ooo.test sudo[121730]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd1qw2me6/privsep.sock
Oct 14 09:07:22 np0005486759.ooo.test sudo[121730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:22 np0005486759.ooo.test sudo[121730]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:22 np0005486759.ooo.test sudo[121747]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppeqysdfp/privsep.sock
Oct 14 09:07:22 np0005486759.ooo.test sudo[121747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:23 np0005486759.ooo.test sudo[121747]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:23 np0005486759.ooo.test sudo[121758]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphvwl6uz2/privsep.sock
Oct 14 09:07:23 np0005486759.ooo.test sudo[121758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:24 np0005486759.ooo.test sudo[121758]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:24 np0005486759.ooo.test sudo[121769]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5kao2x0i/privsep.sock
Oct 14 09:07:24 np0005486759.ooo.test sudo[121769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:25 np0005486759.ooo.test sudo[121769]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:25 np0005486759.ooo.test sudo[121780]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1ocp0p1h/privsep.sock
Oct 14 09:07:25 np0005486759.ooo.test sudo[121780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:26 np0005486759.ooo.test sudo[121780]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:26 np0005486759.ooo.test sudo[121791]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg3kzkwsa/privsep.sock
Oct 14 09:07:26 np0005486759.ooo.test sudo[121791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:26 np0005486759.ooo.test sudo[121791]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:27 np0005486759.ooo.test sudo[121802]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpulmfo31n/privsep.sock
Oct 14 09:07:27 np0005486759.ooo.test sudo[121802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:27 np0005486759.ooo.test sudo[121802]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:28 np0005486759.ooo.test sudo[121819]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0onekkv3/privsep.sock
Oct 14 09:07:28 np0005486759.ooo.test sudo[121819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:28 np0005486759.ooo.test sudo[121819]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:28 np0005486759.ooo.test sudo[121830]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppgxn49kt/privsep.sock
Oct 14 09:07:28 np0005486759.ooo.test sudo[121830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:29 np0005486759.ooo.test sudo[121830]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:29 np0005486759.ooo.test sudo[121841]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpympvov50/privsep.sock
Oct 14 09:07:29 np0005486759.ooo.test sudo[121841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:30 np0005486759.ooo.test sudo[121841]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:30 np0005486759.ooo.test sudo[121852]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd2t1p0id/privsep.sock
Oct 14 09:07:30 np0005486759.ooo.test sudo[121852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:31 np0005486759.ooo.test sudo[121852]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:31 np0005486759.ooo.test sudo[121863]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprk0fabnj/privsep.sock
Oct 14 09:07:31 np0005486759.ooo.test sudo[121863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:32 np0005486759.ooo.test sudo[121863]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:32 np0005486759.ooo.test sudo[121874]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpypexp9ee/privsep.sock
Oct 14 09:07:32 np0005486759.ooo.test sudo[121874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:33 np0005486759.ooo.test sudo[121874]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:33 np0005486759.ooo.test sudo[121890]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpola7p4u5/privsep.sock
Oct 14 09:07:33 np0005486759.ooo.test sudo[121890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:33 np0005486759.ooo.test sudo[121890]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:34 np0005486759.ooo.test sudo[121902]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvp35x2v3/privsep.sock
Oct 14 09:07:34 np0005486759.ooo.test sudo[121902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:07:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:07:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:07:34 np0005486759.ooo.test podman[121905]: 2025-10-14 09:07:34.424339862 +0000 UTC m=+0.055327854 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, batch=17.1_20250721.1, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, architecture=x86_64, version=17.1.9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 09:07:34 np0005486759.ooo.test podman[121906]: 2025-10-14 09:07:34.480743249 +0000 UTC m=+0.108622185 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:07:34 np0005486759.ooo.test podman[121906]: 2025-10-14 09:07:34.501239377 +0000 UTC m=+0.129118283 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, config_id=tripleo_step5, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:07:34 np0005486759.ooo.test podman[121907]: 2025-10-14 09:07:34.457622039 +0000 UTC m=+0.081146008 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, container_name=collectd, io.buildah.version=1.33.12, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=2, config_id=tripleo_step3)
Oct 14 09:07:34 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:07:34 np0005486759.ooo.test podman[121907]: 2025-10-14 09:07:34.538106906 +0000 UTC m=+0.161630885 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhosp17/openstack-collectd, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:07:34 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:07:34 np0005486759.ooo.test podman[121905]: 2025-10-14 09:07:34.558401988 +0000 UTC m=+0.189389990 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, build-date=2025-07-21T13:27:15, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, release=1, name=rhosp17/openstack-iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid)
Oct 14 09:07:34 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:07:34 np0005486759.ooo.test sudo[121902]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:35 np0005486759.ooo.test sudo[121978]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm4ex5xb7/privsep.sock
Oct 14 09:07:35 np0005486759.ooo.test sudo[121978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:35 np0005486759.ooo.test sudo[121978]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:35 np0005486759.ooo.test sudo[121989]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxu7vjbdy/privsep.sock
Oct 14 09:07:35 np0005486759.ooo.test sudo[121989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:36 np0005486759.ooo.test sudo[121989]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:36 np0005486759.ooo.test sudo[122000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplg7myshv/privsep.sock
Oct 14 09:07:36 np0005486759.ooo.test sudo[122000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:37 np0005486759.ooo.test sudo[122000]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:37 np0005486759.ooo.test sudo[122011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_v1cubiq/privsep.sock
Oct 14 09:07:37 np0005486759.ooo.test sudo[122011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:38 np0005486759.ooo.test sudo[122011]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:38 np0005486759.ooo.test sudo[122024]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1bjrjtm7/privsep.sock
Oct 14 09:07:38 np0005486759.ooo.test sudo[122024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:39 np0005486759.ooo.test sudo[122024]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:39 np0005486759.ooo.test sudo[122039]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpal1_o8fz/privsep.sock
Oct 14 09:07:39 np0005486759.ooo.test sudo[122039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:39 np0005486759.ooo.test sudo[122039]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:40 np0005486759.ooo.test sudo[122050]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp289zbr93/privsep.sock
Oct 14 09:07:40 np0005486759.ooo.test sudo[122050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:40 np0005486759.ooo.test sudo[122050]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:41 np0005486759.ooo.test sudo[122061]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0l2nuoq6/privsep.sock
Oct 14 09:07:41 np0005486759.ooo.test sudo[122061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:41 np0005486759.ooo.test sudo[122061]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:42 np0005486759.ooo.test sudo[122072]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy27h4x7l/privsep.sock
Oct 14 09:07:42 np0005486759.ooo.test sudo[122072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:42 np0005486759.ooo.test sudo[122072]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:42 np0005486759.ooo.test sudo[122083]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc7k71ntf/privsep.sock
Oct 14 09:07:42 np0005486759.ooo.test sudo[122083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:07:43 np0005486759.ooo.test systemd[1]: tmp-crun.jZB5UB.mount: Deactivated successfully.
Oct 14 09:07:43 np0005486759.ooo.test podman[122085]: 2025-10-14 09:07:43.035581043 +0000 UTC m=+0.061301511 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true)
Oct 14 09:07:43 np0005486759.ooo.test podman[122085]: 2025-10-14 09:07:43.205054471 +0000 UTC m=+0.230774909 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, config_id=tripleo_step1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true)
Oct 14 09:07:43 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:07:43 np0005486759.ooo.test sudo[122083]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:43 np0005486759.ooo.test sudo[122126]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkelm3mk8/privsep.sock
Oct 14 09:07:43 np0005486759.ooo.test sudo[122126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:44 np0005486759.ooo.test sudo[122126]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:44 np0005486759.ooo.test sudo[122141]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa3zqjx7v/privsep.sock
Oct 14 09:07:44 np0005486759.ooo.test sudo[122141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:07:44 np0005486759.ooo.test podman[122147]: 2025-10-14 09:07:44.76449316 +0000 UTC m=+0.076582246 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12)
Oct 14 09:07:44 np0005486759.ooo.test podman[122143]: 2025-10-14 09:07:44.740525944 +0000 UTC m=+0.065841202 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, container_name=logrotate_crond, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1, com.redhat.component=openstack-cron-container, version=17.1.9, tcib_managed=true, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, build-date=2025-07-21T13:07:52, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible)
Oct 14 09:07:44 np0005486759.ooo.test podman[122144]: 2025-10-14 09:07:44.807270042 +0000 UTC m=+0.126966705 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, batch=17.1_20250721.1, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, container_name=ceilometer_agent_compute)
Oct 14 09:07:44 np0005486759.ooo.test podman[122144]: 2025-10-14 09:07:44.825306054 +0000 UTC m=+0.145002727 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 09:07:44 np0005486759.ooo.test podman[122144]: unhealthy
Oct 14 09:07:44 np0005486759.ooo.test podman[122143]: 2025-10-14 09:07:44.83223435 +0000 UTC m=+0.157549578 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, build-date=2025-07-21T13:07:52, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, 
com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, release=1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12)
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:07:44 np0005486759.ooo.test podman[122147]: 2025-10-14 09:07:44.88040053 +0000 UTC m=+0.192489596 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, 
build-date=2025-07-21T15:29:47, batch=17.1_20250721.1, vcs-type=git, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9)
Oct 14 09:07:44 np0005486759.ooo.test podman[122147]: unhealthy
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:44 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:07:45 np0005486759.ooo.test sudo[122141]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:07:45 np0005486759.ooo.test podman[122203]: 2025-10-14 09:07:45.326265777 +0000 UTC m=+0.077009020 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Oct 14 09:07:45 np0005486759.ooo.test sudo[122231]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqleiws7k/privsep.sock
Oct 14 09:07:45 np0005486759.ooo.test sudo[122231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:45 np0005486759.ooo.test podman[122203]: 2025-10-14 09:07:45.676853586 +0000 UTC m=+0.427596839 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.buildah.version=1.33.12, container_name=nova_migration_target, 
managed_by=tripleo_ansible, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute)
Oct 14 09:07:45 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:07:45 np0005486759.ooo.test systemd[1]: tmp-crun.GfKNEX.mount: Deactivated successfully.
Oct 14 09:07:46 np0005486759.ooo.test sudo[122231]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:46 np0005486759.ooo.test sudo[122243]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjyn8t2iq/privsep.sock
Oct 14 09:07:46 np0005486759.ooo.test sudo[122243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:46 np0005486759.ooo.test sudo[122243]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:47 np0005486759.ooo.test sudo[122254]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkhtn4ky7/privsep.sock
Oct 14 09:07:47 np0005486759.ooo.test sudo[122254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:47 np0005486759.ooo.test sudo[122254]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:48 np0005486759.ooo.test sudo[122265]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo5_dr6m9/privsep.sock
Oct 14 09:07:48 np0005486759.ooo.test sudo[122265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:48 np0005486759.ooo.test sudo[122265]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:49 np0005486759.ooo.test sudo[122276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdl4b7a3t/privsep.sock
Oct 14 09:07:49 np0005486759.ooo.test sudo[122276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:07:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:07:49 np0005486759.ooo.test podman[122278]: 2025-10-14 09:07:49.107079101 +0000 UTC m=+0.058145891 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, 
managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, release=1)
Oct 14 09:07:49 np0005486759.ooo.test podman[122278]: 2025-10-14 09:07:49.116274138 +0000 UTC m=+0.067340918 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-07-21T16:28:53, release=1, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 09:07:49 np0005486759.ooo.test podman[122278]: unhealthy
Oct 14 09:07:49 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:49 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:07:49 np0005486759.ooo.test podman[122279]: 2025-10-14 09:07:49.188775886 +0000 UTC m=+0.130111513 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, version=17.1.9, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, release=1, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:07:49 np0005486759.ooo.test podman[122279]: 2025-10-14 09:07:49.202371139 +0000 UTC m=+0.143706796 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, container_name=ovn_controller, release=1, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, version=17.1.9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4)
Oct 14 09:07:49 np0005486759.ooo.test podman[122279]: unhealthy
Oct 14 09:07:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:07:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:07:49 np0005486759.ooo.test sudo[122276]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:49 np0005486759.ooo.test sudo[122332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4o57hyt5/privsep.sock
Oct 14 09:07:49 np0005486759.ooo.test sudo[122332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:50 np0005486759.ooo.test sudo[122332]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:50 np0005486759.ooo.test sudo[122343]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprovq8m7m/privsep.sock
Oct 14 09:07:50 np0005486759.ooo.test sudo[122343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:51 np0005486759.ooo.test sudo[122343]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:51 np0005486759.ooo.test sudo[122354]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzqdt_wa7/privsep.sock
Oct 14 09:07:51 np0005486759.ooo.test sudo[122354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:52 np0005486759.ooo.test sudo[122354]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:52 np0005486759.ooo.test sudo[122365]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgiq9000n/privsep.sock
Oct 14 09:07:52 np0005486759.ooo.test sudo[122365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:53 np0005486759.ooo.test sudo[122365]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:53 np0005486759.ooo.test sudo[122376]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpod1indgc/privsep.sock
Oct 14 09:07:53 np0005486759.ooo.test sudo[122376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:54 np0005486759.ooo.test sudo[122376]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:54 np0005486759.ooo.test sudo[122387]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfe36u3al/privsep.sock
Oct 14 09:07:54 np0005486759.ooo.test sudo[122387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:54 np0005486759.ooo.test sudo[122387]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:55 np0005486759.ooo.test sudo[122404]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphna0c25x/privsep.sock
Oct 14 09:07:55 np0005486759.ooo.test sudo[122404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:55 np0005486759.ooo.test sudo[122404]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:56 np0005486759.ooo.test sudo[122415]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5f548439/privsep.sock
Oct 14 09:07:56 np0005486759.ooo.test sudo[122415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:56 np0005486759.ooo.test sudo[122415]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:56 np0005486759.ooo.test sudo[122426]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_49jwdfc/privsep.sock
Oct 14 09:07:56 np0005486759.ooo.test sudo[122426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:57 np0005486759.ooo.test sudo[122426]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:57 np0005486759.ooo.test sudo[122437]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpchc2i9sv/privsep.sock
Oct 14 09:07:57 np0005486759.ooo.test sudo[122437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:58 np0005486759.ooo.test sudo[122437]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:58 np0005486759.ooo.test sudo[122448]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5_5znxc5/privsep.sock
Oct 14 09:07:58 np0005486759.ooo.test sudo[122448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:59 np0005486759.ooo.test sudo[122448]: pam_unix(sudo:session): session closed for user root
Oct 14 09:07:59 np0005486759.ooo.test sudo[122459]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmvp2z6oq/privsep.sock
Oct 14 09:07:59 np0005486759.ooo.test sudo[122459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:07:59 np0005486759.ooo.test sudo[122459]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:00 np0005486759.ooo.test sudo[122475]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmr50ksfa/privsep.sock
Oct 14 09:08:00 np0005486759.ooo.test sudo[122475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:00 np0005486759.ooo.test sudo[122475]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:00 np0005486759.ooo.test sudo[122487]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy04ttks2/privsep.sock
Oct 14 09:08:00 np0005486759.ooo.test sudo[122487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:01 np0005486759.ooo.test sudo[122487]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:01 np0005486759.ooo.test sudo[122498]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2q8riqq2/privsep.sock
Oct 14 09:08:01 np0005486759.ooo.test sudo[122498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:02 np0005486759.ooo.test sudo[122498]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:02 np0005486759.ooo.test sudo[122509]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyj14t7r3/privsep.sock
Oct 14 09:08:02 np0005486759.ooo.test sudo[122509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:03 np0005486759.ooo.test sudo[122509]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:03 np0005486759.ooo.test sudo[122520]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzoogynq9/privsep.sock
Oct 14 09:08:03 np0005486759.ooo.test sudo[122520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:04 np0005486759.ooo.test sudo[122520]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:04 np0005486759.ooo.test sudo[122531]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps5q0l13z/privsep.sock
Oct 14 09:08:04 np0005486759.ooo.test sudo[122531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:04 np0005486759.ooo.test sudo[122531]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:08:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:08:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:08:05 np0005486759.ooo.test systemd[1]: tmp-crun.hQb7VC.mount: Deactivated successfully.
Oct 14 09:08:05 np0005486759.ooo.test podman[122538]: 2025-10-14 09:08:05.016654791 +0000 UTC m=+0.074439989 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, tcib_managed=true, batch=17.1_20250721.1, container_name=nova_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible)
Oct 14 09:08:05 np0005486759.ooo.test podman[122536]: 2025-10-14 09:08:04.991810128 +0000 UTC m=+0.057845323 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, tcib_managed=true, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid)
Oct 14 09:08:05 np0005486759.ooo.test podman[122538]: 2025-10-14 09:08:05.066341759 +0000 UTC m=+0.124126957 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20250721.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=nova_compute, vcs-type=git, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9)
Oct 14 09:08:05 np0005486759.ooo.test podman[122536]: 2025-10-14 09:08:05.075337089 +0000 UTC m=+0.141372304 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container)
Oct 14 09:08:05 np0005486759.ooo.test systemd[1]: tmp-crun.sj9GBY.mount: Deactivated successfully.
Oct 14 09:08:05 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:08:05 np0005486759.ooo.test podman[122544]: 2025-10-14 09:08:05.08788963 +0000 UTC m=+0.143731867 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.expose-services=, container_name=collectd, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Oct 14 09:08:05 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:08:05 np0005486759.ooo.test podman[122544]: 2025-10-14 09:08:05.098210412 +0000 UTC m=+0.154052629 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, config_id=tripleo_step3, version=17.1.9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:08:05 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:08:05 np0005486759.ooo.test sudo[122602]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2o5h3buz/privsep.sock
Oct 14 09:08:05 np0005486759.ooo.test sudo[122602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:05 np0005486759.ooo.test sudo[122602]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:06 np0005486759.ooo.test sudo[122619]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp82q54o78/privsep.sock
Oct 14 09:08:06 np0005486759.ooo.test sudo[122619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:06 np0005486759.ooo.test sudo[122619]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:06 np0005486759.ooo.test sudo[122630]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8nhldqnv/privsep.sock
Oct 14 09:08:06 np0005486759.ooo.test sudo[122630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:07 np0005486759.ooo.test sudo[122630]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:07 np0005486759.ooo.test sudo[122641]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp52aooogt/privsep.sock
Oct 14 09:08:07 np0005486759.ooo.test sudo[122641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:08 np0005486759.ooo.test sudo[122641]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:08 np0005486759.ooo.test sudo[122652]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu8pt9u_g/privsep.sock
Oct 14 09:08:08 np0005486759.ooo.test sudo[122652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:09 np0005486759.ooo.test sudo[122652]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:09 np0005486759.ooo.test sudo[122663]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph2kspv7q/privsep.sock
Oct 14 09:08:09 np0005486759.ooo.test sudo[122663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:10 np0005486759.ooo.test sudo[122663]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:10 np0005486759.ooo.test sudo[122674]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd7ym_uby/privsep.sock
Oct 14 09:08:10 np0005486759.ooo.test sudo[122674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:11 np0005486759.ooo.test sudo[122674]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:11 np0005486759.ooo.test sudo[122691]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxn2qkp59/privsep.sock
Oct 14 09:08:11 np0005486759.ooo.test sudo[122691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:11 np0005486759.ooo.test sudo[122691]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:12 np0005486759.ooo.test sudo[122702]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuhlojsmh/privsep.sock
Oct 14 09:08:12 np0005486759.ooo.test sudo[122702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:12 np0005486759.ooo.test sudo[122702]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:13 np0005486759.ooo.test sudo[122713]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk0bgefxt/privsep.sock
Oct 14 09:08:13 np0005486759.ooo.test sudo[122713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:08:13 np0005486759.ooo.test systemd[1]: tmp-crun.iSRpEE.mount: Deactivated successfully.
Oct 14 09:08:13 np0005486759.ooo.test podman[122716]: 2025-10-14 09:08:13.431862917 +0000 UTC m=+0.068496445 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, 
tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Oct 14 09:08:13 np0005486759.ooo.test podman[122716]: 2025-10-14 09:08:13.631736652 +0000 UTC m=+0.268370190 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Oct 14 09:08:13 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:08:13 np0005486759.ooo.test sudo[122713]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:13 np0005486759.ooo.test sudo[122753]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkcuofjr4/privsep.sock
Oct 14 09:08:13 np0005486759.ooo.test sudo[122753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:14 np0005486759.ooo.test sudo[122753]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:14 np0005486759.ooo.test sudo[122764]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9zyivxms/privsep.sock
Oct 14 09:08:14 np0005486759.ooo.test sudo[122764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:15 np0005486759.ooo.test sudo[122764]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:08:15 np0005486759.ooo.test podman[122769]: 2025-10-14 09:08:15.412773253 +0000 UTC m=+0.058322477 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vendor=Red Hat, Inc., container_name=logrotate_crond, 
name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:08:15 np0005486759.ooo.test podman[122769]: 2025-10-14 09:08:15.422310971 +0000 UTC m=+0.067860195 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, release=1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, build-date=2025-07-21T13:07:52, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20250721.1, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9)
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:08:15 np0005486759.ooo.test podman[122771]: 2025-10-14 09:08:15.465489555 +0000 UTC m=+0.107445507 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, tcib_managed=true, vcs-type=git, batch=17.1_20250721.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, io.openshift.expose-services=, release=1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:08:15 np0005486759.ooo.test podman[122771]: 2025-10-14 09:08:15.483639411 +0000 UTC m=+0.125595433 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, version=17.1.9, build-date=2025-07-21T14:45:33, container_name=ceilometer_agent_compute, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Oct 14 09:08:15 np0005486759.ooo.test podman[122771]: unhealthy
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: tmp-crun.nwxY8n.mount: Deactivated successfully.
Oct 14 09:08:15 np0005486759.ooo.test podman[122777]: 2025-10-14 09:08:15.607440277 +0000 UTC m=+0.247077927 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47)
Oct 14 09:08:15 np0005486759.ooo.test podman[122777]: 2025-10-14 09:08:15.641564719 +0000 UTC m=+0.281202459 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:08:15 np0005486759.ooo.test podman[122777]: unhealthy
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:15 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:08:15 np0005486759.ooo.test sudo[122831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn0f_p5qf/privsep.sock
Oct 14 09:08:15 np0005486759.ooo.test sudo[122831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:16 np0005486759.ooo.test sudo[122831]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:08:16 np0005486759.ooo.test podman[122840]: 2025-10-14 09:08:16.374478387 +0000 UTC m=+0.046656804 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64)
Oct 14 09:08:16 np0005486759.ooo.test sudo[122869]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp5mgtaqo/privsep.sock
Oct 14 09:08:16 np0005486759.ooo.test sudo[122869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:16 np0005486759.ooo.test podman[122840]: 2025-10-14 09:08:16.71524932 +0000 UTC m=+0.387427757 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9)
Oct 14 09:08:16 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:08:17 np0005486759.ooo.test sudo[122869]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:17 np0005486759.ooo.test sudo[122880]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp790z2p9k/privsep.sock
Oct 14 09:08:17 np0005486759.ooo.test sudo[122880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:18 np0005486759.ooo.test sudo[122880]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:18 np0005486759.ooo.test sudo[122891]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4s6xyceq/privsep.sock
Oct 14 09:08:18 np0005486759.ooo.test sudo[122891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:18 np0005486759.ooo.test sudo[122891]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:19 np0005486759.ooo.test sudo[122902]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl2syvxt8/privsep.sock
Oct 14 09:08:19 np0005486759.ooo.test sudo[122902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:08:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:08:19 np0005486759.ooo.test podman[122904]: 2025-10-14 09:08:19.240911233 +0000 UTC m=+0.070871638 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, vcs-type=git, version=17.1.9, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 09:08:19 np0005486759.ooo.test podman[122904]: 2025-10-14 09:08:19.277459841 +0000 UTC m=+0.107420296 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, distribution-scope=public, architecture=x86_64, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true)
Oct 14 09:08:19 np0005486759.ooo.test podman[122904]: unhealthy
Oct 14 09:08:19 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:19 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:08:19 np0005486759.ooo.test systemd[1]: tmp-crun.fzrkDT.mount: Deactivated successfully.
Oct 14 09:08:19 np0005486759.ooo.test podman[122922]: 2025-10-14 09:08:19.336305294 +0000 UTC m=+0.086061002 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, container_name=ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, io.openshift.expose-services=)
Oct 14 09:08:19 np0005486759.ooo.test podman[122922]: 2025-10-14 09:08:19.374359289 +0000 UTC m=+0.124114977 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, version=17.1.9, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, maintainer=OpenStack TripleO Team, tcib_managed=true)
Oct 14 09:08:19 np0005486759.ooo.test podman[122922]: unhealthy
Oct 14 09:08:19 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:19 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:08:19 np0005486759.ooo.test sudo[122902]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:20 np0005486759.ooo.test sudo[122953]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5fynduq0/privsep.sock
Oct 14 09:08:20 np0005486759.ooo.test sudo[122953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:20 np0005486759.ooo.test sudo[122953]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:20 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:08:20 np0005486759.ooo.test recover_tripleo_nova_virtqemud[122960]: 47951
Oct 14 09:08:20 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:08:20 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:08:20 np0005486759.ooo.test sudo[122966]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpli7tubg1/privsep.sock
Oct 14 09:08:20 np0005486759.ooo.test sudo[122966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:21 np0005486759.ooo.test sudo[122966]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:21 np0005486759.ooo.test sudo[122982]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpshvto1ov/privsep.sock
Oct 14 09:08:21 np0005486759.ooo.test sudo[122982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:22 np0005486759.ooo.test sudo[122982]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:22 np0005486759.ooo.test sudo[122994]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe4s647dl/privsep.sock
Oct 14 09:08:22 np0005486759.ooo.test sudo[122994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:23 np0005486759.ooo.test sudo[122994]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:23 np0005486759.ooo.test sudo[123005]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw17l_2o8/privsep.sock
Oct 14 09:08:23 np0005486759.ooo.test sudo[123005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:24 np0005486759.ooo.test sudo[123005]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:24 np0005486759.ooo.test sudo[123016]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoy85nhih/privsep.sock
Oct 14 09:08:24 np0005486759.ooo.test sudo[123016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:24 np0005486759.ooo.test sudo[123016]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:25 np0005486759.ooo.test sudo[123027]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_d9imlcs/privsep.sock
Oct 14 09:08:25 np0005486759.ooo.test sudo[123027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:25 np0005486759.ooo.test sudo[123027]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:26 np0005486759.ooo.test sudo[123038]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0noobtje/privsep.sock
Oct 14 09:08:26 np0005486759.ooo.test sudo[123038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:26 np0005486759.ooo.test sudo[123038]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:27 np0005486759.ooo.test sudo[123051]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcghwwg5_/privsep.sock
Oct 14 09:08:27 np0005486759.ooo.test sudo[123051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:27 np0005486759.ooo.test sudo[123051]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:27 np0005486759.ooo.test sudo[123066]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbin_ljzn/privsep.sock
Oct 14 09:08:27 np0005486759.ooo.test sudo[123066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:28 np0005486759.ooo.test sudo[123066]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:28 np0005486759.ooo.test sudo[123077]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdphc0r_u/privsep.sock
Oct 14 09:08:28 np0005486759.ooo.test sudo[123077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:29 np0005486759.ooo.test sudo[123077]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:29 np0005486759.ooo.test sudo[123088]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7iat2y2f/privsep.sock
Oct 14 09:08:29 np0005486759.ooo.test sudo[123088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:30 np0005486759.ooo.test sudo[123088]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:30 np0005486759.ooo.test sudo[123099]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoh4ssat3/privsep.sock
Oct 14 09:08:30 np0005486759.ooo.test sudo[123099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:30 np0005486759.ooo.test sudo[123099]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:31 np0005486759.ooo.test sudo[123110]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp62d6ff2l/privsep.sock
Oct 14 09:08:31 np0005486759.ooo.test sudo[123110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:31 np0005486759.ooo.test sudo[123110]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:32 np0005486759.ooo.test sudo[123121]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpobsm_fdu/privsep.sock
Oct 14 09:08:32 np0005486759.ooo.test sudo[123121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:32 np0005486759.ooo.test sudo[123121]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:33 np0005486759.ooo.test sudo[123138]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjab7pgk6/privsep.sock
Oct 14 09:08:33 np0005486759.ooo.test sudo[123138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:33 np0005486759.ooo.test sudo[123138]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:33 np0005486759.ooo.test sudo[123149]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkurfku13/privsep.sock
Oct 14 09:08:33 np0005486759.ooo.test sudo[123149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:34 np0005486759.ooo.test sudo[123149]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:34 np0005486759.ooo.test sudo[123160]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc49m716q/privsep.sock
Oct 14 09:08:34 np0005486759.ooo.test sudo[123160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:35 np0005486759.ooo.test sudo[123160]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:08:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:08:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:08:35 np0005486759.ooo.test podman[123164]: 2025-10-14 09:08:35.447124621 +0000 UTC m=+0.073619343 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:08:35 np0005486759.ooo.test podman[123165]: 2025-10-14 09:08:35.427147739 +0000 UTC m=+0.056527181 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_compute, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, release=1, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, version=17.1.9, distribution-scope=public, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Oct 14 09:08:35 np0005486759.ooo.test podman[123164]: 2025-10-14 09:08:35.484206746 +0000 UTC m=+0.110701468 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.buildah.version=1.33.12, release=1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Oct 14 09:08:35 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:08:35 np0005486759.ooo.test podman[123165]: 2025-10-14 09:08:35.507665247 +0000 UTC m=+0.137044699 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, config_id=tripleo_step5, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Oct 14 09:08:35 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:08:35 np0005486759.ooo.test podman[123166]: 2025-10-14 09:08:35.561224635 +0000 UTC m=+0.185613853 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, vendor=Red Hat, Inc., container_name=collectd, batch=17.1_20250721.1, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=2, name=rhosp17/openstack-collectd, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03)
Oct 14 09:08:35 np0005486759.ooo.test podman[123166]: 2025-10-14 09:08:35.576397527 +0000 UTC m=+0.200786795 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.openshift.expose-services=)
Oct 14 09:08:35 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:08:35 np0005486759.ooo.test sudo[123234]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpctarpm__/privsep.sock
Oct 14 09:08:35 np0005486759.ooo.test sudo[123234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:36 np0005486759.ooo.test sudo[123234]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:36 np0005486759.ooo.test sudo[123245]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3x4p5pdx/privsep.sock
Oct 14 09:08:36 np0005486759.ooo.test sudo[123245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:37 np0005486759.ooo.test sudo[123245]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:37 np0005486759.ooo.test sudo[123256]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxq0g3k5c/privsep.sock
Oct 14 09:08:37 np0005486759.ooo.test sudo[123256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:38 np0005486759.ooo.test sudo[123256]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:38 np0005486759.ooo.test sudo[123273]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr9kw5i61/privsep.sock
Oct 14 09:08:38 np0005486759.ooo.test sudo[123273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:39 np0005486759.ooo.test sudo[123273]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:39 np0005486759.ooo.test sudo[123284]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp18nv1d0c/privsep.sock
Oct 14 09:08:39 np0005486759.ooo.test sudo[123284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:39 np0005486759.ooo.test sudo[123284]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:40 np0005486759.ooo.test sudo[123295]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8bhea45l/privsep.sock
Oct 14 09:08:40 np0005486759.ooo.test sudo[123295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:40 np0005486759.ooo.test sudo[123295]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:41 np0005486759.ooo.test sudo[123306]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm15pp9p7/privsep.sock
Oct 14 09:08:41 np0005486759.ooo.test sudo[123306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:41 np0005486759.ooo.test sudo[123306]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:41 np0005486759.ooo.test sudo[123317]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxux6pmkq/privsep.sock
Oct 14 09:08:41 np0005486759.ooo.test sudo[123317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:42 np0005486759.ooo.test sudo[123317]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:42 np0005486759.ooo.test sudo[123328]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0fadya2n/privsep.sock
Oct 14 09:08:42 np0005486759.ooo.test sudo[123328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:43 np0005486759.ooo.test sudo[123328]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:43 np0005486759.ooo.test sudo[123345]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptwcbuj_8/privsep.sock
Oct 14 09:08:43 np0005486759.ooo.test sudo[123345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:08:43 np0005486759.ooo.test systemd[1]: tmp-crun.UcgAMO.mount: Deactivated successfully.
Oct 14 09:08:43 np0005486759.ooo.test podman[123347]: 2025-10-14 09:08:43.856468083 +0000 UTC m=+0.060324550 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, release=1, batch=17.1_20250721.1, distribution-scope=public, version=17.1.9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, 
io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 09:08:44 np0005486759.ooo.test podman[123347]: 2025-10-14 09:08:44.007371883 +0000 UTC m=+0.211228330 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1)
Oct 14 09:08:44 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:08:44 np0005486759.ooo.test sudo[123345]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:44 np0005486759.ooo.test sudo[123383]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj0z09778/privsep.sock
Oct 14 09:08:44 np0005486759.ooo.test sudo[123383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:45 np0005486759.ooo.test sudo[123383]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:45 np0005486759.ooo.test sudo[123394]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4pqwoo1q/privsep.sock
Oct 14 09:08:45 np0005486759.ooo.test sudo[123394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: tmp-crun.V5gGbx.mount: Deactivated successfully.
Oct 14 09:08:45 np0005486759.ooo.test podman[123397]: 2025-10-14 09:08:45.579357264 +0000 UTC m=+0.062778007 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, batch=17.1_20250721.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, release=1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, architecture=x86_64, container_name=ceilometer_agent_compute, distribution-scope=public, 
name=rhosp17/openstack-ceilometer-compute, vcs-type=git, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1)
Oct 14 09:08:45 np0005486759.ooo.test podman[123397]: 2025-10-14 09:08:45.612899328 +0000 UTC m=+0.096320111 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:45:33, version=17.1.9, release=1, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 09:08:45 np0005486759.ooo.test podman[123396]: 2025-10-14 09:08:45.631818258 +0000 UTC m=+0.119566205 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Oct 14 09:08:45 np0005486759.ooo.test podman[123397]: unhealthy
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:08:45 np0005486759.ooo.test podman[123396]: 2025-10-14 09:08:45.71858444 +0000 UTC m=+0.206332377 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:52, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-cron, 
architecture=x86_64, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true)
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:08:45 np0005486759.ooo.test podman[123435]: 2025-10-14 09:08:45.782413508 +0000 UTC m=+0.080376764 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, release=1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Oct 14 09:08:45 np0005486759.ooo.test podman[123435]: 2025-10-14 09:08:45.798455327 +0000 UTC m=+0.096418553 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Oct 14 09:08:45 np0005486759.ooo.test podman[123435]: unhealthy
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:45 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:08:46 np0005486759.ooo.test sudo[123394]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:46 np0005486759.ooo.test sudo[123464]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6k0yb4yj/privsep.sock
Oct 14 09:08:46 np0005486759.ooo.test sudo[123464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:46 np0005486759.ooo.test systemd[1]: tmp-crun.xOUV8h.mount: Deactivated successfully.
Oct 14 09:08:47 np0005486759.ooo.test sudo[123464]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:08:47 np0005486759.ooo.test podman[123468]: 2025-10-14 09:08:47.181002758 +0000 UTC m=+0.102585706 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, 
vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:08:47 np0005486759.ooo.test sudo[123497]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpik8ko6_t/privsep.sock
Oct 14 09:08:47 np0005486759.ooo.test sudo[123497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:47 np0005486759.ooo.test podman[123468]: 2025-10-14 09:08:47.619242667 +0000 UTC m=+0.540825625 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12, tcib_managed=true, vcs-type=git, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, version=17.1.9, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, distribution-scope=public)
Oct 14 09:08:47 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:08:47 np0005486759.ooo.test sudo[123497]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:48 np0005486759.ooo.test sudo[123509]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp25llx5s5/privsep.sock
Oct 14 09:08:48 np0005486759.ooo.test sudo[123509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:48 np0005486759.ooo.test sudo[123509]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:49 np0005486759.ooo.test sudo[123526]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1cur5he5/privsep.sock
Oct 14 09:08:49 np0005486759.ooo.test sudo[123526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:08:49 np0005486759.ooo.test podman[123529]: 2025-10-14 09:08:49.451589636 +0000 UTC m=+0.081550291 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, version=17.1.9, architecture=x86_64, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 09:08:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:08:49 np0005486759.ooo.test podman[123529]: 2025-10-14 09:08:49.473815368 +0000 UTC m=+0.103776023 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9)
Oct 14 09:08:49 np0005486759.ooo.test podman[123529]: unhealthy
Oct 14 09:08:49 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:49 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:08:49 np0005486759.ooo.test systemd[1]: tmp-crun.ijU9Y2.mount: Deactivated successfully.
Oct 14 09:08:49 np0005486759.ooo.test podman[123545]: 2025-10-14 09:08:49.570115237 +0000 UTC m=+0.096200837 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, batch=17.1_20250721.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.33.12, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9)
Oct 14 09:08:49 np0005486759.ooo.test podman[123545]: 2025-10-14 09:08:49.58237538 +0000 UTC m=+0.108461010 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1, io.buildah.version=1.33.12, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, version=17.1.9, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:08:49 np0005486759.ooo.test podman[123545]: unhealthy
Oct 14 09:08:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:08:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:08:49 np0005486759.ooo.test sudo[123526]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:49 np0005486759.ooo.test sudo[123574]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpecg9kdu0/privsep.sock
Oct 14 09:08:49 np0005486759.ooo.test sudo[123574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:50 np0005486759.ooo.test sudo[123574]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:50 np0005486759.ooo.test sudo[123585]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmc_lwi0d/privsep.sock
Oct 14 09:08:50 np0005486759.ooo.test sudo[123585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:51 np0005486759.ooo.test sudo[123585]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:51 np0005486759.ooo.test sudo[123596]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxsa6nqx5/privsep.sock
Oct 14 09:08:51 np0005486759.ooo.test sudo[123596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:52 np0005486759.ooo.test sudo[123596]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:52 np0005486759.ooo.test sudo[123607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_jhq43ek/privsep.sock
Oct 14 09:08:52 np0005486759.ooo.test sudo[123607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:53 np0005486759.ooo.test sudo[123607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:53 np0005486759.ooo.test sudo[123618]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0dud36rk/privsep.sock
Oct 14 09:08:53 np0005486759.ooo.test sudo[123618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:54 np0005486759.ooo.test sudo[123618]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:54 np0005486759.ooo.test sudo[123635]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_n1be_fm/privsep.sock
Oct 14 09:08:54 np0005486759.ooo.test sudo[123635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:55 np0005486759.ooo.test sudo[123635]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:55 np0005486759.ooo.test sudo[123646]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbn9mt28h/privsep.sock
Oct 14 09:08:55 np0005486759.ooo.test sudo[123646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:55 np0005486759.ooo.test sudo[123646]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:56 np0005486759.ooo.test sudo[123657]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3mc71z0t/privsep.sock
Oct 14 09:08:56 np0005486759.ooo.test sudo[123657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:56 np0005486759.ooo.test sudo[123657]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:57 np0005486759.ooo.test sudo[123668]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvfucllsg/privsep.sock
Oct 14 09:08:57 np0005486759.ooo.test sudo[123668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:57 np0005486759.ooo.test sudo[123668]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:57 np0005486759.ooo.test sudo[123679]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn_saoqfo/privsep.sock
Oct 14 09:08:57 np0005486759.ooo.test sudo[123679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:58 np0005486759.ooo.test sudo[123679]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:58 np0005486759.ooo.test sudo[123690]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm2r5d7sj/privsep.sock
Oct 14 09:08:58 np0005486759.ooo.test sudo[123690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:08:59 np0005486759.ooo.test sudo[123690]: pam_unix(sudo:session): session closed for user root
Oct 14 09:08:59 np0005486759.ooo.test sudo[123706]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2655dr78/privsep.sock
Oct 14 09:08:59 np0005486759.ooo.test sudo[123706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:00 np0005486759.ooo.test sudo[123706]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:00 np0005486759.ooo.test sudo[123718]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf8n_dflh/privsep.sock
Oct 14 09:09:00 np0005486759.ooo.test sudo[123718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:01 np0005486759.ooo.test sudo[123718]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:01 np0005486759.ooo.test sudo[123729]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9i8labvn/privsep.sock
Oct 14 09:09:01 np0005486759.ooo.test sudo[123729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:02 np0005486759.ooo.test sudo[123729]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:02 np0005486759.ooo.test sudo[123740]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6unxp4rj/privsep.sock
Oct 14 09:09:02 np0005486759.ooo.test sudo[123740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:02 np0005486759.ooo.test sudo[123740]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:03 np0005486759.ooo.test sudo[123751]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpic1j9t67/privsep.sock
Oct 14 09:09:03 np0005486759.ooo.test sudo[123751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:03 np0005486759.ooo.test sudo[123751]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:04 np0005486759.ooo.test sudo[123762]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4p__5qwg/privsep.sock
Oct 14 09:09:04 np0005486759.ooo.test sudo[123762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:04 np0005486759.ooo.test sudo[123762]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:04 np0005486759.ooo.test sudo[123775]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp532ydhxb/privsep.sock
Oct 14 09:09:04 np0005486759.ooo.test sudo[123775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:05 np0005486759.ooo.test sudo[123775]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:05 np0005486759.ooo.test sudo[123790]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm2uobdy1/privsep.sock
Oct 14 09:09:05 np0005486759.ooo.test sudo[123790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:09:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:09:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:09:05 np0005486759.ooo.test podman[123793]: 2025-10-14 09:09:05.828023176 +0000 UTC m=+0.079600180 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, version=17.1.9, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_compute, batch=17.1_20250721.1)
Oct 14 09:09:05 np0005486759.ooo.test podman[123792]: 2025-10-14 09:09:05.808124396 +0000 UTC m=+0.063982844 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vendor=Red Hat, Inc., release=1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, 
maintainer=OpenStack TripleO Team, version=17.1.9, container_name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid)
Oct 14 09:09:05 np0005486759.ooo.test podman[123794]: 2025-10-14 09:09:05.865297926 +0000 UTC m=+0.115593660 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-type=git, 
vendor=Red Hat, Inc., vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.9, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Oct 14 09:09:05 np0005486759.ooo.test podman[123793]: 2025-10-14 09:09:05.878307302 +0000 UTC m=+0.129884286 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team)
Oct 14 09:09:05 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:09:05 np0005486759.ooo.test podman[123792]: 2025-10-14 09:09:05.894222148 +0000 UTC m=+0.150080576 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, name=rhosp17/openstack-iscsid, distribution-scope=public, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, io.openshift.expose-services=)
Oct 14 09:09:05 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:09:05 np0005486759.ooo.test podman[123794]: 2025-10-14 09:09:05.929875338 +0000 UTC m=+0.180171022 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=collectd, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:09:05 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:09:06 np0005486759.ooo.test sudo[123790]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:06 np0005486759.ooo.test sudo[123863]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3cu1ynnj/privsep.sock
Oct 14 09:09:06 np0005486759.ooo.test sudo[123863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:07 np0005486759.ooo.test sudo[123863]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:07 np0005486759.ooo.test sudo[123874]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnk4svf8c/privsep.sock
Oct 14 09:09:07 np0005486759.ooo.test sudo[123874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:08 np0005486759.ooo.test sudo[123874]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:08 np0005486759.ooo.test sudo[123885]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi4wvxl67/privsep.sock
Oct 14 09:09:08 np0005486759.ooo.test sudo[123885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:08 np0005486759.ooo.test sudo[123885]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:09 np0005486759.ooo.test sudo[123896]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvmnj_idv/privsep.sock
Oct 14 09:09:09 np0005486759.ooo.test sudo[123896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:09 np0005486759.ooo.test sudo[123896]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:10 np0005486759.ooo.test sudo[123907]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphse7mvif/privsep.sock
Oct 14 09:09:10 np0005486759.ooo.test sudo[123907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:10 np0005486759.ooo.test sudo[123907]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:10 np0005486759.ooo.test sudo[123924]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn42ni_28/privsep.sock
Oct 14 09:09:10 np0005486759.ooo.test sudo[123924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:11 np0005486759.ooo.test sudo[123924]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:11 np0005486759.ooo.test sudo[123935]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7fclhvfy/privsep.sock
Oct 14 09:09:11 np0005486759.ooo.test sudo[123935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:12 np0005486759.ooo.test sudo[123935]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:12 np0005486759.ooo.test sudo[123946]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpish9hdqr/privsep.sock
Oct 14 09:09:12 np0005486759.ooo.test sudo[123946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:13 np0005486759.ooo.test sudo[123946]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:13 np0005486759.ooo.test sudo[123957]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_xc9e9rq/privsep.sock
Oct 14 09:09:13 np0005486759.ooo.test sudo[123957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:14 np0005486759.ooo.test sudo[123957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:09:14 np0005486759.ooo.test podman[123962]: 2025-10-14 09:09:14.120701924 +0000 UTC m=+0.050167773 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, name=rhosp17/openstack-qdrouterd, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible)
Oct 14 09:09:14 np0005486759.ooo.test sudo[123996]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn_0xzn67/privsep.sock
Oct 14 09:09:14 np0005486759.ooo.test sudo[123996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:14 np0005486759.ooo.test podman[123962]: 2025-10-14 09:09:14.354822156 +0000 UTC m=+0.284287945 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.9, container_name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-qdrouterd, release=1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 09:09:14 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:09:14 np0005486759.ooo.test sudo[123996]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:15 np0005486759.ooo.test sudo[124007]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpb4gkztj1/privsep.sock
Oct 14 09:09:15 np0005486759.ooo.test sudo[124007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:15 np0005486759.ooo.test sudo[124007]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:09:15 np0005486759.ooo.test podman[124019]: 2025-10-14 09:09:15.883353792 +0000 UTC m=+0.072900171 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-type=git, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, 
io.openshift.expose-services=, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:09:15 np0005486759.ooo.test podman[124019]: 2025-10-14 09:09:15.896478001 +0000 UTC m=+0.086024380 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T14:45:33, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute)
Oct 14 09:09:15 np0005486759.ooo.test podman[124019]: unhealthy
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: tmp-crun.SCebgT.mount: Deactivated successfully.
Oct 14 09:09:15 np0005486759.ooo.test podman[124047]: 2025-10-14 09:09:15.961273569 +0000 UTC m=+0.064343085 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, vendor=Red Hat, Inc.)
Oct 14 09:09:15 np0005486759.ooo.test podman[124047]: 2025-10-14 09:09:15.969154405 +0000 UTC m=+0.072223911 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, release=1, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi)
Oct 14 09:09:15 np0005486759.ooo.test podman[124047]: unhealthy
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:15 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:09:15 np0005486759.ooo.test podman[124017]: 2025-10-14 09:09:15.947636495 +0000 UTC m=+0.137841864 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20250721.1, name=rhosp17/openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, release=1, version=17.1.9, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-07-21T13:07:52, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:09:16 np0005486759.ooo.test podman[124017]: 2025-10-14 09:09:16.025880462 +0000 UTC m=+0.216085811 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=)
Oct 14 09:09:16 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:09:16 np0005486759.ooo.test sudo[124078]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9u32gulz/privsep.sock
Oct 14 09:09:16 np0005486759.ooo.test sudo[124078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:16 np0005486759.ooo.test sudo[124078]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:16 np0005486759.ooo.test sudo[124089]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9o0cja4f/privsep.sock
Oct 14 09:09:16 np0005486759.ooo.test sudo[124089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:17 np0005486759.ooo.test sudo[124089]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:17 np0005486759.ooo.test sudo[124100]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfmrv_kyb/privsep.sock
Oct 14 09:09:17 np0005486759.ooo.test sudo[124100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:09:17 np0005486759.ooo.test podman[124102]: 2025-10-14 09:09:17.887773662 +0000 UTC m=+0.078238378 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, io.buildah.version=1.33.12, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true)
Oct 14 09:09:18 np0005486759.ooo.test podman[124102]: 2025-10-14 09:09:18.22930959 +0000 UTC m=+0.419774256 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-07-21T14:48:37, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4, container_name=nova_migration_target, tcib_managed=true, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:09:18 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:09:18 np0005486759.ooo.test sudo[124100]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:18 np0005486759.ooo.test sudo[124134]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdqdxykes/privsep.sock
Oct 14 09:09:18 np0005486759.ooo.test sudo[124134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:19 np0005486759.ooo.test sudo[124134]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:19 np0005486759.ooo.test sudo[124145]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8f6bmzpi/privsep.sock
Oct 14 09:09:19 np0005486759.ooo.test sudo[124145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:20 np0005486759.ooo.test sudo[124145]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:09:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:09:20 np0005486759.ooo.test podman[124149]: 2025-10-14 09:09:20.191081929 +0000 UTC m=+0.086831865 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:09:20 np0005486759.ooo.test podman[124149]: 2025-10-14 09:09:20.209402651 +0000 UTC m=+0.105152587 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, release=1, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 09:09:20 np0005486759.ooo.test podman[124149]: unhealthy
Oct 14 09:09:20 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:20 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:09:20 np0005486759.ooo.test podman[124152]: 2025-10-14 09:09:20.16442305 +0000 UTC m=+0.060761744 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, release=1, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, container_name=ovn_controller, managed_by=tripleo_ansible, version=17.1.9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Oct 14 09:09:20 np0005486759.ooo.test podman[124152]: 2025-10-14 09:09:20.297568416 +0000 UTC m=+0.193907100 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-type=git, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, architecture=x86_64, io.buildah.version=1.33.12, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:09:20 np0005486759.ooo.test podman[124152]: unhealthy
Oct 14 09:09:20 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:20 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:09:20 np0005486759.ooo.test sudo[124196]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpban7bwlt/privsep.sock
Oct 14 09:09:20 np0005486759.ooo.test sudo[124196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:20 np0005486759.ooo.test sudo[124196]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:21 np0005486759.ooo.test sudo[124212]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2t4x77wu/privsep.sock
Oct 14 09:09:21 np0005486759.ooo.test sudo[124212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:21 np0005486759.ooo.test sudo[124212]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:21 np0005486759.ooo.test sudo[124224]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmf7h4fb4/privsep.sock
Oct 14 09:09:21 np0005486759.ooo.test sudo[124224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:22 np0005486759.ooo.test sudo[124224]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:22 np0005486759.ooo.test sudo[124235]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf75n6n2m/privsep.sock
Oct 14 09:09:22 np0005486759.ooo.test sudo[124235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:23 np0005486759.ooo.test sudo[124235]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:23 np0005486759.ooo.test sudo[124246]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxxjefroh/privsep.sock
Oct 14 09:09:23 np0005486759.ooo.test sudo[124246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:24 np0005486759.ooo.test sudo[124246]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:24 np0005486759.ooo.test sudo[124257]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzpf9pusd/privsep.sock
Oct 14 09:09:24 np0005486759.ooo.test sudo[124257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:25 np0005486759.ooo.test sudo[124257]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:25 np0005486759.ooo.test sudo[124268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqbu63v5c/privsep.sock
Oct 14 09:09:25 np0005486759.ooo.test sudo[124268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:26 np0005486759.ooo.test sudo[124268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:26 np0005486759.ooo.test sudo[124282]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp18i6jwy5/privsep.sock
Oct 14 09:09:26 np0005486759.ooo.test sudo[124282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:27 np0005486759.ooo.test sudo[124282]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:27 np0005486759.ooo.test sudo[124296]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpks1fvwxw/privsep.sock
Oct 14 09:09:27 np0005486759.ooo.test sudo[124296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:28 np0005486759.ooo.test sudo[124296]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:28 np0005486759.ooo.test sudo[124307]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcontg65x/privsep.sock
Oct 14 09:09:28 np0005486759.ooo.test sudo[124307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:29 np0005486759.ooo.test sudo[124307]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:29 np0005486759.ooo.test sudo[124318]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprzpuwpo4/privsep.sock
Oct 14 09:09:29 np0005486759.ooo.test sudo[124318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:29 np0005486759.ooo.test sudo[124318]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:30 np0005486759.ooo.test sudo[124329]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplikd_q24/privsep.sock
Oct 14 09:09:30 np0005486759.ooo.test sudo[124329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:30 np0005486759.ooo.test sudo[124329]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:31 np0005486759.ooo.test sudo[124340]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmlgf9d40/privsep.sock
Oct 14 09:09:31 np0005486759.ooo.test sudo[124340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:31 np0005486759.ooo.test sudo[124340]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:32 np0005486759.ooo.test sudo[124357]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa6hh_ky8/privsep.sock
Oct 14 09:09:32 np0005486759.ooo.test sudo[124357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:32 np0005486759.ooo.test sudo[124357]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:32 np0005486759.ooo.test sudo[124368]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpa74cd9ys/privsep.sock
Oct 14 09:09:32 np0005486759.ooo.test sudo[124368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:33 np0005486759.ooo.test sudo[124368]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:33 np0005486759.ooo.test sudo[124379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp96gax7iz/privsep.sock
Oct 14 09:09:33 np0005486759.ooo.test sudo[124379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:34 np0005486759.ooo.test sudo[124379]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:34 np0005486759.ooo.test sudo[124390]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbs26hppi/privsep.sock
Oct 14 09:09:34 np0005486759.ooo.test sudo[124390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:34 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:09:34 np0005486759.ooo.test recover_tripleo_nova_virtqemud[124393]: 47951
Oct 14 09:09:34 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:09:34 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:09:35 np0005486759.ooo.test sudo[124390]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:35 np0005486759.ooo.test sudo[124403]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyn15r6ug/privsep.sock
Oct 14 09:09:35 np0005486759.ooo.test sudo[124403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:36 np0005486759.ooo.test sudo[124403]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: tmp-crun.tW5dHf.mount: Deactivated successfully.
Oct 14 09:09:36 np0005486759.ooo.test podman[124411]: 2025-10-14 09:09:36.208755986 +0000 UTC m=+0.058989578 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, release=2, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-collectd, version=17.1.9, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd)
Oct 14 09:09:36 np0005486759.ooo.test podman[124411]: 2025-10-14 09:09:36.213862806 +0000 UTC m=+0.064096368 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, batch=17.1_20250721.1, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, version=17.1.9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, release=2, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: tmp-crun.nDqh7s.mount: Deactivated successfully.
Oct 14 09:09:36 np0005486759.ooo.test podman[124410]: 2025-10-14 09:09:36.270356665 +0000 UTC m=+0.120760072 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Oct 14 09:09:36 np0005486759.ooo.test podman[124410]: 2025-10-14 09:09:36.313686244 +0000 UTC m=+0.164089621 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=1, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_compute, version=17.1.9, batch=17.1_20250721.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:09:36 np0005486759.ooo.test podman[124409]: 2025-10-14 09:09:36.31999559 +0000 UTC m=+0.171231254 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, container_name=iscsid, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, 
batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:09:36 np0005486759.ooo.test sudo[124476]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpknr_1p45/privsep.sock
Oct 14 09:09:36 np0005486759.ooo.test sudo[124476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:36 np0005486759.ooo.test podman[124409]: 2025-10-14 09:09:36.402309145 +0000 UTC m=+0.253544799 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, release=1, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, 
com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15)
Oct 14 09:09:36 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:09:36 np0005486759.ooo.test sudo[124476]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:37 np0005486759.ooo.test sudo[124489]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn1ah97h0/privsep.sock
Oct 14 09:09:37 np0005486759.ooo.test sudo[124489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:37 np0005486759.ooo.test sudo[124489]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:38 np0005486759.ooo.test sudo[124504]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgcn2hvkq/privsep.sock
Oct 14 09:09:38 np0005486759.ooo.test sudo[124504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:38 np0005486759.ooo.test sudo[124504]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:38 np0005486759.ooo.test sudo[124515]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp290_sza5/privsep.sock
Oct 14 09:09:38 np0005486759.ooo.test sudo[124515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:39 np0005486759.ooo.test sudo[124515]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:39 np0005486759.ooo.test sudo[124526]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpumb7wujm/privsep.sock
Oct 14 09:09:39 np0005486759.ooo.test sudo[124526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:40 np0005486759.ooo.test sudo[124526]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:40 np0005486759.ooo.test sudo[124537]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgtp6wld7/privsep.sock
Oct 14 09:09:40 np0005486759.ooo.test sudo[124537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:41 np0005486759.ooo.test sudo[124537]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:41 np0005486759.ooo.test sudo[124548]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwngfaiy8/privsep.sock
Oct 14 09:09:41 np0005486759.ooo.test sudo[124548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:41 np0005486759.ooo.test sudo[124548]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:42 np0005486759.ooo.test sudo[124559]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgvstidui/privsep.sock
Oct 14 09:09:42 np0005486759.ooo.test sudo[124559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:42 np0005486759.ooo.test sudo[124559]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:43 np0005486759.ooo.test sudo[124576]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8x2gvc_y/privsep.sock
Oct 14 09:09:43 np0005486759.ooo.test sudo[124576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:43 np0005486759.ooo.test sudo[124576]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:43 np0005486759.ooo.test sudo[124587]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd0qihedb/privsep.sock
Oct 14 09:09:43 np0005486759.ooo.test sudo[124587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:44 np0005486759.ooo.test sudo[124587]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:09:44 np0005486759.ooo.test systemd[1]: tmp-crun.vGwGeM.mount: Deactivated successfully.
Oct 14 09:09:44 np0005486759.ooo.test podman[124592]: 2025-10-14 09:09:44.677342734 +0000 UTC m=+0.093616168 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, container_name=metrics_qdr, tcib_managed=true, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, io.buildah.version=1.33.12, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Oct 14 09:09:44 np0005486759.ooo.test sudo[124626]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7sx04tug/privsep.sock
Oct 14 09:09:44 np0005486759.ooo.test sudo[124626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:44 np0005486759.ooo.test podman[124592]: 2025-10-14 09:09:44.887246561 +0000 UTC m=+0.303519965 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-07-21T13:07:59, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/agreements, release=1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1)
Oct 14 09:09:44 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:09:45 np0005486759.ooo.test sudo[124626]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:45 np0005486759.ooo.test sudo[124637]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplbx5idq0/privsep.sock
Oct 14 09:09:45 np0005486759.ooo.test sudo[124637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:46 np0005486759.ooo.test sudo[124637]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: tmp-crun.YJkx40.mount: Deactivated successfully.
Oct 14 09:09:46 np0005486759.ooo.test podman[124642]: 2025-10-14 09:09:46.332157903 +0000 UTC m=+0.074053567 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, release=1, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:09:46 np0005486759.ooo.test podman[124642]: 2025-10-14 09:09:46.36735423 +0000 UTC m=+0.109249944 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container)
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: tmp-crun.KI0KGn.mount: Deactivated successfully.
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:09:46 np0005486759.ooo.test podman[124644]: 2025-10-14 09:09:46.395287199 +0000 UTC m=+0.133091875 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, version=17.1.9, vendor=Red Hat, Inc., vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.33.12, batch=17.1_20250721.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T14:45:33, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:09:46 np0005486759.ooo.test podman[124644]: 2025-10-14 09:09:46.438728493 +0000 UTC m=+0.176533189 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, release=1, architecture=x86_64, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 09:09:46 np0005486759.ooo.test podman[124644]: unhealthy
Oct 14 09:09:46 np0005486759.ooo.test podman[124645]: 2025-10-14 09:09:46.450564941 +0000 UTC m=+0.183496376 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, config_id=tripleo_step4, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, release=1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git)
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:09:46 np0005486759.ooo.test podman[124645]: 2025-10-14 09:09:46.465683912 +0000 UTC m=+0.198615397 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, version=17.1.9, release=1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true)
Oct 14 09:09:46 np0005486759.ooo.test podman[124645]: unhealthy
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:46 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:09:46 np0005486759.ooo.test sudo[124707]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpld39gm67/privsep.sock
Oct 14 09:09:46 np0005486759.ooo.test sudo[124707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:47 np0005486759.ooo.test sudo[124707]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:47 np0005486759.ooo.test sudo[124718]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk9nqcvym/privsep.sock
Oct 14 09:09:47 np0005486759.ooo.test sudo[124718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:48 np0005486759.ooo.test sudo[124718]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:48 np0005486759.ooo.test sudo[124734]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdozbiy4c/privsep.sock
Oct 14 09:09:48 np0005486759.ooo.test sudo[124734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:09:48 np0005486759.ooo.test systemd[1]: tmp-crun.abdZrX.mount: Deactivated successfully.
Oct 14 09:09:48 np0005486759.ooo.test podman[124736]: 2025-10-14 09:09:48.435834573 +0000 UTC m=+0.095421273 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.33.12, managed_by=tripleo_ansible, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2025-07-21T14:48:37, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:09:48 np0005486759.ooo.test podman[124736]: 2025-10-14 09:09:48.805549568 +0000 UTC m=+0.465136308 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.9, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git)
Oct 14 09:09:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:09:48 np0005486759.ooo.test sudo[124734]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:49 np0005486759.ooo.test sudo[124767]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx6u0s4o8/privsep.sock
Oct 14 09:09:49 np0005486759.ooo.test sudo[124767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:49 np0005486759.ooo.test sudo[124767]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:50 np0005486759.ooo.test sudo[124778]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiwm9itun/privsep.sock
Oct 14 09:09:50 np0005486759.ooo.test sudo[124778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:09:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:09:50 np0005486759.ooo.test podman[124781]: 2025-10-14 09:09:50.442639557 +0000 UTC m=+0.070104584 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Oct 14 09:09:50 np0005486759.ooo.test podman[124782]: 2025-10-14 09:09:50.502750059 +0000 UTC m=+0.125164379 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true)
Oct 14 09:09:50 np0005486759.ooo.test podman[124782]: 2025-10-14 09:09:50.514315979 +0000 UTC m=+0.136730329 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, vendor=Red Hat, Inc., release=1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:09:50 np0005486759.ooo.test podman[124782]: unhealthy
Oct 14 09:09:50 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:50 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:09:50 np0005486759.ooo.test podman[124781]: 2025-10-14 09:09:50.535302243 +0000 UTC m=+0.162767330 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12)
Oct 14 09:09:50 np0005486759.ooo.test podman[124781]: unhealthy
Oct 14 09:09:50 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:09:50 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:09:50 np0005486759.ooo.test sudo[124778]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:50 np0005486759.ooo.test sudo[124830]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps9lk27g_/privsep.sock
Oct 14 09:09:50 np0005486759.ooo.test sudo[124830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:51 np0005486759.ooo.test sudo[124830]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:51 np0005486759.ooo.test sudo[124841]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9izp93hv/privsep.sock
Oct 14 09:09:51 np0005486759.ooo.test sudo[124841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:52 np0005486759.ooo.test sudo[124841]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:52 np0005486759.ooo.test sudo[124852]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7txotyf9/privsep.sock
Oct 14 09:09:52 np0005486759.ooo.test sudo[124852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:53 np0005486759.ooo.test sudo[124852]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:53 np0005486759.ooo.test sudo[124863]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphjnkkpbd/privsep.sock
Oct 14 09:09:53 np0005486759.ooo.test sudo[124863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:54 np0005486759.ooo.test sudo[124863]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:54 np0005486759.ooo.test sudo[124880]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpljx28891/privsep.sock
Oct 14 09:09:54 np0005486759.ooo.test sudo[124880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:54 np0005486759.ooo.test sudo[124880]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:55 np0005486759.ooo.test sudo[124891]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5n4enqf3/privsep.sock
Oct 14 09:09:55 np0005486759.ooo.test sudo[124891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:55 np0005486759.ooo.test sudo[124891]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:56 np0005486759.ooo.test sudo[124902]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprrz5yuhp/privsep.sock
Oct 14 09:09:56 np0005486759.ooo.test sudo[124902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:56 np0005486759.ooo.test sudo[124902]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:56 np0005486759.ooo.test sudo[124913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdzmsacgv/privsep.sock
Oct 14 09:09:56 np0005486759.ooo.test sudo[124913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:57 np0005486759.ooo.test sudo[124913]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:57 np0005486759.ooo.test sudo[124924]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2ow8ibkw/privsep.sock
Oct 14 09:09:57 np0005486759.ooo.test sudo[124924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:58 np0005486759.ooo.test sudo[124924]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:58 np0005486759.ooo.test sudo[124935]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcw2c64uh/privsep.sock
Oct 14 09:09:58 np0005486759.ooo.test sudo[124935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:09:59 np0005486759.ooo.test sudo[124935]: pam_unix(sudo:session): session closed for user root
Oct 14 09:09:59 np0005486759.ooo.test sudo[124952]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp25rjmx4e/privsep.sock
Oct 14 09:09:59 np0005486759.ooo.test sudo[124952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:00 np0005486759.ooo.test sudo[124952]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:00 np0005486759.ooo.test sudo[124963]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0xv6w40p/privsep.sock
Oct 14 09:10:00 np0005486759.ooo.test sudo[124963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:00 np0005486759.ooo.test sudo[124963]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:01 np0005486759.ooo.test sudo[124974]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpht73n1gu/privsep.sock
Oct 14 09:10:01 np0005486759.ooo.test sudo[124974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:01 np0005486759.ooo.test sudo[124974]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:02 np0005486759.ooo.test sudo[124985]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm4lq80ww/privsep.sock
Oct 14 09:10:02 np0005486759.ooo.test sudo[124985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:02 np0005486759.ooo.test sudo[124985]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:03 np0005486759.ooo.test sudo[124996]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqba6mkuk/privsep.sock
Oct 14 09:10:03 np0005486759.ooo.test sudo[124996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:03 np0005486759.ooo.test sudo[124996]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:03 np0005486759.ooo.test sudo[125007]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn3zhc4qx/privsep.sock
Oct 14 09:10:03 np0005486759.ooo.test sudo[125007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:04 np0005486759.ooo.test sudo[125007]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:04 np0005486759.ooo.test sudo[125024]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdz1psih_/privsep.sock
Oct 14 09:10:04 np0005486759.ooo.test sudo[125024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:05 np0005486759.ooo.test sudo[125024]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:05 np0005486759.ooo.test sudo[125035]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpatgqdtwr/privsep.sock
Oct 14 09:10:05 np0005486759.ooo.test sudo[125035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:06 np0005486759.ooo.test sudo[125035]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:06 np0005486759.ooo.test sudo[125046]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph1ca7o5s/privsep.sock
Oct 14 09:10:06 np0005486759.ooo.test sudo[125046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:10:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:10:06 np0005486759.ooo.test systemd[1]: tmp-crun.grqgvq.mount: Deactivated successfully.
Oct 14 09:10:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:10:06 np0005486759.ooo.test podman[125048]: 2025-10-14 09:10:06.48973901 +0000 UTC m=+0.131652431 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.buildah.version=1.33.12, container_name=nova_compute, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, tcib_managed=true, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:10:06 np0005486759.ooo.test podman[125049]: 2025-10-14 09:10:06.49843042 +0000 UTC m=+0.136072537 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-collectd-container, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Oct 14 09:10:06 np0005486759.ooo.test podman[125049]: 2025-10-14 09:10:06.504395245 +0000 UTC m=+0.142037402 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-collectd, release=2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, version=17.1.9, com.redhat.component=openstack-collectd-container, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git)
Oct 14 09:10:06 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:10:06 np0005486759.ooo.test podman[125048]: 2025-10-14 09:10:06.547830385 +0000 UTC m=+0.189743746 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, distribution-scope=public, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, config_id=tripleo_step5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.9, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12)
Oct 14 09:10:06 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:10:06 np0005486759.ooo.test podman[125075]: 2025-10-14 09:10:06.559547799 +0000 UTC m=+0.066393874 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.9, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 09:10:06 np0005486759.ooo.test podman[125075]: 2025-10-14 09:10:06.570441327 +0000 UTC m=+0.077287432 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, build-date=2025-07-21T13:27:15, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1, container_name=iscsid, vcs-type=git, batch=17.1_20250721.1, com.redhat.component=openstack-iscsid-container)
Oct 14 09:10:06 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:10:06 np0005486759.ooo.test sudo[125046]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:07 np0005486759.ooo.test sudo[125120]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6lw2j0cb/privsep.sock
Oct 14 09:10:07 np0005486759.ooo.test sudo[125120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:07 np0005486759.ooo.test sudo[125120]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:08 np0005486759.ooo.test sudo[125131]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9jpj6zbi/privsep.sock
Oct 14 09:10:08 np0005486759.ooo.test sudo[125131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:08 np0005486759.ooo.test sudo[125131]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:09 np0005486759.ooo.test sudo[125142]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3eob37dn/privsep.sock
Oct 14 09:10:09 np0005486759.ooo.test sudo[125142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:09 np0005486759.ooo.test sudo[125142]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:10 np0005486759.ooo.test sudo[125159]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp0b9ni28/privsep.sock
Oct 14 09:10:10 np0005486759.ooo.test sudo[125159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:10 np0005486759.ooo.test sudo[125159]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:10 np0005486759.ooo.test sudo[125170]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn6o7avf3/privsep.sock
Oct 14 09:10:10 np0005486759.ooo.test sudo[125170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:11 np0005486759.ooo.test sudo[125170]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:11 np0005486759.ooo.test sudo[125181]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjy_al0xo/privsep.sock
Oct 14 09:10:11 np0005486759.ooo.test sudo[125181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:12 np0005486759.ooo.test sudo[125181]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:12 np0005486759.ooo.test sudo[125192]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq2bw25_l/privsep.sock
Oct 14 09:10:12 np0005486759.ooo.test sudo[125192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:13 np0005486759.ooo.test sudo[125192]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:13 np0005486759.ooo.test sudo[125203]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp28x9aes1/privsep.sock
Oct 14 09:10:13 np0005486759.ooo.test sudo[125203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:14 np0005486759.ooo.test sudo[125203]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:14 np0005486759.ooo.test sudo[125214]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4lzdhlad/privsep.sock
Oct 14 09:10:14 np0005486759.ooo.test sudo[125214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:15 np0005486759.ooo.test sudo[125214]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:10:15 np0005486759.ooo.test podman[125221]: 2025-10-14 09:10:15.192636152 +0000 UTC m=+0.066491567 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., container_name=metrics_qdr, config_id=tripleo_step1, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-07-21T13:07:59, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1, tcib_managed=true, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 09:10:15 np0005486759.ooo.test podman[125221]: 2025-10-14 09:10:15.370826656 +0000 UTC m=+0.244682031 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.9, release=1, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-type=git, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.buildah.version=1.33.12, container_name=metrics_qdr)
Oct 14 09:10:15 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:10:15 np0005486759.ooo.test sudo[125261]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6ua3zvm_/privsep.sock
Oct 14 09:10:15 np0005486759.ooo.test sudo[125261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:15 np0005486759.ooo.test sudo[125261]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:16 np0005486759.ooo.test sudo[125272]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn9nli3mc/privsep.sock
Oct 14 09:10:16 np0005486759.ooo.test sudo[125272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:16 np0005486759.ooo.test sudo[125272]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: tmp-crun.OmszWZ.mount: Deactivated successfully.
Oct 14 09:10:16 np0005486759.ooo.test podman[125278]: 2025-10-14 09:10:16.899008255 +0000 UTC m=+0.077817818 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.9, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, distribution-scope=public, container_name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1)
Oct 14 09:10:16 np0005486759.ooo.test podman[125278]: 2025-10-14 09:10:16.904426044 +0000 UTC m=+0.083235667 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, version=17.1.9, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: tmp-crun.N3ODFw.mount: Deactivated successfully.
Oct 14 09:10:16 np0005486759.ooo.test podman[125280]: 2025-10-14 09:10:16.960495875 +0000 UTC m=+0.130265867 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-07-21T15:29:47, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.9, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:10:16 np0005486759.ooo.test podman[125280]: 2025-10-14 09:10:16.971151757 +0000 UTC m=+0.140921679 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20250721.1, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T15:29:47, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, tcib_managed=true, release=1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:10:16 np0005486759.ooo.test podman[125280]: unhealthy
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:16 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:10:17 np0005486759.ooo.test podman[125279]: 2025-10-14 09:10:17.05592655 +0000 UTC m=+0.228902952 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, tcib_managed=true, batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33)
Oct 14 09:10:17 np0005486759.ooo.test sudo[125336]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8h8881et/privsep.sock
Oct 14 09:10:17 np0005486759.ooo.test sudo[125336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:17 np0005486759.ooo.test podman[125279]: 2025-10-14 09:10:17.071339828 +0000 UTC m=+0.244316280 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T14:45:33, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc.)
Oct 14 09:10:17 np0005486759.ooo.test podman[125279]: unhealthy
Oct 14 09:10:17 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:17 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:10:17 np0005486759.ooo.test sudo[125336]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:17 np0005486759.ooo.test sudo[125352]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2riqhblt/privsep.sock
Oct 14 09:10:17 np0005486759.ooo.test sudo[125352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:18 np0005486759.ooo.test sudo[125352]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:18 np0005486759.ooo.test sudo[125363]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnebs3813/privsep.sock
Oct 14 09:10:18 np0005486759.ooo.test sudo[125363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:19 np0005486759.ooo.test sudo[125363]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:10:19 np0005486759.ooo.test systemd[1]: tmp-crun.ZhX2K5.mount: Deactivated successfully.
Oct 14 09:10:19 np0005486759.ooo.test podman[125367]: 2025-10-14 09:10:19.312686389 +0000 UTC m=+0.073474473 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, batch=17.1_20250721.1, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:10:19 np0005486759.ooo.test sudo[125396]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp26kv5vw8/privsep.sock
Oct 14 09:10:19 np0005486759.ooo.test sudo[125396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:19 np0005486759.ooo.test podman[125367]: 2025-10-14 09:10:19.693267811 +0000 UTC m=+0.454055835 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, config_id=tripleo_step4, architecture=x86_64, build-date=2025-07-21T14:48:37, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1)
Oct 14 09:10:19 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:10:20 np0005486759.ooo.test sudo[125396]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:20 np0005486759.ooo.test sudo[125408]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuyqn4d3b/privsep.sock
Oct 14 09:10:20 np0005486759.ooo.test sudo[125408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:20 np0005486759.ooo.test sudo[125408]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:10:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:10:21 np0005486759.ooo.test systemd[1]: tmp-crun.b2dUon.mount: Deactivated successfully.
Oct 14 09:10:21 np0005486759.ooo.test podman[125418]: 2025-10-14 09:10:21.083157054 +0000 UTC m=+0.097708207 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, version=17.1.9)
Oct 14 09:10:21 np0005486759.ooo.test podman[125421]: 2025-10-14 09:10:21.119843023 +0000 UTC m=+0.130913797 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, version=17.1.9, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:10:21 np0005486759.ooo.test podman[125418]: 2025-10-14 09:10:21.129496884 +0000 UTC m=+0.144048037 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.expose-services=, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 09:10:21 np0005486759.ooo.test podman[125418]: unhealthy
Oct 14 09:10:21 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:21 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:10:21 np0005486759.ooo.test podman[125421]: 2025-10-14 09:10:21.185802892 +0000 UTC m=+0.196873686 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, distribution-scope=public, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:10:21 np0005486759.ooo.test podman[125421]: unhealthy
Oct 14 09:10:21 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:21 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:10:21 np0005486759.ooo.test sudo[125466]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpajk6lymc/privsep.sock
Oct 14 09:10:21 np0005486759.ooo.test sudo[125466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:21 np0005486759.ooo.test sudo[125466]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:22 np0005486759.ooo.test sudo[125477]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppmex1pzi/privsep.sock
Oct 14 09:10:22 np0005486759.ooo.test sudo[125477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:22 np0005486759.ooo.test sudo[125477]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:22 np0005486759.ooo.test sudo[125488]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4s1k3ve6/privsep.sock
Oct 14 09:10:22 np0005486759.ooo.test sudo[125488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:23 np0005486759.ooo.test sudo[125488]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:23 np0005486759.ooo.test sudo[125499]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplzld36yc/privsep.sock
Oct 14 09:10:23 np0005486759.ooo.test sudo[125499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:24 np0005486759.ooo.test sudo[125499]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:24 np0005486759.ooo.test sudo[125510]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdarbnobv/privsep.sock
Oct 14 09:10:24 np0005486759.ooo.test sudo[125510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:25 np0005486759.ooo.test sudo[125510]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:25 np0005486759.ooo.test sudo[125521]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp71r7uihp/privsep.sock
Oct 14 09:10:25 np0005486759.ooo.test sudo[125521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:26 np0005486759.ooo.test sudo[125521]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:26 np0005486759.ooo.test sudo[125538]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgajgh7yg/privsep.sock
Oct 14 09:10:26 np0005486759.ooo.test sudo[125538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:26 np0005486759.ooo.test sudo[125538]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:27 np0005486759.ooo.test sudo[125549]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4phg5kcw/privsep.sock
Oct 14 09:10:27 np0005486759.ooo.test sudo[125549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:27 np0005486759.ooo.test sudo[125549]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:28 np0005486759.ooo.test sudo[125560]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkrdm3cpf/privsep.sock
Oct 14 09:10:28 np0005486759.ooo.test sudo[125560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:28 np0005486759.ooo.test sudo[125560]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:28 np0005486759.ooo.test sudo[125571]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqft0vsvu/privsep.sock
Oct 14 09:10:28 np0005486759.ooo.test sudo[125571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:29 np0005486759.ooo.test sudo[125571]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:29 np0005486759.ooo.test sudo[125582]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpupbee12k/privsep.sock
Oct 14 09:10:29 np0005486759.ooo.test sudo[125582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:30 np0005486759.ooo.test sudo[125582]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:30 np0005486759.ooo.test sudo[125593]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppbgxke4v/privsep.sock
Oct 14 09:10:30 np0005486759.ooo.test sudo[125593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:31 np0005486759.ooo.test sudo[125593]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:31 np0005486759.ooo.test sudo[125607]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpahzmgwmr/privsep.sock
Oct 14 09:10:31 np0005486759.ooo.test sudo[125607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:32 np0005486759.ooo.test sudo[125607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:32 np0005486759.ooo.test sudo[125621]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9gru3v_8/privsep.sock
Oct 14 09:10:32 np0005486759.ooo.test sudo[125621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:32 np0005486759.ooo.test sudo[125621]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:33 np0005486759.ooo.test sudo[125632]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp46zzxuvp/privsep.sock
Oct 14 09:10:33 np0005486759.ooo.test sudo[125632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:33 np0005486759.ooo.test sudo[125632]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:33 np0005486759.ooo.test sudo[125643]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6g_30e4d/privsep.sock
Oct 14 09:10:33 np0005486759.ooo.test sudo[125643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:34 np0005486759.ooo.test sudo[125643]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:34 np0005486759.ooo.test sudo[125654]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp41zmp0e9/privsep.sock
Oct 14 09:10:34 np0005486759.ooo.test sudo[125654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:35 np0005486759.ooo.test sudo[125654]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:35 np0005486759.ooo.test sudo[125665]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphihd3na4/privsep.sock
Oct 14 09:10:35 np0005486759.ooo.test sudo[125665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:36 np0005486759.ooo.test sudo[125665]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:36 np0005486759.ooo.test sudo[125676]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt3ektnt1/privsep.sock
Oct 14 09:10:36 np0005486759.ooo.test sudo[125676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:10:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:10:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:10:36 np0005486759.ooo.test podman[125679]: 2025-10-14 09:10:36.675855178 +0000 UTC m=+0.094730043 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, vcs-type=git, version=17.1.9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20250721.1, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Oct 14 09:10:36 np0005486759.ooo.test podman[125679]: 2025-10-14 09:10:36.724263612 +0000 UTC m=+0.143138497 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-collectd-container, release=2, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:10:36 np0005486759.ooo.test systemd[1]: tmp-crun.Rc1tsc.mount: Deactivated successfully.
Oct 14 09:10:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:10:36 np0005486759.ooo.test podman[125681]: 2025-10-14 09:10:36.737757842 +0000 UTC m=+0.152986504 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.expose-services=, io.buildah.version=1.33.12, config_id=tripleo_step5, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible)
Oct 14 09:10:36 np0005486759.ooo.test podman[125710]: 2025-10-14 09:10:36.79759886 +0000 UTC m=+0.114917901 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, distribution-scope=public, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T13:27:15, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Oct 14 09:10:36 np0005486759.ooo.test podman[125681]: 2025-10-14 09:10:36.824262418 +0000 UTC m=+0.239491050 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, tcib_managed=true, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1)
Oct 14 09:10:36 np0005486759.ooo.test podman[125710]: 2025-10-14 09:10:36.832311678 +0000 UTC m=+0.149630709 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., version=17.1.9, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, batch=17.1_20250721.1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12)
Oct 14 09:10:36 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:10:36 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:10:37 np0005486759.ooo.test sudo[125676]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:37 np0005486759.ooo.test sudo[125756]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwu10hh25/privsep.sock
Oct 14 09:10:37 np0005486759.ooo.test sudo[125756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:37 np0005486759.ooo.test systemd[1]: tmp-crun.qmrkFX.mount: Deactivated successfully.
Oct 14 09:10:38 np0005486759.ooo.test sudo[125756]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:38 np0005486759.ooo.test sudo[125767]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9c2hicnu/privsep.sock
Oct 14 09:10:38 np0005486759.ooo.test sudo[125767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:38 np0005486759.ooo.test sudo[125767]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:39 np0005486759.ooo.test sudo[125778]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdo0xrcep/privsep.sock
Oct 14 09:10:39 np0005486759.ooo.test sudo[125778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:39 np0005486759.ooo.test sshd[24120]: Received disconnect from 192.168.122.100 port 39458:11: disconnected by user
Oct 14 09:10:39 np0005486759.ooo.test sshd[24120]: Disconnected from user tripleo-admin 192.168.122.100 port 39458
Oct 14 09:10:39 np0005486759.ooo.test sshd[24100]: pam_unix(sshd:session): session closed for user tripleo-admin
Oct 14 09:10:39 np0005486759.ooo.test systemd[1]: session-12.scope: Deactivated successfully.
Oct 14 09:10:39 np0005486759.ooo.test systemd[1]: session-12.scope: Consumed 7min 9.286s CPU time.
Oct 14 09:10:39 np0005486759.ooo.test systemd-logind[759]: Session 12 logged out. Waiting for processes to exit.
Oct 14 09:10:39 np0005486759.ooo.test sudo[125778]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:39 np0005486759.ooo.test systemd-logind[759]: Removed session 12.
Oct 14 09:10:40 np0005486759.ooo.test sudo[125789]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpswfowgut/privsep.sock
Oct 14 09:10:40 np0005486759.ooo.test sudo[125789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:40 np0005486759.ooo.test sudo[125789]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:40 np0005486759.ooo.test sudo[125800]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpue3mpt6z/privsep.sock
Oct 14 09:10:40 np0005486759.ooo.test sudo[125800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:41 np0005486759.ooo.test sudo[125800]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:41 np0005486759.ooo.test sudo[125811]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp44p9ai81/privsep.sock
Oct 14 09:10:41 np0005486759.ooo.test sudo[125811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:42 np0005486759.ooo.test sudo[125811]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:42 np0005486759.ooo.test sudo[125828]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp77a6p4mi/privsep.sock
Oct 14 09:10:42 np0005486759.ooo.test sudo[125828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:43 np0005486759.ooo.test sudo[125828]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:43 np0005486759.ooo.test sudo[125839]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoegpxlvk/privsep.sock
Oct 14 09:10:43 np0005486759.ooo.test sudo[125839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:44 np0005486759.ooo.test sudo[125839]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:44 np0005486759.ooo.test sudo[125850]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvpuwfodo/privsep.sock
Oct 14 09:10:44 np0005486759.ooo.test sudo[125850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:45 np0005486759.ooo.test sudo[125850]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:45 np0005486759.ooo.test sudo[125861]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv0b0waru/privsep.sock
Oct 14 09:10:45 np0005486759.ooo.test sudo[125861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:45 np0005486759.ooo.test sudo[125861]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:10:46 np0005486759.ooo.test systemd[1]: tmp-crun.bZ3CGg.mount: Deactivated successfully.
Oct 14 09:10:46 np0005486759.ooo.test podman[125867]: 2025-10-14 09:10:46.041062072 +0000 UTC m=+0.098195441 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 09:10:46 np0005486759.ooo.test sudo[125902]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzge4rlr8/privsep.sock
Oct 14 09:10:46 np0005486759.ooo.test sudo[125902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:46 np0005486759.ooo.test podman[125867]: 2025-10-14 09:10:46.254194263 +0000 UTC m=+0.311327602 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:59, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:10:46 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:10:46 np0005486759.ooo.test sudo[125902]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:46 np0005486759.ooo.test sudo[125913]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf28tivyg/privsep.sock
Oct 14 09:10:46 np0005486759.ooo.test sudo[125913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:10:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:10:47 np0005486759.ooo.test systemd[1]: tmp-crun.9ntC8j.mount: Deactivated successfully.
Oct 14 09:10:47 np0005486759.ooo.test podman[125915]: 2025-10-14 09:10:47.06215279 +0000 UTC m=+0.064073642 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true)
Oct 14 09:10:47 np0005486759.ooo.test podman[125915]: 2025-10-14 09:10:47.097320802 +0000 UTC m=+0.099241654 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, batch=17.1_20250721.1, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64)
Oct 14 09:10:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:10:47 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:10:47 np0005486759.ooo.test podman[125916]: 2025-10-14 09:10:47.121533674 +0000 UTC m=+0.119201743 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, tcib_managed=true, batch=17.1_20250721.1, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, version=17.1.9, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Oct 14 09:10:47 np0005486759.ooo.test podman[125916]: 2025-10-14 09:10:47.131237386 +0000 UTC m=+0.128905435 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20250721.1, vcs-type=git, release=1, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, vendor=Red Hat, Inc.)
Oct 14 09:10:47 np0005486759.ooo.test podman[125916]: unhealthy
Oct 14 09:10:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:10:47 np0005486759.ooo.test podman[125950]: 2025-10-14 09:10:47.172545098 +0000 UTC m=+0.046725232 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:10:47 np0005486759.ooo.test podman[125950]: 2025-10-14 09:10:47.210617511 +0000 UTC m=+0.084797655 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1, io.buildah.version=1.33.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-07-21T14:45:33, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, vendor=Red Hat, Inc.)
Oct 14 09:10:47 np0005486759.ooo.test podman[125950]: unhealthy
Oct 14 09:10:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:10:47 np0005486759.ooo.test sudo[125913]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:47 np0005486759.ooo.test sudo[125987]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9zem2qqh/privsep.sock
Oct 14 09:10:47 np0005486759.ooo.test sudo[125987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:48 np0005486759.ooo.test sudo[125987]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:48 np0005486759.ooo.test sudo[125998]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxnrf5i6i/privsep.sock
Oct 14 09:10:48 np0005486759.ooo.test sudo[125998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:49 np0005486759.ooo.test sudo[125998]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:49 np0005486759.ooo.test sudo[126009]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzyjl5rii/privsep.sock
Oct 14 09:10:49 np0005486759.ooo.test sudo[126009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 1002...
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Activating special unit Exit the Session...
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Removed slice User Background Tasks Slice.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped target Main User Target.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped target Basic System.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped target Paths.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped target Sockets.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped target Timers.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Closed D-Bus User Message Bus Socket.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Stopped Create User's Volatile Files and Directories.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Removed slice User Application Slice.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Reached target Shutdown.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Finished Exit the Session.
Oct 14 09:10:49 np0005486759.ooo.test systemd[24104]: Reached target Exit the Session.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: user@1002.service: Deactivated successfully.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 1002.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: user@1002.service: Consumed 5.578s CPU time, read 0B from disk, written 7.0K to disk.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/1002...
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: run-user-1002.mount: Deactivated successfully.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: user-runtime-dir@1002.service: Deactivated successfully.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/1002.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 1002.
Oct 14 09:10:49 np0005486759.ooo.test systemd[1]: user-1002.slice: Consumed 7min 14.895s CPU time.
Oct 14 09:10:49 np0005486759.ooo.test podman[126012]: 2025-10-14 09:10:49.95764827 +0000 UTC m=+0.082426021 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:10:50 np0005486759.ooo.test sudo[126009]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:50 np0005486759.ooo.test podman[126012]: 2025-10-14 09:10:50.304545965 +0000 UTC m=+0.429323776 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, release=1, container_name=nova_migration_target, managed_by=tripleo_ansible, build-date=2025-07-21T14:48:37, version=17.1.9, batch=17.1_20250721.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64)
Oct 14 09:10:50 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:10:50 np0005486759.ooo.test sudo[126044]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9rgx_ab0/privsep.sock
Oct 14 09:10:50 np0005486759.ooo.test sudo[126044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:51 np0005486759.ooo.test sudo[126044]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:10:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:10:51 np0005486759.ooo.test sudo[126057]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0lic0vpu/privsep.sock
Oct 14 09:10:51 np0005486759.ooo.test sudo[126057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:51 np0005486759.ooo.test podman[126056]: 2025-10-14 09:10:51.458573252 +0000 UTC m=+0.077040874 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, version=17.1.9, batch=17.1_20250721.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 09:10:51 np0005486759.ooo.test podman[126054]: 2025-10-14 09:10:51.436672352 +0000 UTC m=+0.063884815 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., batch=17.1_20250721.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Oct 14 09:10:51 np0005486759.ooo.test podman[126056]: 2025-10-14 09:10:51.498303637 +0000 UTC m=+0.116771219 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, release=1, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_controller, build-date=2025-07-21T13:28:44, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:10:51 np0005486759.ooo.test podman[126056]: unhealthy
Oct 14 09:10:51 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:51 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:10:51 np0005486759.ooo.test podman[126054]: 2025-10-14 09:10:51.520310749 +0000 UTC m=+0.147523202 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, distribution-scope=public, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:10:51 np0005486759.ooo.test podman[126054]: unhealthy
Oct 14 09:10:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:10:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:10:51 np0005486759.ooo.test sudo[126057]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:52 np0005486759.ooo.test sudo[126104]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc4sk0xd5/privsep.sock
Oct 14 09:10:52 np0005486759.ooo.test sudo[126104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:52 np0005486759.ooo.test sudo[126104]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:53 np0005486759.ooo.test sudo[126121]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp95esqn5x/privsep.sock
Oct 14 09:10:53 np0005486759.ooo.test sudo[126121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:53 np0005486759.ooo.test sudo[126121]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:54 np0005486759.ooo.test sudo[126132]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplqly90t9/privsep.sock
Oct 14 09:10:54 np0005486759.ooo.test sudo[126132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:54 np0005486759.ooo.test sudo[126132]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:54 np0005486759.ooo.test sudo[126143]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5afzcqsv/privsep.sock
Oct 14 09:10:54 np0005486759.ooo.test sudo[126143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:55 np0005486759.ooo.test sudo[126143]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:55 np0005486759.ooo.test sudo[126154]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmyleyrbv/privsep.sock
Oct 14 09:10:55 np0005486759.ooo.test sudo[126154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:56 np0005486759.ooo.test sudo[126154]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:56 np0005486759.ooo.test sudo[126165]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptm0g62b0/privsep.sock
Oct 14 09:10:56 np0005486759.ooo.test sudo[126165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:57 np0005486759.ooo.test sudo[126165]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:57 np0005486759.ooo.test sudo[126176]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp326vzzhz/privsep.sock
Oct 14 09:10:57 np0005486759.ooo.test sudo[126176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:58 np0005486759.ooo.test sudo[126176]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:58 np0005486759.ooo.test sudo[126192]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpruna7cza/privsep.sock
Oct 14 09:10:58 np0005486759.ooo.test sudo[126192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:59 np0005486759.ooo.test sudo[126192]: pam_unix(sudo:session): session closed for user root
Oct 14 09:10:59 np0005486759.ooo.test sudo[126204]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0kmyl6nv/privsep.sock
Oct 14 09:10:59 np0005486759.ooo.test sudo[126204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:10:59 np0005486759.ooo.test sudo[126204]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:00 np0005486759.ooo.test sudo[126215]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfcsiu2d7/privsep.sock
Oct 14 09:11:00 np0005486759.ooo.test sudo[126215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:00 np0005486759.ooo.test sudo[126215]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:00 np0005486759.ooo.test sudo[126226]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpivqo1bdp/privsep.sock
Oct 14 09:11:00 np0005486759.ooo.test sudo[126226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:01 np0005486759.ooo.test sudo[126226]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:01 np0005486759.ooo.test sudo[126237]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6wnyxlt4/privsep.sock
Oct 14 09:11:01 np0005486759.ooo.test sudo[126237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:02 np0005486759.ooo.test sudo[126237]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:02 np0005486759.ooo.test sudo[126248]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyasjqkpb/privsep.sock
Oct 14 09:11:02 np0005486759.ooo.test sudo[126248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:03 np0005486759.ooo.test sudo[126248]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:03 np0005486759.ooo.test sudo[126259]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8yl7tt2r/privsep.sock
Oct 14 09:11:03 np0005486759.ooo.test sudo[126259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:04 np0005486759.ooo.test sudo[126259]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:04 np0005486759.ooo.test sudo[126276]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp97dzm53c/privsep.sock
Oct 14 09:11:04 np0005486759.ooo.test sudo[126276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:05 np0005486759.ooo.test sudo[126276]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:05 np0005486759.ooo.test sudo[126287]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2wbahuph/privsep.sock
Oct 14 09:11:05 np0005486759.ooo.test sudo[126287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:05 np0005486759.ooo.test sudo[126287]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:06 np0005486759.ooo.test sudo[126298]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoy86fyy0/privsep.sock
Oct 14 09:11:06 np0005486759.ooo.test sudo[126298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:06 np0005486759.ooo.test sudo[126298]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:07 np0005486759.ooo.test sudo[126309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg0rx6ium/privsep.sock
Oct 14 09:11:07 np0005486759.ooo.test sudo[126309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:11:07 np0005486759.ooo.test recover_tripleo_nova_virtqemud[126326]: 47951
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:11:07 np0005486759.ooo.test podman[126311]: 2025-10-14 09:11:07.121717053 +0000 UTC m=+0.083000129 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, release=1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, 
managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_id=tripleo_step3, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc.)
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: tmp-crun.LA3rjs.mount: Deactivated successfully.
Oct 14 09:11:07 np0005486759.ooo.test podman[126313]: 2025-10-14 09:11:07.181133609 +0000 UTC m=+0.135005025 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.9, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b)
Oct 14 09:11:07 np0005486759.ooo.test podman[126313]: 2025-10-14 09:11:07.187248338 +0000 UTC m=+0.141119834 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, distribution-scope=public, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:11:07 np0005486759.ooo.test podman[126312]: 2025-10-14 09:11:07.186813075 +0000 UTC m=+0.141919960 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, release=1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:11:07 np0005486759.ooo.test podman[126311]: 2025-10-14 09:11:07.207484147 +0000 UTC m=+0.168767273 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-07-21T13:27:15, description=Red Hat OpenStack Platform 
17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, batch=17.1_20250721.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid)
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:11:07 np0005486759.ooo.test podman[126312]: 2025-10-14 09:11:07.270469744 +0000 UTC m=+0.225576599 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-compute-container, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., config_id=tripleo_step5, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.9, release=1)
Oct 14 09:11:07 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:11:07 np0005486759.ooo.test sudo[126309]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:07 np0005486759.ooo.test sudo[126381]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpdyo5w00g/privsep.sock
Oct 14 09:11:07 np0005486759.ooo.test sudo[126381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:08 np0005486759.ooo.test sudo[126381]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:08 np0005486759.ooo.test sudo[126392]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqij5cj4e/privsep.sock
Oct 14 09:11:08 np0005486759.ooo.test sudo[126392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:09 np0005486759.ooo.test sudo[126392]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:09 np0005486759.ooo.test sudo[126409]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5jhrmkyc/privsep.sock
Oct 14 09:11:09 np0005486759.ooo.test sudo[126409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:10 np0005486759.ooo.test sudo[126409]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:10 np0005486759.ooo.test sudo[126420]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfge0b9ly/privsep.sock
Oct 14 09:11:10 np0005486759.ooo.test sudo[126420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:11 np0005486759.ooo.test sudo[126420]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:11 np0005486759.ooo.test sudo[126431]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfpftt2ul/privsep.sock
Oct 14 09:11:11 np0005486759.ooo.test sudo[126431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:11 np0005486759.ooo.test sudo[126431]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:12 np0005486759.ooo.test sudo[126442]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4udi892p/privsep.sock
Oct 14 09:11:12 np0005486759.ooo.test sudo[126442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:12 np0005486759.ooo.test sudo[126442]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:12 np0005486759.ooo.test sudo[126453]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpk6k70cy_/privsep.sock
Oct 14 09:11:12 np0005486759.ooo.test sudo[126453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:13 np0005486759.ooo.test sudo[126453]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:13 np0005486759.ooo.test sudo[126464]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj74urocw/privsep.sock
Oct 14 09:11:13 np0005486759.ooo.test sudo[126464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:14 np0005486759.ooo.test sudo[126464]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:14 np0005486759.ooo.test sudo[126478]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkl_1c08p/privsep.sock
Oct 14 09:11:14 np0005486759.ooo.test sudo[126478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:15 np0005486759.ooo.test sudo[126478]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:15 np0005486759.ooo.test sudo[126492]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx6ktfpiq/privsep.sock
Oct 14 09:11:15 np0005486759.ooo.test sudo[126492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:16 np0005486759.ooo.test sudo[126492]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:11:16 np0005486759.ooo.test podman[126498]: 2025-10-14 09:11:16.438044298 +0000 UTC m=+0.064529145 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.expose-services=, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-07-21T13:07:59, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:11:16 np0005486759.ooo.test sudo[126530]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbpfwuufs/privsep.sock
Oct 14 09:11:16 np0005486759.ooo.test sudo[126530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:16 np0005486759.ooo.test podman[126498]: 2025-10-14 09:11:16.628314328 +0000 UTC m=+0.254799145 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=metrics_qdr, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:59, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 09:11:16 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:11:17 np0005486759.ooo.test sudo[126530]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:17 np0005486759.ooo.test sudo[126541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuexjvck8/privsep.sock
Oct 14 09:11:17 np0005486759.ooo.test sudo[126541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:11:17 np0005486759.ooo.test podman[126543]: 2025-10-14 09:11:17.424076146 +0000 UTC m=+0.058221149 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, version=17.1.9, summary=Red Hat OpenStack 
Platform 17.1 cron, batch=17.1_20250721.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:11:17 np0005486759.ooo.test podman[126543]: 2025-10-14 09:11:17.428924117 +0000 UTC m=+0.063069110 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, container_name=logrotate_crond)
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:11:17 np0005486759.ooo.test podman[126544]: 2025-10-14 09:11:17.508826949 +0000 UTC m=+0.140711822 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1, managed_by=tripleo_ansible, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 09:11:17 np0005486759.ooo.test podman[126544]: 2025-10-14 09:11:17.551636289 +0000 UTC m=+0.183521182 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.openshift.expose-services=, release=1, batch=17.1_20250721.1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:11:17 np0005486759.ooo.test podman[126544]: unhealthy
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:11:17 np0005486759.ooo.test podman[126545]: 2025-10-14 09:11:17.511374448 +0000 UTC m=+0.137916595 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, config_id=tripleo_step4, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 09:11:17 np0005486759.ooo.test podman[126545]: 2025-10-14 09:11:17.596321997 +0000 UTC m=+0.222864154 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-07-21T15:29:47, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_id=tripleo_step4, release=1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, tcib_managed=true)
Oct 14 09:11:17 np0005486759.ooo.test podman[126545]: unhealthy
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:17 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:11:17 np0005486759.ooo.test sudo[126541]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:18 np0005486759.ooo.test sudo[126606]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpec35qm5v/privsep.sock
Oct 14 09:11:18 np0005486759.ooo.test sudo[126606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:18 np0005486759.ooo.test sudo[126606]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:19 np0005486759.ooo.test sudo[126617]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwmfsx1h9/privsep.sock
Oct 14 09:11:19 np0005486759.ooo.test sudo[126617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:19 np0005486759.ooo.test sudo[126617]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:20 np0005486759.ooo.test sudo[126630]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpam0wdgu3/privsep.sock
Oct 14 09:11:20 np0005486759.ooo.test sudo[126630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:11:20 np0005486759.ooo.test systemd[1]: tmp-crun.enOVzv.mount: Deactivated successfully.
Oct 14 09:11:20 np0005486759.ooo.test podman[126637]: 2025-10-14 09:11:20.454307102 +0000 UTC m=+0.086127366 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1, batch=17.1_20250721.1, version=17.1.9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, tcib_managed=true, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:11:20 np0005486759.ooo.test sudo[126630]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:20 np0005486759.ooo.test podman[126637]: 2025-10-14 09:11:20.770239736 +0000 UTC m=+0.402060040 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.9, build-date=2025-07-21T14:48:37, release=1, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Oct 14 09:11:20 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:11:20 np0005486759.ooo.test sudo[126668]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf5dkvvh3/privsep.sock
Oct 14 09:11:20 np0005486759.ooo.test sudo[126668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:21 np0005486759.ooo.test sudo[126668]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:21 np0005486759.ooo.test sudo[126679]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp514sfcpn/privsep.sock
Oct 14 09:11:21 np0005486759.ooo.test sudo[126679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:11:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:11:21 np0005486759.ooo.test systemd[1]: tmp-crun.U2V4EK.mount: Deactivated successfully.
Oct 14 09:11:21 np0005486759.ooo.test podman[126681]: 2025-10-14 09:11:21.766725809 +0000 UTC m=+0.076715704 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20250721.1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.33.12, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-07-21T16:28:53, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:11:21 np0005486759.ooo.test podman[126681]: 2025-10-14 09:11:21.803289684 +0000 UTC m=+0.113279569 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, architecture=x86_64, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, managed_by=tripleo_ansible, tcib_managed=true)
Oct 14 09:11:21 np0005486759.ooo.test podman[126681]: unhealthy
Oct 14 09:11:21 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:21 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:11:21 np0005486759.ooo.test podman[126682]: 2025-10-14 09:11:21.815292368 +0000 UTC m=+0.121150235 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, version=17.1.9, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, vendor=Red Hat, Inc., container_name=ovn_controller, distribution-scope=public, build-date=2025-07-21T13:28:44)
Oct 14 09:11:21 np0005486759.ooo.test podman[126682]: 2025-10-14 09:11:21.827686582 +0000 UTC m=+0.133544469 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 09:11:21 np0005486759.ooo.test podman[126682]: unhealthy
Oct 14 09:11:21 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:21 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:11:22 np0005486759.ooo.test sudo[126679]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:22 np0005486759.ooo.test sudo[126726]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm6vbq_to/privsep.sock
Oct 14 09:11:22 np0005486759.ooo.test sudo[126726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:23 np0005486759.ooo.test sudo[126726]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:23 np0005486759.ooo.test sudo[126737]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph3ho13dc/privsep.sock
Oct 14 09:11:23 np0005486759.ooo.test sudo[126737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:24 np0005486759.ooo.test sudo[126737]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:24 np0005486759.ooo.test sudo[126748]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpo9do90ot/privsep.sock
Oct 14 09:11:24 np0005486759.ooo.test sudo[126748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:25 np0005486759.ooo.test sudo[126748]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:25 np0005486759.ooo.test sudo[126761]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy4hu_q6d/privsep.sock
Oct 14 09:11:25 np0005486759.ooo.test sudo[126761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:25 np0005486759.ooo.test sudo[126761]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:26 np0005486759.ooo.test sudo[126776]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6jh8nbw8/privsep.sock
Oct 14 09:11:26 np0005486759.ooo.test sudo[126776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:26 np0005486759.ooo.test sudo[126776]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:27 np0005486759.ooo.test sudo[126787]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp43714a2k/privsep.sock
Oct 14 09:11:27 np0005486759.ooo.test sudo[126787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:27 np0005486759.ooo.test sudo[126787]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:28 np0005486759.ooo.test sudo[126798]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm4p4a3jl/privsep.sock
Oct 14 09:11:28 np0005486759.ooo.test sudo[126798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:28 np0005486759.ooo.test sudo[126798]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:28 np0005486759.ooo.test sudo[126809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5k59fh5g/privsep.sock
Oct 14 09:11:28 np0005486759.ooo.test sudo[126809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:29 np0005486759.ooo.test sudo[126809]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:29 np0005486759.ooo.test sudo[126820]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgzqb_z_j/privsep.sock
Oct 14 09:11:29 np0005486759.ooo.test sudo[126820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:30 np0005486759.ooo.test sudo[126820]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:30 np0005486759.ooo.test sudo[126831]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6xp_tl2x/privsep.sock
Oct 14 09:11:30 np0005486759.ooo.test sudo[126831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:30 np0005486759.ooo.test sudo[126831]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:31 np0005486759.ooo.test sudo[126848]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpthi5s4dv/privsep.sock
Oct 14 09:11:31 np0005486759.ooo.test sudo[126848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:31 np0005486759.ooo.test sudo[126848]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:32 np0005486759.ooo.test sudo[126859]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9xdsd93v/privsep.sock
Oct 14 09:11:32 np0005486759.ooo.test sudo[126859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:32 np0005486759.ooo.test sudo[126859]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:33 np0005486759.ooo.test sudo[126870]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn33h8525/privsep.sock
Oct 14 09:11:33 np0005486759.ooo.test sudo[126870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:33 np0005486759.ooo.test sudo[126870]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:33 np0005486759.ooo.test sudo[126881]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgpof04lt/privsep.sock
Oct 14 09:11:33 np0005486759.ooo.test sudo[126881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:34 np0005486759.ooo.test sudo[126881]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:34 np0005486759.ooo.test sudo[126892]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpczghjkvs/privsep.sock
Oct 14 09:11:34 np0005486759.ooo.test sudo[126892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:35 np0005486759.ooo.test sudo[126892]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:35 np0005486759.ooo.test sudo[126903]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpehizgpxo/privsep.sock
Oct 14 09:11:35 np0005486759.ooo.test sudo[126903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:36 np0005486759.ooo.test sudo[126903]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:36 np0005486759.ooo.test sudo[126916]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp13oo0q4q/privsep.sock
Oct 14 09:11:36 np0005486759.ooo.test sudo[126916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:36 np0005486759.ooo.test sudo[126916]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:37 np0005486759.ooo.test sudo[126931]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp177t2_fy/privsep.sock
Oct 14 09:11:37 np0005486759.ooo.test sudo[126931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:11:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:11:37 np0005486759.ooo.test systemd[1]: tmp-crun.OPS3gX.mount: Deactivated successfully.
Oct 14 09:11:37 np0005486759.ooo.test podman[126934]: 2025-10-14 09:11:37.313799835 +0000 UTC m=+0.058685164 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=, release=2, version=17.1.9, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, build-date=2025-07-21T13:04:03, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, io.buildah.version=1.33.12, distribution-scope=public)
Oct 14 09:11:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:11:37 np0005486759.ooo.test podman[126933]: 2025-10-14 09:11:37.370740343 +0000 UTC m=+0.118038627 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, release=1, container_name=iscsid, vcs-type=git, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 09:11:37 np0005486759.ooo.test podman[126934]: 2025-10-14 09:11:37.397639989 +0000 UTC m=+0.142525338 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, architecture=x86_64, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, batch=17.1_20250721.1, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=2, io.buildah.version=1.33.12)
Oct 14 09:11:37 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:11:37 np0005486759.ooo.test podman[126933]: 2025-10-14 09:11:37.408403724 +0000 UTC m=+0.155702048 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, 
managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 09:11:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:11:37 np0005486759.ooo.test podman[126961]: 2025-10-14 09:11:37.401644383 +0000 UTC m=+0.061121229 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_id=tripleo_step5, tcib_managed=true, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:11:37 np0005486759.ooo.test podman[126961]: 2025-10-14 09:11:37.483351762 +0000 UTC m=+0.142828588 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, release=1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step5, version=17.1.9, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 09:11:37 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:11:37 np0005486759.ooo.test sudo[126931]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:38 np0005486759.ooo.test sudo[127003]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph5973o9f/privsep.sock
Oct 14 09:11:38 np0005486759.ooo.test sudo[127003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:38 np0005486759.ooo.test sudo[127003]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:38 np0005486759.ooo.test sudo[127014]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_kild690/privsep.sock
Oct 14 09:11:38 np0005486759.ooo.test sudo[127014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:39 np0005486759.ooo.test sudo[127014]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:39 np0005486759.ooo.test sudo[127025]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpya0hkprj/privsep.sock
Oct 14 09:11:39 np0005486759.ooo.test sudo[127025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:40 np0005486759.ooo.test sudo[127025]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:40 np0005486759.ooo.test sudo[127036]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprvktf5tl/privsep.sock
Oct 14 09:11:40 np0005486759.ooo.test sudo[127036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:41 np0005486759.ooo.test sudo[127036]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:41 np0005486759.ooo.test sudo[127047]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpon7m_861/privsep.sock
Oct 14 09:11:41 np0005486759.ooo.test sudo[127047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:42 np0005486759.ooo.test sudo[127047]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:42 np0005486759.ooo.test sudo[127064]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpojlopomu/privsep.sock
Oct 14 09:11:42 np0005486759.ooo.test sudo[127064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:42 np0005486759.ooo.test sudo[127064]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:43 np0005486759.ooo.test sudo[127075]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph1on6ygr/privsep.sock
Oct 14 09:11:43 np0005486759.ooo.test sudo[127075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:43 np0005486759.ooo.test sudo[127075]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:44 np0005486759.ooo.test sudo[127086]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7yeavewp/privsep.sock
Oct 14 09:11:44 np0005486759.ooo.test sudo[127086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:44 np0005486759.ooo.test sudo[127086]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:44 np0005486759.ooo.test sudo[127097]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzc1zq0qn/privsep.sock
Oct 14 09:11:44 np0005486759.ooo.test sudo[127097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:45 np0005486759.ooo.test sudo[127097]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:45 np0005486759.ooo.test sudo[127108]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp41g3grms/privsep.sock
Oct 14 09:11:45 np0005486759.ooo.test sudo[127108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:46 np0005486759.ooo.test sudo[127108]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:46 np0005486759.ooo.test sudo[127119]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphxeyczo7/privsep.sock
Oct 14 09:11:46 np0005486759.ooo.test sudo[127119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:47 np0005486759.ooo.test sudo[127119]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:11:47 np0005486759.ooo.test podman[127130]: 2025-10-14 09:11:47.262705479 +0000 UTC m=+0.048971071 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, release=1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd)
Oct 14 09:11:47 np0005486759.ooo.test sudo[127166]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppe1rn9tj/privsep.sock
Oct 14 09:11:47 np0005486759.ooo.test sudo[127166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:47 np0005486759.ooo.test podman[127130]: 2025-10-14 09:11:47.441884535 +0000 UTC m=+0.228150067 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: tmp-crun.kFF8ro.mount: Deactivated successfully.
Oct 14 09:11:47 np0005486759.ooo.test podman[127168]: 2025-10-14 09:11:47.541993085 +0000 UTC m=+0.077136267 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, name=rhosp17/openstack-cron, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:11:47 np0005486759.ooo.test podman[127168]: 2025-10-14 09:11:47.577259771 +0000 UTC m=+0.112402913 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, architecture=x86_64, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-07-21T13:07:52)
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:11:47 np0005486759.ooo.test podman[127188]: 2025-10-14 09:11:47.685065559 +0000 UTC m=+0.073356380 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.9, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Oct 14 09:11:47 np0005486759.ooo.test podman[127188]: 2025-10-14 09:11:47.696468353 +0000 UTC m=+0.084759204 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, version=17.1.9, build-date=2025-07-21T14:45:33, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Oct 14 09:11:47 np0005486759.ooo.test podman[127188]: unhealthy
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:11:47 np0005486759.ooo.test podman[127203]: 2025-10-14 09:11:47.766976243 +0000 UTC m=+0.076367523 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, release=1, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, batch=17.1_20250721.1, architecture=x86_64, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git)
Oct 14 09:11:47 np0005486759.ooo.test podman[127203]: 2025-10-14 09:11:47.811344141 +0000 UTC m=+0.120735421 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, io.buildah.version=1.33.12, build-date=2025-07-21T15:29:47, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, batch=17.1_20250721.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Oct 14 09:11:47 np0005486759.ooo.test podman[127203]: unhealthy
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:47 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:11:48 np0005486759.ooo.test sudo[127166]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:48 np0005486759.ooo.test systemd[1]: tmp-crun.Pixh68.mount: Deactivated successfully.
Oct 14 09:11:48 np0005486759.ooo.test sudo[127235]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1y3v1hua/privsep.sock
Oct 14 09:11:48 np0005486759.ooo.test sudo[127235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:48 np0005486759.ooo.test sudo[127235]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:49 np0005486759.ooo.test sudo[127246]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprvqd9hfa/privsep.sock
Oct 14 09:11:49 np0005486759.ooo.test sudo[127246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:49 np0005486759.ooo.test sudo[127246]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:50 np0005486759.ooo.test sudo[127257]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgu3cke9b/privsep.sock
Oct 14 09:11:50 np0005486759.ooo.test sudo[127257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:50 np0005486759.ooo.test sudo[127257]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:50 np0005486759.ooo.test sudo[127268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpth7ezzor/privsep.sock
Oct 14 09:11:50 np0005486759.ooo.test sudo[127268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:51 np0005486759.ooo.test sudo[127268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:11:51 np0005486759.ooo.test systemd[1]: tmp-crun.U6WyiZ.mount: Deactivated successfully.
Oct 14 09:11:51 np0005486759.ooo.test podman[127274]: 2025-10-14 09:11:51.408761625 +0000 UTC m=+0.075908729 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, release=1, batch=17.1_20250721.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:11:51 np0005486759.ooo.test sudo[127302]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvxjhu99n/privsep.sock
Oct 14 09:11:51 np0005486759.ooo.test sudo[127302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:51 np0005486759.ooo.test podman[127274]: 2025-10-14 09:11:51.762418621 +0000 UTC m=+0.429565715 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true, batch=17.1_20250721.1, container_name=nova_migration_target, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 
17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_id=tripleo_step4)
Oct 14 09:11:51 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:11:52 np0005486759.ooo.test sudo[127302]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:11:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:11:52 np0005486759.ooo.test podman[127310]: 2025-10-14 09:11:52.200424296 +0000 UTC m=+0.071699889 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.buildah.version=1.33.12, tcib_managed=true, version=17.1.9, build-date=2025-07-21T13:28:44, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:11:52 np0005486759.ooo.test podman[127310]: 2025-10-14 09:11:52.233476533 +0000 UTC m=+0.104752126 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., version=17.1.9, architecture=x86_64, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:11:52 np0005486759.ooo.test podman[127310]: unhealthy
Oct 14 09:11:52 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:52 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:11:52 np0005486759.ooo.test podman[127308]: 2025-10-14 09:11:52.18703405 +0000 UTC m=+0.065569387 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, release=1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, distribution-scope=public)
Oct 14 09:11:52 np0005486759.ooo.test podman[127308]: 2025-10-14 09:11:52.315530501 +0000 UTC m=+0.194065798 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-07-21T16:28:53, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:11:52 np0005486759.ooo.test podman[127308]: unhealthy
Oct 14 09:11:52 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:11:52 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:11:52 np0005486759.ooo.test sudo[127356]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp25orvx8i/privsep.sock
Oct 14 09:11:52 np0005486759.ooo.test sudo[127356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:52 np0005486759.ooo.test systemd[1]: tmp-crun.GZLHJS.mount: Deactivated successfully.
Oct 14 09:11:53 np0005486759.ooo.test sudo[127356]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:53 np0005486759.ooo.test sudo[127371]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc3x194nx/privsep.sock
Oct 14 09:11:53 np0005486759.ooo.test sudo[127371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:53 np0005486759.ooo.test sudo[127371]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:54 np0005486759.ooo.test sudo[127382]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpepaplgr5/privsep.sock
Oct 14 09:11:54 np0005486759.ooo.test sudo[127382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:54 np0005486759.ooo.test sudo[127382]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:55 np0005486759.ooo.test sudo[127393]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi1yfr46y/privsep.sock
Oct 14 09:11:55 np0005486759.ooo.test sudo[127393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:55 np0005486759.ooo.test sudo[127393]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:55 np0005486759.ooo.test sudo[127404]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcpe456o2/privsep.sock
Oct 14 09:11:55 np0005486759.ooo.test sudo[127404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:56 np0005486759.ooo.test sudo[127404]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:56 np0005486759.ooo.test sudo[127415]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphps5oyeu/privsep.sock
Oct 14 09:11:56 np0005486759.ooo.test sudo[127415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:57 np0005486759.ooo.test sudo[127415]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:57 np0005486759.ooo.test sudo[127426]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc74val83/privsep.sock
Oct 14 09:11:57 np0005486759.ooo.test sudo[127426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:58 np0005486759.ooo.test sudo[127426]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:58 np0005486759.ooo.test sudo[127443]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9xcc4d67/privsep.sock
Oct 14 09:11:58 np0005486759.ooo.test sudo[127443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:59 np0005486759.ooo.test sudo[127443]: pam_unix(sudo:session): session closed for user root
Oct 14 09:11:59 np0005486759.ooo.test sudo[127454]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbvpspnco/privsep.sock
Oct 14 09:11:59 np0005486759.ooo.test sudo[127454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:11:59 np0005486759.ooo.test sudo[127454]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:00 np0005486759.ooo.test sudo[127465]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg40maxoz/privsep.sock
Oct 14 09:12:00 np0005486759.ooo.test sudo[127465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:00 np0005486759.ooo.test sudo[127465]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:01 np0005486759.ooo.test sudo[127476]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj1jeoxg2/privsep.sock
Oct 14 09:12:01 np0005486759.ooo.test sudo[127476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:01 np0005486759.ooo.test sudo[127476]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:01 np0005486759.ooo.test sudo[127487]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4y5pb6xu/privsep.sock
Oct 14 09:12:01 np0005486759.ooo.test sudo[127487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:02 np0005486759.ooo.test sudo[127487]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:02 np0005486759.ooo.test sudo[127498]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_l45y3u5/privsep.sock
Oct 14 09:12:02 np0005486759.ooo.test sudo[127498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:03 np0005486759.ooo.test sudo[127498]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:03 np0005486759.ooo.test sudo[127514]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpynjfadxc/privsep.sock
Oct 14 09:12:03 np0005486759.ooo.test sudo[127514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:04 np0005486759.ooo.test sudo[127514]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:04 np0005486759.ooo.test sudo[127526]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1d9my_dp/privsep.sock
Oct 14 09:12:04 np0005486759.ooo.test sudo[127526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:04 np0005486759.ooo.test sudo[127526]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:05 np0005486759.ooo.test sudo[127537]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1y5uz8w1/privsep.sock
Oct 14 09:12:05 np0005486759.ooo.test sudo[127537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:05 np0005486759.ooo.test sudo[127537]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:06 np0005486759.ooo.test sudo[127548]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxon1__2t/privsep.sock
Oct 14 09:12:06 np0005486759.ooo.test sudo[127548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:06 np0005486759.ooo.test sudo[127548]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:06 np0005486759.ooo.test sudo[127559]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp81domrzt/privsep.sock
Oct 14 09:12:06 np0005486759.ooo.test sudo[127559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:07 np0005486759.ooo.test sudo[127559]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:12:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:12:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:12:07 np0005486759.ooo.test podman[127565]: 2025-10-14 09:12:07.622913651 +0000 UTC m=+0.068196759 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, io.buildah.version=1.33.12, release=1, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64)
Oct 14 09:12:07 np0005486759.ooo.test podman[127566]: 2025-10-14 09:12:07.673637877 +0000 UTC m=+0.115603082 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, version=17.1.9, container_name=nova_compute, release=1, vcs-type=git, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 09:12:07 np0005486759.ooo.test podman[127572]: 2025-10-14 09:12:07.640853019 +0000 UTC m=+0.075542377 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20250721.1, managed_by=tripleo_ansible, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=2, version=17.1.9, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b)
Oct 14 09:12:07 np0005486759.ooo.test podman[127565]: 2025-10-14 09:12:07.707421766 +0000 UTC m=+0.152704914 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, version=17.1.9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 09:12:07 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:12:07 np0005486759.ooo.test podman[127566]: 2025-10-14 09:12:07.756337486 +0000 UTC m=+0.198302651 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20250721.1, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, container_name=nova_compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12)
Oct 14 09:12:07 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:12:07 np0005486759.ooo.test podman[127572]: 2025-10-14 09:12:07.776518513 +0000 UTC m=+0.211207891 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.33.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team)
Oct 14 09:12:07 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:12:07 np0005486759.ooo.test sudo[127634]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpduf8xe4e/privsep.sock
Oct 14 09:12:07 np0005486759.ooo.test sudo[127634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:08 np0005486759.ooo.test sudo[127634]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:08 np0005486759.ooo.test sudo[127647]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8xsxmdy9/privsep.sock
Oct 14 09:12:08 np0005486759.ooo.test sudo[127647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:09 np0005486759.ooo.test sudo[127647]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:09 np0005486759.ooo.test sudo[127662]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwehodvi4/privsep.sock
Oct 14 09:12:09 np0005486759.ooo.test sudo[127662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:10 np0005486759.ooo.test sudo[127662]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:10 np0005486759.ooo.test sudo[127673]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp40_4fjg7/privsep.sock
Oct 14 09:12:10 np0005486759.ooo.test sudo[127673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:10 np0005486759.ooo.test sudo[127673]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:11 np0005486759.ooo.test sudo[127684]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph0qn6wb4/privsep.sock
Oct 14 09:12:11 np0005486759.ooo.test sudo[127684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:11 np0005486759.ooo.test sudo[127684]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:12 np0005486759.ooo.test sudo[127695]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprvnr2b84/privsep.sock
Oct 14 09:12:12 np0005486759.ooo.test sudo[127695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:12 np0005486759.ooo.test sudo[127695]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:12 np0005486759.ooo.test sudo[127706]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6r7qfa5u/privsep.sock
Oct 14 09:12:12 np0005486759.ooo.test sudo[127706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:13 np0005486759.ooo.test sudo[127706]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:13 np0005486759.ooo.test sudo[127717]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm584kwpx/privsep.sock
Oct 14 09:12:13 np0005486759.ooo.test sudo[127717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:14 np0005486759.ooo.test sudo[127717]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:14 np0005486759.ooo.test sudo[127734]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg4wevurk/privsep.sock
Oct 14 09:12:14 np0005486759.ooo.test sudo[127734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:15 np0005486759.ooo.test sudo[127734]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:15 np0005486759.ooo.test sudo[127745]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptdyzbb0x/privsep.sock
Oct 14 09:12:15 np0005486759.ooo.test sudo[127745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:16 np0005486759.ooo.test sudo[127745]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:16 np0005486759.ooo.test sudo[127756]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpftgh8zhm/privsep.sock
Oct 14 09:12:16 np0005486759.ooo.test sudo[127756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:17 np0005486759.ooo.test sudo[127756]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:17 np0005486759.ooo.test sudo[127767]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqy3sdbem/privsep.sock
Oct 14 09:12:17 np0005486759.ooo.test sudo[127767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:17 np0005486759.ooo.test sudo[127767]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:12:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:12:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:12:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:12:17 np0005486759.ooo.test podman[127771]: 2025-10-14 09:12:17.917070171 +0000 UTC m=+0.078674325 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, 
tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9)
Oct 14 09:12:17 np0005486759.ooo.test podman[127771]: 2025-10-14 09:12:17.951237742 +0000 UTC m=+0.112841916 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, release=1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, summary=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.9, build-date=2025-07-21T13:07:52, batch=17.1_20250721.1, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:12:17 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:12:17 np0005486759.ooo.test podman[127774]: 2025-10-14 09:12:17.97114285 +0000 UTC m=+0.128335427 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-07-21T14:45:33, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Oct 14 09:12:17 np0005486759.ooo.test podman[127775]: 2025-10-14 09:12:17.939015003 +0000 UTC m=+0.090492032 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, build-date=2025-07-21T13:07:59, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, 
name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., container_name=metrics_qdr, tcib_managed=true)
Oct 14 09:12:18 np0005486759.ooo.test podman[127774]: 2025-10-14 09:12:18.009255795 +0000 UTC m=+0.166448382 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=1, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.9, io.buildah.version=1.33.12, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4)
Oct 14 09:12:18 np0005486759.ooo.test podman[127774]: unhealthy
Oct 14 09:12:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:12:18 np0005486759.ooo.test podman[127781]: 2025-10-14 09:12:18.044969714 +0000 UTC m=+0.194532574 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, version=17.1.9, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., 
architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, build-date=2025-07-21T15:29:47, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:12:18 np0005486759.ooo.test sudo[127862]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpw263rrvn/privsep.sock
Oct 14 09:12:18 np0005486759.ooo.test sudo[127862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:18 np0005486759.ooo.test podman[127781]: 2025-10-14 09:12:18.083368226 +0000 UTC m=+0.232931106 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1, vcs-type=git, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, 
vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1)
Oct 14 09:12:18 np0005486759.ooo.test podman[127781]: unhealthy
Oct 14 09:12:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:12:18 np0005486759.ooo.test podman[127775]: 2025-10-14 09:12:18.136369813 +0000 UTC m=+0.287846892 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, release=1, io.buildah.version=1.33.12, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, batch=17.1_20250721.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Oct 14 09:12:18 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:12:18 np0005486759.ooo.test sudo[127862]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:18 np0005486759.ooo.test sudo[127875]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf7s5do2j/privsep.sock
Oct 14 09:12:18 np0005486759.ooo.test sudo[127875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:18 np0005486759.ooo.test systemd[1]: tmp-crun.cx9AZR.mount: Deactivated successfully.
Oct 14 09:12:19 np0005486759.ooo.test sudo[127875]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:19 np0005486759.ooo.test sudo[127891]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj0nnmz4o/privsep.sock
Oct 14 09:12:19 np0005486759.ooo.test sudo[127891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:20 np0005486759.ooo.test sudo[127891]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:20 np0005486759.ooo.test sudo[127903]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc6jr6y2k/privsep.sock
Oct 14 09:12:20 np0005486759.ooo.test sudo[127903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:21 np0005486759.ooo.test sudo[127903]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:21 np0005486759.ooo.test sudo[127914]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp51or13gs/privsep.sock
Oct 14 09:12:21 np0005486759.ooo.test sudo[127914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:21 np0005486759.ooo.test sudo[127914]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:12:22 np0005486759.ooo.test podman[127918]: 2025-10-14 09:12:22.037730528 +0000 UTC m=+0.058072095 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, name=rhosp17/openstack-nova-compute, tcib_managed=true, batch=17.1_20250721.1, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, managed_by=tripleo_ansible, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_migration_target, io.buildah.version=1.33.12)
Oct 14 09:12:22 np0005486759.ooo.test sudo[127947]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1b34cxvt/privsep.sock
Oct 14 09:12:22 np0005486759.ooo.test sudo[127947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:12:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:12:22 np0005486759.ooo.test podman[127952]: 2025-10-14 09:12:22.438871548 +0000 UTC m=+0.064341280 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T13:28:44, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, batch=17.1_20250721.1, name=rhosp17/openstack-ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Oct 14 09:12:22 np0005486759.ooo.test podman[127952]: 2025-10-14 09:12:22.455314149 +0000 UTC m=+0.080783931 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, batch=17.1_20250721.1, io.buildah.version=1.33.12, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vcs-type=git, vendor=Red Hat, Inc., release=1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller)
Oct 14 09:12:22 np0005486759.ooo.test podman[127952]: unhealthy
Oct 14 09:12:22 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:22 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:12:22 np0005486759.ooo.test podman[127918]: 2025-10-14 09:12:22.467666462 +0000 UTC m=+0.488008029 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20250721.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, version=17.1.9, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Oct 14 09:12:22 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:12:22 np0005486759.ooo.test podman[127951]: 2025-10-14 09:12:22.547732069 +0000 UTC m=+0.173982475 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, 
release=1, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, architecture=x86_64, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:12:22 np0005486759.ooo.test podman[127951]: 2025-10-14 09:12:22.563304184 +0000 UTC m=+0.189554680 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, tcib_managed=true, architecture=x86_64, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, version=17.1.9)
Oct 14 09:12:22 np0005486759.ooo.test podman[127951]: unhealthy
Oct 14 09:12:22 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:22 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:12:22 np0005486759.ooo.test sudo[127947]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:23 np0005486759.ooo.test sudo[128000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjn8i8v7w/privsep.sock
Oct 14 09:12:23 np0005486759.ooo.test sudo[128000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:23 np0005486759.ooo.test sudo[128000]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:23 np0005486759.ooo.test sudo[128011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptrzha7_t/privsep.sock
Oct 14 09:12:23 np0005486759.ooo.test sudo[128011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:24 np0005486759.ooo.test sudo[128011]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:24 np0005486759.ooo.test sudo[128022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoq07q73i/privsep.sock
Oct 14 09:12:24 np0005486759.ooo.test sudo[128022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:25 np0005486759.ooo.test sudo[128022]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:25 np0005486759.ooo.test sudo[128039]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph6g5pulr/privsep.sock
Oct 14 09:12:25 np0005486759.ooo.test sudo[128039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:26 np0005486759.ooo.test sudo[128039]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:26 np0005486759.ooo.test sudo[128050]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcngrmidk/privsep.sock
Oct 14 09:12:26 np0005486759.ooo.test sudo[128050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:27 np0005486759.ooo.test sudo[128050]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:27 np0005486759.ooo.test sudo[128061]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplv50px01/privsep.sock
Oct 14 09:12:27 np0005486759.ooo.test sudo[128061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:27 np0005486759.ooo.test sudo[128061]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:28 np0005486759.ooo.test sudo[128072]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc4bqep6a/privsep.sock
Oct 14 09:12:28 np0005486759.ooo.test sudo[128072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:28 np0005486759.ooo.test sudo[128072]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:29 np0005486759.ooo.test sudo[128083]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcp0h97lm/privsep.sock
Oct 14 09:12:29 np0005486759.ooo.test sudo[128083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:29 np0005486759.ooo.test sudo[128083]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:30 np0005486759.ooo.test sudo[128094]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzn_rbt7z/privsep.sock
Oct 14 09:12:30 np0005486759.ooo.test sudo[128094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:30 np0005486759.ooo.test sudo[128094]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:30 np0005486759.ooo.test sudo[128111]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn5etlzpu/privsep.sock
Oct 14 09:12:30 np0005486759.ooo.test sudo[128111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:31 np0005486759.ooo.test sudo[128111]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:31 np0005486759.ooo.test sudo[128122]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3zdlcdsr/privsep.sock
Oct 14 09:12:31 np0005486759.ooo.test sudo[128122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:32 np0005486759.ooo.test sudo[128122]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:32 np0005486759.ooo.test sudo[128133]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2qauiy3r/privsep.sock
Oct 14 09:12:32 np0005486759.ooo.test sudo[128133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:33 np0005486759.ooo.test sudo[128133]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:33 np0005486759.ooo.test sudo[128144]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx9bpye37/privsep.sock
Oct 14 09:12:33 np0005486759.ooo.test sudo[128144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:33 np0005486759.ooo.test sudo[128144]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:34 np0005486759.ooo.test sudo[128155]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxxsx73gx/privsep.sock
Oct 14 09:12:34 np0005486759.ooo.test sudo[128155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:34 np0005486759.ooo.test sudo[128155]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:35 np0005486759.ooo.test sudo[128166]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpisikjypa/privsep.sock
Oct 14 09:12:35 np0005486759.ooo.test sudo[128166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:35 np0005486759.ooo.test sudo[128166]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:35 np0005486759.ooo.test sudo[128182]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpevazagd6/privsep.sock
Oct 14 09:12:35 np0005486759.ooo.test sudo[128182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:36 np0005486759.ooo.test sudo[128182]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:36 np0005486759.ooo.test sudo[128194]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe1xtdpxx/privsep.sock
Oct 14 09:12:36 np0005486759.ooo.test sudo[128194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:37 np0005486759.ooo.test sudo[128194]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:37 np0005486759.ooo.test sudo[128205]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl6wmr7o_/privsep.sock
Oct 14 09:12:37 np0005486759.ooo.test sudo[128205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:38 np0005486759.ooo.test sudo[128205]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:12:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:12:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:12:38 np0005486759.ooo.test podman[128210]: 2025-10-14 09:12:38.403067612 +0000 UTC m=+0.095578140 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, version=17.1.9, com.redhat.component=openstack-iscsid-container, container_name=iscsid, tcib_managed=true, distribution-scope=public, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Oct 14 09:12:38 np0005486759.ooo.test podman[128210]: 2025-10-14 09:12:38.411183244 +0000 UTC m=+0.103693772 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20250721.1, vcs-type=git, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, 
build-date=2025-07-21T13:27:15, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=)
Oct 14 09:12:38 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:12:38 np0005486759.ooo.test podman[128213]: 2025-10-14 09:12:38.455560352 +0000 UTC m=+0.144616463 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 09:12:38 np0005486759.ooo.test podman[128213]: 2025-10-14 09:12:38.46739529 +0000 UTC m=+0.156451351 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.expose-services=, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Oct 14 09:12:38 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:12:38 np0005486759.ooo.test podman[128212]: 2025-10-14 09:12:38.383232126 +0000 UTC m=+0.075673722 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public)
Oct 14 09:12:38 np0005486759.ooo.test podman[128212]: 2025-10-14 09:12:38.516293108 +0000 UTC m=+0.208734704 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-07-21T14:48:37, release=1, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.33.12, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step5, vendor=Red Hat, Inc.)
Oct 14 09:12:38 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:12:38 np0005486759.ooo.test sudo[128277]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9c1sbgn4/privsep.sock
Oct 14 09:12:38 np0005486759.ooo.test sudo[128277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:39 np0005486759.ooo.test sudo[128277]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:39 np0005486759.ooo.test sudo[128288]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7d21f783/privsep.sock
Oct 14 09:12:39 np0005486759.ooo.test sudo[128288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:40 np0005486759.ooo.test sudo[128288]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:40 np0005486759.ooo.test sudo[128299]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqtm5xe7s/privsep.sock
Oct 14 09:12:40 np0005486759.ooo.test sudo[128299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:40 np0005486759.ooo.test sudo[128299]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:41 np0005486759.ooo.test sudo[128312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpast5qql8/privsep.sock
Oct 14 09:12:41 np0005486759.ooo.test sudo[128312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:41 np0005486759.ooo.test sudo[128312]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:41 np0005486759.ooo.test sudo[128327]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi1ruadhw/privsep.sock
Oct 14 09:12:41 np0005486759.ooo.test sudo[128327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:42 np0005486759.ooo.test sudo[128327]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:42 np0005486759.ooo.test sudo[128338]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr7ahn66j/privsep.sock
Oct 14 09:12:42 np0005486759.ooo.test sudo[128338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:43 np0005486759.ooo.test sudo[128338]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:43 np0005486759.ooo.test sudo[128349]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4q097qe5/privsep.sock
Oct 14 09:12:43 np0005486759.ooo.test sudo[128349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:44 np0005486759.ooo.test sudo[128349]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:44 np0005486759.ooo.test sudo[128360]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqdf2u76o/privsep.sock
Oct 14 09:12:44 np0005486759.ooo.test sudo[128360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:45 np0005486759.ooo.test sudo[128360]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:45 np0005486759.ooo.test sudo[128371]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx7hyk5_e/privsep.sock
Oct 14 09:12:45 np0005486759.ooo.test sudo[128371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:46 np0005486759.ooo.test sudo[128371]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:46 np0005486759.ooo.test sudo[128382]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3il8h0d1/privsep.sock
Oct 14 09:12:46 np0005486759.ooo.test sudo[128382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:46 np0005486759.ooo.test sudo[128382]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:47 np0005486759.ooo.test sudo[128399]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe50t0ble/privsep.sock
Oct 14 09:12:47 np0005486759.ooo.test sudo[128399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:47 np0005486759.ooo.test sudo[128399]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:48 np0005486759.ooo.test sudo[128410]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp46qc2q6g/privsep.sock
Oct 14 09:12:48 np0005486759.ooo.test sudo[128410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:12:48 np0005486759.ooo.test podman[128413]: 2025-10-14 09:12:48.259652138 +0000 UTC m=+0.087489518 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, version=17.1.9, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, managed_by=tripleo_ansible, release=1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:45:33, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1)
Oct 14 09:12:48 np0005486759.ooo.test podman[128413]: 2025-10-14 09:12:48.294321936 +0000 UTC m=+0.122159296 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=17.1.9, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4)
Oct 14 09:12:48 np0005486759.ooo.test podman[128413]: unhealthy
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:12:48 np0005486759.ooo.test podman[128414]: 2025-10-14 09:12:48.31026077 +0000 UTC m=+0.140257027 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, build-date=2025-07-21T15:29:47, container_name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:12:48 np0005486759.ooo.test podman[128414]: 2025-10-14 09:12:48.322221613 +0000 UTC m=+0.152217870 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-07-21T15:29:47, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, batch=17.1_20250721.1, io.buildah.version=1.33.12, io.openshift.expose-services=, release=1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4)
Oct 14 09:12:48 np0005486759.ooo.test podman[128414]: unhealthy
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:12:48 np0005486759.ooo.test podman[128415]: 2025-10-14 09:12:48.357697744 +0000 UTC m=+0.184691738 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-qdrouterd, release=1, distribution-scope=public, tcib_managed=true, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, vcs-type=git)
Oct 14 09:12:48 np0005486759.ooo.test podman[128412]: 2025-10-14 09:12:48.232189145 +0000 UTC m=+0.068358154 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.9, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-cron, release=1, batch=17.1_20250721.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:12:48 np0005486759.ooo.test podman[128412]: 2025-10-14 09:12:48.41840921 +0000 UTC m=+0.254578239 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, release=1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., version=17.1.9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:12:48 np0005486759.ooo.test podman[128415]: 2025-10-14 09:12:48.610528067 +0000 UTC m=+0.437522011 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, release=1, config_id=tripleo_step1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, io.buildah.version=1.33.12, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59)
Oct 14 09:12:48 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:12:48 np0005486759.ooo.test sudo[128410]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:48 np0005486759.ooo.test sudo[128506]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5l1v1r4h/privsep.sock
Oct 14 09:12:48 np0005486759.ooo.test sudo[128506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:49 np0005486759.ooo.test sudo[128506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:49 np0005486759.ooo.test sudo[128517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1119d_nz/privsep.sock
Oct 14 09:12:49 np0005486759.ooo.test sudo[128517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:50 np0005486759.ooo.test sudo[128517]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:50 np0005486759.ooo.test sudo[128528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpto9uq2xw/privsep.sock
Oct 14 09:12:50 np0005486759.ooo.test sudo[128528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:51 np0005486759.ooo.test sudo[128528]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:51 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:12:51 np0005486759.ooo.test recover_tripleo_nova_virtqemud[128535]: 47951
Oct 14 09:12:51 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:12:51 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:12:51 np0005486759.ooo.test sudo[128541]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps91w_155/privsep.sock
Oct 14 09:12:51 np0005486759.ooo.test sudo[128541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:52 np0005486759.ooo.test sudo[128541]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:52 np0005486759.ooo.test sudo[128558]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphj983ogt/privsep.sock
Oct 14 09:12:52 np0005486759.ooo.test sudo[128558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:52 np0005486759.ooo.test sudo[128558]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:12:53 np0005486759.ooo.test podman[128566]: 2025-10-14 09:12:53.089928237 +0000 UTC m=+0.069761637 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, release=1, container_name=ovn_controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, managed_by=tripleo_ansible, batch=17.1_20250721.1, distribution-scope=public, io.buildah.version=1.33.12)
Oct 14 09:12:53 np0005486759.ooo.test podman[128566]: 2025-10-14 09:12:53.100484735 +0000 UTC m=+0.080318145 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 09:12:53 np0005486759.ooo.test podman[128566]: unhealthy
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:12:53 np0005486759.ooo.test podman[128565]: 2025-10-14 09:12:53.153042678 +0000 UTC m=+0.133528789 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-07-21T16:28:53, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 09:12:53 np0005486759.ooo.test podman[128565]: 2025-10-14 09:12:53.193420172 +0000 UTC m=+0.173906233 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, batch=17.1_20250721.1, distribution-scope=public, build-date=2025-07-21T16:28:53, release=1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:12:53 np0005486759.ooo.test podman[128565]: unhealthy
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:12:53 np0005486759.ooo.test podman[128564]: 2025-10-14 09:12:53.204795095 +0000 UTC m=+0.186569815 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, version=17.1.9)
Oct 14 09:12:53 np0005486759.ooo.test sudo[128627]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9i9kkqen/privsep.sock
Oct 14 09:12:53 np0005486759.ooo.test sudo[128627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:53 np0005486759.ooo.test podman[128564]: 2025-10-14 09:12:53.521193173 +0000 UTC m=+0.502967883 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=17.1.9)
Oct 14 09:12:53 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:12:53 np0005486759.ooo.test sudo[128627]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:54 np0005486759.ooo.test sudo[128639]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuoy01f_x/privsep.sock
Oct 14 09:12:54 np0005486759.ooo.test sudo[128639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:54 np0005486759.ooo.test sudo[128639]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:54 np0005486759.ooo.test sudo[128650]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpyitl85_h/privsep.sock
Oct 14 09:12:54 np0005486759.ooo.test sudo[128650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:55 np0005486759.ooo.test sudo[128650]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:55 np0005486759.ooo.test sudo[128661]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjyiy7eo3/privsep.sock
Oct 14 09:12:55 np0005486759.ooo.test sudo[128661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:56 np0005486759.ooo.test sudo[128661]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:56 np0005486759.ooo.test sudo[128672]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpexehnqo6/privsep.sock
Oct 14 09:12:56 np0005486759.ooo.test sudo[128672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:57 np0005486759.ooo.test sudo[128672]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:57 np0005486759.ooo.test sudo[128689]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpklwem0or/privsep.sock
Oct 14 09:12:57 np0005486759.ooo.test sudo[128689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:58 np0005486759.ooo.test sudo[128689]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:58 np0005486759.ooo.test sudo[128700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaet2ot4x/privsep.sock
Oct 14 09:12:58 np0005486759.ooo.test sudo[128700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:12:59 np0005486759.ooo.test sudo[128700]: pam_unix(sudo:session): session closed for user root
Oct 14 09:12:59 np0005486759.ooo.test sudo[128711]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_7wdjrac/privsep.sock
Oct 14 09:12:59 np0005486759.ooo.test sudo[128711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:00 np0005486759.ooo.test sudo[128711]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:00 np0005486759.ooo.test sudo[128722]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpprg60yxm/privsep.sock
Oct 14 09:13:00 np0005486759.ooo.test sudo[128722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:00 np0005486759.ooo.test sudo[128722]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:01 np0005486759.ooo.test sudo[128733]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_ekl26id/privsep.sock
Oct 14 09:13:01 np0005486759.ooo.test sudo[128733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:01 np0005486759.ooo.test sudo[128733]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:01 np0005486759.ooo.test sudo[128744]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqncq5p1n/privsep.sock
Oct 14 09:13:01 np0005486759.ooo.test sudo[128744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:02 np0005486759.ooo.test sudo[128744]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:02 np0005486759.ooo.test sudo[128758]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc0ldpiww/privsep.sock
Oct 14 09:13:02 np0005486759.ooo.test sudo[128758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:03 np0005486759.ooo.test sudo[128758]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:03 np0005486759.ooo.test sudo[128772]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnxejae2w/privsep.sock
Oct 14 09:13:03 np0005486759.ooo.test sudo[128772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:04 np0005486759.ooo.test sudo[128772]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:04 np0005486759.ooo.test sudo[128783]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpivbnp5yy/privsep.sock
Oct 14 09:13:04 np0005486759.ooo.test sudo[128783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:05 np0005486759.ooo.test sudo[128783]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:05 np0005486759.ooo.test sudo[128794]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptu6ep8ke/privsep.sock
Oct 14 09:13:05 np0005486759.ooo.test sudo[128794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:05 np0005486759.ooo.test sudo[128794]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:06 np0005486759.ooo.test sudo[128805]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2qm5vl_1/privsep.sock
Oct 14 09:13:06 np0005486759.ooo.test sudo[128805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:06 np0005486759.ooo.test sudo[128805]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:06 np0005486759.ooo.test sudo[128816]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph76dv2ru/privsep.sock
Oct 14 09:13:07 np0005486759.ooo.test sudo[128816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:07 np0005486759.ooo.test sudo[128816]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:07 np0005486759.ooo.test sudo[128827]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgd_5arti/privsep.sock
Oct 14 09:13:07 np0005486759.ooo.test sudo[128827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:08 np0005486759.ooo.test sudo[128827]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:13:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:13:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:13:08 np0005486759.ooo.test podman[128840]: 2025-10-14 09:13:08.596007481 +0000 UTC m=+0.068592582 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, release=2, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-07-21T13:04:03, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, architecture=x86_64, container_name=collectd, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git)
Oct 14 09:13:08 np0005486759.ooo.test systemd[1]: tmp-crun.Txk5nG.mount: Deactivated successfully.
Oct 14 09:13:08 np0005486759.ooo.test podman[128839]: 2025-10-14 09:13:08.672127526 +0000 UTC m=+0.144798029 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-07-21T13:27:15, vcs-type=git, batch=17.1_20250721.1, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9)
Oct 14 09:13:08 np0005486759.ooo.test podman[128839]: 2025-10-14 09:13:08.712545251 +0000 UTC m=+0.185215834 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, version=17.1.9, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, build-date=2025-07-21T13:27:15, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible)
Oct 14 09:13:08 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:13:08 np0005486759.ooo.test podman[128867]: 2025-10-14 09:13:08.726947648 +0000 UTC m=+0.122624589 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vcs-type=git, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, tcib_managed=true, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Oct 14 09:13:08 np0005486759.ooo.test podman[128867]: 2025-10-14 09:13:08.756483656 +0000 UTC m=+0.152160567 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.buildah.version=1.33.12, batch=17.1_20250721.1, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, architecture=x86_64, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Oct 14 09:13:08 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:13:08 np0005486759.ooo.test sudo[128906]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp92shym7q/privsep.sock
Oct 14 09:13:08 np0005486759.ooo.test sudo[128906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:08 np0005486759.ooo.test podman[128840]: 2025-10-14 09:13:08.779315865 +0000 UTC m=+0.251900996 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, version=17.1.9, release=2, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.33.12, build-date=2025-07-21T13:04:03, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible)
Oct 14 09:13:08 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:13:09 np0005486759.ooo.test sudo[128906]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:09 np0005486759.ooo.test sudo[128917]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzo26jy3v/privsep.sock
Oct 14 09:13:09 np0005486759.ooo.test sudo[128917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:10 np0005486759.ooo.test sudo[128917]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:10 np0005486759.ooo.test sudo[128928]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps2zdqaej/privsep.sock
Oct 14 09:13:10 np0005486759.ooo.test sudo[128928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:11 np0005486759.ooo.test sudo[128928]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:11 np0005486759.ooo.test sudo[128939]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg2upuh6_/privsep.sock
Oct 14 09:13:11 np0005486759.ooo.test sudo[128939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:12 np0005486759.ooo.test sudo[128939]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:12 np0005486759.ooo.test sudo[128950]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpry61xgef/privsep.sock
Oct 14 09:13:12 np0005486759.ooo.test sudo[128950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:12 np0005486759.ooo.test sudo[128950]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:13 np0005486759.ooo.test sudo[128963]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiymn906q/privsep.sock
Oct 14 09:13:13 np0005486759.ooo.test sudo[128963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:13 np0005486759.ooo.test sudo[128963]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:14 np0005486759.ooo.test sudo[128978]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp41c1qfyj/privsep.sock
Oct 14 09:13:14 np0005486759.ooo.test sudo[128978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:14 np0005486759.ooo.test sudo[128978]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:15 np0005486759.ooo.test sudo[128989]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfi223a2e/privsep.sock
Oct 14 09:13:15 np0005486759.ooo.test sudo[128989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:15 np0005486759.ooo.test sudo[128989]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:15 np0005486759.ooo.test sudo[129000]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp136kr119/privsep.sock
Oct 14 09:13:15 np0005486759.ooo.test sudo[129000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:16 np0005486759.ooo.test sudo[129000]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:16 np0005486759.ooo.test sudo[129011]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7r1o_z1p/privsep.sock
Oct 14 09:13:16 np0005486759.ooo.test sudo[129011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:17 np0005486759.ooo.test sudo[129011]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:17 np0005486759.ooo.test sudo[129022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2och24e5/privsep.sock
Oct 14 09:13:17 np0005486759.ooo.test sudo[129022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:18 np0005486759.ooo.test sudo[129022]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: tmp-crun.XWYCbb.mount: Deactivated successfully.
Oct 14 09:13:18 np0005486759.ooo.test podman[129029]: 2025-10-14 09:13:18.459481171 +0000 UTC m=+0.080787539 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, build-date=2025-07-21T15:29:47)
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:13:18 np0005486759.ooo.test podman[129029]: 2025-10-14 09:13:18.510437355 +0000 UTC m=+0.131743743 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, release=1, tcib_managed=true, version=17.1.9, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:13:18 np0005486759.ooo.test podman[129029]: unhealthy
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:13:18 np0005486759.ooo.test sudo[129077]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1f72g3iy/privsep.sock
Oct 14 09:13:18 np0005486759.ooo.test sudo[129077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:18 np0005486759.ooo.test podman[129058]: 2025-10-14 09:13:18.558654982 +0000 UTC m=+0.072899795 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.buildah.version=1.33.12, release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, architecture=x86_64, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, build-date=2025-07-21T13:07:52, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:13:18 np0005486759.ooo.test podman[129058]: 2025-10-14 09:13:18.562439 +0000 UTC m=+0.076683743 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, tcib_managed=true, com.redhat.component=openstack-cron-container)
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:13:18 np0005486759.ooo.test podman[129028]: 2025-10-14 09:13:18.516308737 +0000 UTC m=+0.141612030 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, version=17.1.9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1, batch=17.1_20250721.1, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, build-date=2025-07-21T14:45:33, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Oct 14 09:13:18 np0005486759.ooo.test podman[129028]: 2025-10-14 09:13:18.645996195 +0000 UTC m=+0.271299498 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, version=17.1.9, io.openshift.expose-services=, vcs-type=git, 
com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, tcib_managed=true)
Oct 14 09:13:18 np0005486759.ooo.test podman[129028]: unhealthy
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:13:18 np0005486759.ooo.test podman[129094]: 2025-10-14 09:13:18.762155694 +0000 UTC m=+0.081611957 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.9, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, architecture=x86_64, name=rhosp17/openstack-qdrouterd, 
io.openshift.expose-services=, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, config_id=tripleo_step1, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 09:13:18 np0005486759.ooo.test podman[129094]: 2025-10-14 09:13:18.952292429 +0000 UTC m=+0.271748662 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:07:59, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed)
Oct 14 09:13:18 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:13:19 np0005486759.ooo.test sudo[129077]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:19 np0005486759.ooo.test sudo[129136]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1awx79kw/privsep.sock
Oct 14 09:13:19 np0005486759.ooo.test sudo[129136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:20 np0005486759.ooo.test sudo[129136]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:20 np0005486759.ooo.test sudo[129147]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0q17_nav/privsep.sock
Oct 14 09:13:20 np0005486759.ooo.test sudo[129147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:20 np0005486759.ooo.test sudo[129147]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:21 np0005486759.ooo.test sudo[129158]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmqmjlxsm/privsep.sock
Oct 14 09:13:21 np0005486759.ooo.test sudo[129158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:21 np0005486759.ooo.test sudo[129158]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:22 np0005486759.ooo.test sudo[129169]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqv0vp24_/privsep.sock
Oct 14 09:13:22 np0005486759.ooo.test sudo[129169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:22 np0005486759.ooo.test sudo[129169]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:22 np0005486759.ooo.test sudo[129180]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxei757dh/privsep.sock
Oct 14 09:13:22 np0005486759.ooo.test sudo[129180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: tmp-crun.yGG8co.mount: Deactivated successfully.
Oct 14 09:13:23 np0005486759.ooo.test podman[129184]: 2025-10-14 09:13:23.455275282 +0000 UTC m=+0.079291773 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.9, build-date=2025-07-21T13:28:44, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Oct 14 09:13:23 np0005486759.ooo.test podman[129184]: 2025-10-14 09:13:23.495419279 +0000 UTC m=+0.119435790 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, build-date=2025-07-21T13:28:44, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_controller, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:13:23 np0005486759.ooo.test podman[129184]: unhealthy
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:13:23 np0005486759.ooo.test podman[129183]: 2025-10-14 09:13:23.514313686 +0000 UTC m=+0.139921597 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, batch=17.1_20250721.1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vendor=Red Hat, 
Inc., build-date=2025-07-21T16:28:53, config_id=tripleo_step4, io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team)
Oct 14 09:13:23 np0005486759.ooo.test podman[129183]: 2025-10-14 09:13:23.530294603 +0000 UTC m=+0.155902504 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, config_id=tripleo_step4, io.buildah.version=1.33.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.9)
Oct 14 09:13:23 np0005486759.ooo.test podman[129183]: unhealthy
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:13:23 np0005486759.ooo.test podman[129221]: 2025-10-14 09:13:23.616405427 +0000 UTC m=+0.060056846 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, architecture=x86_64, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, config_id=tripleo_step4)
Oct 14 09:13:23 np0005486759.ooo.test sudo[129180]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:23 np0005486759.ooo.test sudo[129251]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvioh026z/privsep.sock
Oct 14 09:13:23 np0005486759.ooo.test sudo[129251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:23 np0005486759.ooo.test podman[129221]: 2025-10-14 09:13:23.9503146 +0000 UTC m=+0.393966029 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, distribution-scope=public, build-date=2025-07-21T14:48:37, maintainer=OpenStack TripleO Team)
Oct 14 09:13:23 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:13:24 np0005486759.ooo.test systemd[1]: tmp-crun.dHssvb.mount: Deactivated successfully.
Oct 14 09:13:24 np0005486759.ooo.test sudo[129251]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:24 np0005486759.ooo.test sudo[129268]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphz4yja3h/privsep.sock
Oct 14 09:13:24 np0005486759.ooo.test sudo[129268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:25 np0005486759.ooo.test sudo[129268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:25 np0005486759.ooo.test sudo[129279]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpawealp6g/privsep.sock
Oct 14 09:13:25 np0005486759.ooo.test sudo[129279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:26 np0005486759.ooo.test sudo[129279]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:26 np0005486759.ooo.test sudo[129290]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpic0di2qn/privsep.sock
Oct 14 09:13:26 np0005486759.ooo.test sudo[129290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:27 np0005486759.ooo.test sudo[129290]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:27 np0005486759.ooo.test sudo[129301]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptqi82xgj/privsep.sock
Oct 14 09:13:27 np0005486759.ooo.test sudo[129301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:27 np0005486759.ooo.test sudo[129301]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:28 np0005486759.ooo.test sudo[129312]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps03nrda3/privsep.sock
Oct 14 09:13:28 np0005486759.ooo.test sudo[129312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:28 np0005486759.ooo.test sudo[129312]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:28 np0005486759.ooo.test sudo[129323]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprui6lb0m/privsep.sock
Oct 14 09:13:28 np0005486759.ooo.test sudo[129323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:29 np0005486759.ooo.test sudo[129323]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:29 np0005486759.ooo.test sudo[129340]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmph06kuon_/privsep.sock
Oct 14 09:13:29 np0005486759.ooo.test sudo[129340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:30 np0005486759.ooo.test sudo[129340]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:30 np0005486759.ooo.test sudo[129351]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjrcmh263/privsep.sock
Oct 14 09:13:30 np0005486759.ooo.test sudo[129351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:31 np0005486759.ooo.test sudo[129351]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:31 np0005486759.ooo.test sudo[129362]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpr2et91_v/privsep.sock
Oct 14 09:13:31 np0005486759.ooo.test sudo[129362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:32 np0005486759.ooo.test sudo[129362]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:32 np0005486759.ooo.test sudo[129373]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptad13pyd/privsep.sock
Oct 14 09:13:32 np0005486759.ooo.test sudo[129373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:32 np0005486759.ooo.test sudo[129373]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:33 np0005486759.ooo.test sudo[129384]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxvnvot5u/privsep.sock
Oct 14 09:13:33 np0005486759.ooo.test sudo[129384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:33 np0005486759.ooo.test sudo[129384]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:33 np0005486759.ooo.test sudo[129395]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpxqhgmy1y/privsep.sock
Oct 14 09:13:33 np0005486759.ooo.test sudo[129395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:34 np0005486759.ooo.test sudo[129395]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:34 np0005486759.ooo.test sudo[129406]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwkitqi8c/privsep.sock
Oct 14 09:13:34 np0005486759.ooo.test sudo[129406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:35 np0005486759.ooo.test sudo[129406]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:35 np0005486759.ooo.test sudo[129423]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpkwiaotk0/privsep.sock
Oct 14 09:13:35 np0005486759.ooo.test sudo[129423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:36 np0005486759.ooo.test sudo[129423]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:36 np0005486759.ooo.test sudo[129434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpsaokptlr/privsep.sock
Oct 14 09:13:36 np0005486759.ooo.test sudo[129434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:37 np0005486759.ooo.test sudo[129434]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:37 np0005486759.ooo.test sudo[129445]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp7mz_0pr_/privsep.sock
Oct 14 09:13:37 np0005486759.ooo.test sudo[129445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:37 np0005486759.ooo.test sudo[129445]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:38 np0005486759.ooo.test sudo[129456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl0btllnu/privsep.sock
Oct 14 09:13:38 np0005486759.ooo.test sudo[129456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:38 np0005486759.ooo.test sudo[129456]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:13:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:13:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:13:38 np0005486759.ooo.test podman[129462]: 2025-10-14 09:13:38.974320748 +0000 UTC m=+0.074010200 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20250721.1, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Oct 14 09:13:39 np0005486759.ooo.test podman[129463]: 2025-10-14 09:13:39.041142453 +0000 UTC m=+0.136814250 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step5, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vendor=Red Hat, Inc., tcib_managed=true)
Oct 14 09:13:39 np0005486759.ooo.test podman[129463]: 2025-10-14 09:13:39.073260471 +0000 UTC m=+0.168932248 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.9, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1, tcib_managed=true, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 09:13:39 np0005486759.ooo.test systemd[1]: tmp-crun.EdxjTO.mount: Deactivated successfully.
Oct 14 09:13:39 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:13:39 np0005486759.ooo.test podman[129464]: 2025-10-14 09:13:39.094599013 +0000 UTC m=+0.187273757 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-collectd-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=2, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd, io.buildah.version=1.33.12, version=17.1.9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, config_id=tripleo_step3, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Oct 14 09:13:39 np0005486759.ooo.test podman[129464]: 2025-10-14 09:13:39.102174749 +0000 UTC m=+0.194849523 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.9, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, release=2, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, build-date=2025-07-21T13:04:03, distribution-scope=public)
Oct 14 09:13:39 np0005486759.ooo.test podman[129462]: 2025-10-14 09:13:39.113258203 +0000 UTC m=+0.212947685 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-type=git, container_name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9)
Oct 14 09:13:39 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:13:39 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:13:39 np0005486759.ooo.test sudo[129529]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqg9pkr43/privsep.sock
Oct 14 09:13:39 np0005486759.ooo.test sudo[129529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:39 np0005486759.ooo.test sudo[129529]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:39 np0005486759.ooo.test sudo[129540]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpuffwks3r/privsep.sock
Oct 14 09:13:39 np0005486759.ooo.test sudo[129540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:40 np0005486759.ooo.test sudo[129540]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:40 np0005486759.ooo.test sudo[129557]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf6w_m5k7/privsep.sock
Oct 14 09:13:40 np0005486759.ooo.test sudo[129557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:41 np0005486759.ooo.test sudo[129557]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:41 np0005486759.ooo.test sudo[129568]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpagna_dft/privsep.sock
Oct 14 09:13:41 np0005486759.ooo.test sudo[129568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:42 np0005486759.ooo.test sudo[129568]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:42 np0005486759.ooo.test sudo[129579]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9upfwsy4/privsep.sock
Oct 14 09:13:42 np0005486759.ooo.test sudo[129579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:43 np0005486759.ooo.test sudo[129579]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:43 np0005486759.ooo.test sudo[129590]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_x2muwfa/privsep.sock
Oct 14 09:13:43 np0005486759.ooo.test sudo[129590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:43 np0005486759.ooo.test sudo[129590]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:44 np0005486759.ooo.test sudo[129601]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpeebvkwxy/privsep.sock
Oct 14 09:13:44 np0005486759.ooo.test sudo[129601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:44 np0005486759.ooo.test sudo[129601]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:45 np0005486759.ooo.test sudo[129612]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmxc3yizg/privsep.sock
Oct 14 09:13:45 np0005486759.ooo.test sudo[129612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:45 np0005486759.ooo.test sudo[129612]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:45 np0005486759.ooo.test sudo[129629]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6__8tar7/privsep.sock
Oct 14 09:13:45 np0005486759.ooo.test sudo[129629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:46 np0005486759.ooo.test sudo[129629]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:46 np0005486759.ooo.test sudo[129640]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg5n9p1vk/privsep.sock
Oct 14 09:13:46 np0005486759.ooo.test sudo[129640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:47 np0005486759.ooo.test sudo[129640]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:47 np0005486759.ooo.test sudo[129651]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1538u8bn/privsep.sock
Oct 14 09:13:47 np0005486759.ooo.test sudo[129651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:48 np0005486759.ooo.test sudo[129651]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:48 np0005486759.ooo.test sudo[129662]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaixk8hh8/privsep.sock
Oct 14 09:13:48 np0005486759.ooo.test sudo[129662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: tmp-crun.p5skvw.mount: Deactivated successfully.
Oct 14 09:13:48 np0005486759.ooo.test podman[129665]: 2025-10-14 09:13:48.715628324 +0000 UTC m=+0.064330909 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-07-21T15:29:47, distribution-scope=public, version=17.1.9, release=1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, managed_by=tripleo_ansible)
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:13:48 np0005486759.ooo.test podman[129664]: 2025-10-14 09:13:48.749757524 +0000 UTC m=+0.098826980 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, 
config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20250721.1, managed_by=tripleo_ansible, release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:13:48 np0005486759.ooo.test podman[129664]: 2025-10-14 09:13:48.764564974 +0000 UTC m=+0.113634480 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-cron-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.buildah.version=1.33.12, build-date=2025-07-21T13:07:52, container_name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:13:48 np0005486759.ooo.test podman[129665]: 2025-10-14 09:13:48.802120841 +0000 UTC m=+0.150823406 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20250721.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, build-date=2025-07-21T15:29:47, io.openshift.expose-services=, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:13:48 np0005486759.ooo.test podman[129665]: unhealthy
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:13:48 np0005486759.ooo.test podman[129694]: 2025-10-14 09:13:48.81626896 +0000 UTC m=+0.072009138 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, release=1, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64)
Oct 14 09:13:48 np0005486759.ooo.test podman[129694]: 2025-10-14 09:13:48.824086823 +0000 UTC m=+0.079826991 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, release=1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.9, 
name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 09:13:48 np0005486759.ooo.test podman[129694]: unhealthy
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:48 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:13:49 np0005486759.ooo.test sudo[129662]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:13:49 np0005486759.ooo.test podman[129726]: 2025-10-14 09:13:49.289037146 +0000 UTC m=+0.055770644 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, vcs-type=git, build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, config_id=tripleo_step1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, container_name=metrics_qdr, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Oct 14 09:13:49 np0005486759.ooo.test podman[129726]: 2025-10-14 09:13:49.473188975 +0000 UTC m=+0.239922513 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., release=1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:07:59, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.component=openstack-qdrouterd-container, version=17.1.9)
Oct 14 09:13:49 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:13:49 np0005486759.ooo.test sudo[129761]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpl00z5gci/privsep.sock
Oct 14 09:13:49 np0005486759.ooo.test sudo[129761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:50 np0005486759.ooo.test sudo[129761]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:50 np0005486759.ooo.test sudo[129772]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqy9g3row/privsep.sock
Oct 14 09:13:50 np0005486759.ooo.test sudo[129772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:50 np0005486759.ooo.test sudo[129772]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:51 np0005486759.ooo.test sudo[129788]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9411gcuo/privsep.sock
Oct 14 09:13:51 np0005486759.ooo.test sudo[129788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:51 np0005486759.ooo.test sudo[129788]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:52 np0005486759.ooo.test sudo[129800]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcpwc5tfq/privsep.sock
Oct 14 09:13:52 np0005486759.ooo.test sudo[129800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:52 np0005486759.ooo.test sudo[129800]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:53 np0005486759.ooo.test sudo[129811]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_qcgvcom/privsep.sock
Oct 14 09:13:53 np0005486759.ooo.test sudo[129811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:53 np0005486759.ooo.test sudo[129811]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:13:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:13:53 np0005486759.ooo.test podman[129818]: 2025-10-14 09:13:53.666140178 +0000 UTC m=+0.055005920 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20250721.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, version=17.1.9, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:13:53 np0005486759.ooo.test podman[129818]: 2025-10-14 09:13:53.677318985 +0000 UTC m=+0.066184647 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.33.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:13:53 np0005486759.ooo.test podman[129818]: unhealthy
Oct 14 09:13:53 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:53 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:13:53 np0005486759.ooo.test podman[129817]: 2025-10-14 09:13:53.728082482 +0000 UTC m=+0.118381938 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., version=17.1.9, io.openshift.expose-services=, release=1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-07-21T16:28:53, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:13:53 np0005486759.ooo.test podman[129817]: 2025-10-14 09:13:53.744217043 +0000 UTC m=+0.134516459 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, version=17.1.9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, release=1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:13:53 np0005486759.ooo.test podman[129817]: unhealthy
Oct 14 09:13:53 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:13:53 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:13:53 np0005486759.ooo.test sudo[129856]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfqaw662n/privsep.sock
Oct 14 09:13:53 np0005486759.ooo.test sudo[129856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:13:54 np0005486759.ooo.test sudo[129856]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:54 np0005486759.ooo.test systemd[1]: tmp-crun.JvM7Y7.mount: Deactivated successfully.
Oct 14 09:13:54 np0005486759.ooo.test podman[129860]: 2025-10-14 09:13:54.445427624 +0000 UTC m=+0.078792618 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, release=1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:13:54 np0005486759.ooo.test sudo[129888]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp88d_wxnr/privsep.sock
Oct 14 09:13:54 np0005486759.ooo.test sudo[129888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:54 np0005486759.ooo.test podman[129860]: 2025-10-14 09:13:54.783426263 +0000 UTC m=+0.416791297 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, io.buildah.version=1.33.12, release=1)
Oct 14 09:13:54 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:13:55 np0005486759.ooo.test sudo[129888]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:55 np0005486759.ooo.test sudo[129901]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbifeyhlr/privsep.sock
Oct 14 09:13:55 np0005486759.ooo.test sudo[129901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:56 np0005486759.ooo.test sudo[129901]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:56 np0005486759.ooo.test sudo[129912]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnuzoewgk/privsep.sock
Oct 14 09:13:56 np0005486759.ooo.test sudo[129912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:56 np0005486759.ooo.test sudo[129912]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:57 np0005486759.ooo.test sudo[129929]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2nko0i0r/privsep.sock
Oct 14 09:13:57 np0005486759.ooo.test sudo[129929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:57 np0005486759.ooo.test sudo[129929]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:58 np0005486759.ooo.test sudo[129940]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpts4c0m7a/privsep.sock
Oct 14 09:13:58 np0005486759.ooo.test sudo[129940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:58 np0005486759.ooo.test sudo[129940]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:58 np0005486759.ooo.test sudo[129951]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp288xv0qf/privsep.sock
Oct 14 09:13:58 np0005486759.ooo.test sudo[129951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:13:59 np0005486759.ooo.test sudo[129951]: pam_unix(sudo:session): session closed for user root
Oct 14 09:13:59 np0005486759.ooo.test sudo[129962]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptvn6l0o7/privsep.sock
Oct 14 09:13:59 np0005486759.ooo.test sudo[129962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:00 np0005486759.ooo.test sudo[129962]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:00 np0005486759.ooo.test sudo[129973]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpz5gbnv25/privsep.sock
Oct 14 09:14:00 np0005486759.ooo.test sudo[129973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:01 np0005486759.ooo.test sudo[129973]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:01 np0005486759.ooo.test sudo[129984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp27zkgdq8/privsep.sock
Oct 14 09:14:01 np0005486759.ooo.test sudo[129984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:01 np0005486759.ooo.test sudo[129984]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:02 np0005486759.ooo.test sudo[130001]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps67ds98x/privsep.sock
Oct 14 09:14:02 np0005486759.ooo.test sudo[130001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:02 np0005486759.ooo.test sudo[130001]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:02 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:14:02 np0005486759.ooo.test recover_tripleo_nova_virtqemud[130008]: 47951
Oct 14 09:14:02 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:14:02 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:14:03 np0005486759.ooo.test sudo[130014]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpiztnk_tl/privsep.sock
Oct 14 09:14:03 np0005486759.ooo.test sudo[130014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:03 np0005486759.ooo.test sudo[130014]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:03 np0005486759.ooo.test sudo[130025]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp99ydyt2b/privsep.sock
Oct 14 09:14:03 np0005486759.ooo.test sudo[130025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:04 np0005486759.ooo.test sudo[130025]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:04 np0005486759.ooo.test sudo[130036]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfjm8vh51/privsep.sock
Oct 14 09:14:04 np0005486759.ooo.test sudo[130036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:05 np0005486759.ooo.test sudo[130036]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:05 np0005486759.ooo.test sudo[130047]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaosmxx92/privsep.sock
Oct 14 09:14:05 np0005486759.ooo.test sudo[130047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:06 np0005486759.ooo.test sudo[130047]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:06 np0005486759.ooo.test sudo[130058]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_slhh174/privsep.sock
Oct 14 09:14:06 np0005486759.ooo.test sudo[130058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:07 np0005486759.ooo.test sudo[130058]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:07 np0005486759.ooo.test sudo[130072]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqoum1jyy/privsep.sock
Oct 14 09:14:07 np0005486759.ooo.test sudo[130072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:07 np0005486759.ooo.test sudo[130072]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:08 np0005486759.ooo.test sudo[130086]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpzbkhwi5u/privsep.sock
Oct 14 09:14:08 np0005486759.ooo.test sudo[130086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:08 np0005486759.ooo.test sudo[130086]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:09 np0005486759.ooo.test sudo[130097]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmps0fxwjxq/privsep.sock
Oct 14 09:14:09 np0005486759.ooo.test sudo[130097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:14:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:14:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:14:09 np0005486759.ooo.test podman[130102]: 2025-10-14 09:14:09.439120661 +0000 UTC m=+0.067492428 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, release=2, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, architecture=x86_64, batch=17.1_20250721.1)
Oct 14 09:14:09 np0005486759.ooo.test podman[130100]: 2025-10-14 09:14:09.450078551 +0000 UTC m=+0.078232191 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Oct 14 09:14:09 np0005486759.ooo.test podman[130102]: 2025-10-14 09:14:09.471195018 +0000 UTC m=+0.099566765 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, name=rhosp17/openstack-collectd, config_id=tripleo_step3, tcib_managed=true, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03)
Oct 14 09:14:09 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:14:09 np0005486759.ooo.test podman[130100]: 2025-10-14 09:14:09.486241085 +0000 UTC m=+0.114394725 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, version=17.1.9, build-date=2025-07-21T13:27:15, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, batch=17.1_20250721.1, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Oct 14 09:14:09 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:14:09 np0005486759.ooo.test podman[130101]: 2025-10-14 09:14:09.55627707 +0000 UTC m=+0.183724597 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.9, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step5, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1)
Oct 14 09:14:09 np0005486759.ooo.test podman[130101]: 2025-10-14 09:14:09.57428258 +0000 UTC m=+0.201730087 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, version=17.1.9, release=1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.33.12)
Oct 14 09:14:09 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:14:09 np0005486759.ooo.test sudo[130097]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:09 np0005486759.ooo.test sudo[130171]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp75t4hj0h/privsep.sock
Oct 14 09:14:09 np0005486759.ooo.test sudo[130171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:10 np0005486759.ooo.test sudo[130171]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:10 np0005486759.ooo.test systemd[1]: tmp-crun.Du3dkA.mount: Deactivated successfully.
Oct 14 09:14:10 np0005486759.ooo.test sudo[130182]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpvuf3pr45/privsep.sock
Oct 14 09:14:10 np0005486759.ooo.test sudo[130182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:11 np0005486759.ooo.test sudo[130182]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:11 np0005486759.ooo.test sudo[130193]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpryqlzh2u/privsep.sock
Oct 14 09:14:11 np0005486759.ooo.test sudo[130193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:11 np0005486759.ooo.test sshd[130196]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:14:11 np0005486759.ooo.test sshd[130196]: Accepted publickey for zuul from 192.168.122.30 port 52882 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:14:11 np0005486759.ooo.test systemd-logind[759]: New session 19 of user zuul.
Oct 14 09:14:11 np0005486759.ooo.test systemd[1]: Started Session 19 of User zuul.
Oct 14 09:14:11 np0005486759.ooo.test sshd[130196]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:14:12 np0005486759.ooo.test sudo[130193]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:12 np0005486759.ooo.test sudo[130281]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmzf2r4gh/privsep.sock
Oct 14 09:14:12 np0005486759.ooo.test sudo[130281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:12 np0005486759.ooo.test sudo[130299]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swvfdahyihqczygatkdygcvpzjeapwfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433252.074131-22-107401483862021/AnsiballZ_stat.py
Oct 14 09:14:12 np0005486759.ooo.test sudo[130299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:12 np0005486759.ooo.test python3.9[130301]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:14:12 np0005486759.ooo.test sudo[130299]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:13 np0005486759.ooo.test sudo[130281]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:13 np0005486759.ooo.test sudo[130382]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcqhojx41/privsep.sock
Oct 14 09:14:13 np0005486759.ooo.test sudo[130382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:13 np0005486759.ooo.test sudo[130410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdznenefrxdcfomeyukmjorvzqawdhrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433252.9436815-34-201573131783818/AnsiballZ_command.py
Oct 14 09:14:13 np0005486759.ooo.test sudo[130410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:13 np0005486759.ooo.test python3.9[130412]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                          _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:14:13 np0005486759.ooo.test sudo[130410]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:14 np0005486759.ooo.test sudo[130382]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:14 np0005486759.ooo.test sudo[130505]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntljhdtvzndnesvuiicjelidvnohlkar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433253.8238292-42-83683258745895/AnsiballZ_stat.py
Oct 14 09:14:14 np0005486759.ooo.test sudo[130505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:14 np0005486759.ooo.test sudo[130515]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0829f76t/privsep.sock
Oct 14 09:14:14 np0005486759.ooo.test sudo[130515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:14 np0005486759.ooo.test python3.9[130509]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:14:14 np0005486759.ooo.test sudo[130505]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:14 np0005486759.ooo.test sudo[130609]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhgbcdxqqqgvgomkuugmjzpdziepnnde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433254.439598-50-16515772934329/AnsiballZ_command.py
Oct 14 09:14:14 np0005486759.ooo.test sudo[130609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:14 np0005486759.ooo.test sudo[130515]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:14 np0005486759.ooo.test python3.9[130611]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                          _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:14:14 np0005486759.ooo.test sudo[130609]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:15 np0005486759.ooo.test sudo[130658]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp2kou2sou/privsep.sock
Oct 14 09:14:15 np0005486759.ooo.test sudo[130658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:15 np0005486759.ooo.test sudo[130713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-catngmqqlsdqcejlmrkahtswcynzfqpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433255.11647-59-76657758087627/AnsiballZ_command.py
Oct 14 09:14:15 np0005486759.ooo.test sudo[130713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:15 np0005486759.ooo.test python3.9[130715]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:14:15 np0005486759.ooo.test sudo[130713]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:15 np0005486759.ooo.test sudo[130658]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:15 np0005486759.ooo.test sudo[130771]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpfl95n17o/privsep.sock
Oct 14 09:14:15 np0005486759.ooo.test sudo[130771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:16 np0005486759.ooo.test python3.9[130817]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 14 09:14:16 np0005486759.ooo.test sudo[130771]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:16 np0005486759.ooo.test sudo[130840]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1dqrrbwy/privsep.sock
Oct 14 09:14:16 np0005486759.ooo.test sudo[130840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:17 np0005486759.ooo.test sudo[130840]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:17 np0005486759.ooo.test python3.9[130918]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:14:17 np0005486759.ooo.test sudo[130944]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpmum51wn2/privsep.sock
Oct 14 09:14:17 np0005486759.ooo.test sudo[130944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:18 np0005486759.ooo.test python3.9[131021]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 14 09:14:18 np0005486759.ooo.test sudo[130944]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:18 np0005486759.ooo.test sudo[131082]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjpo2qmzy/privsep.sock
Oct 14 09:14:18 np0005486759.ooo.test sudo[131082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:19 np0005486759.ooo.test sudo[131082]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:14:19 np0005486759.ooo.test python3.9[131128]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: tmp-crun.jPXYlJ.mount: Deactivated successfully.
Oct 14 09:14:19 np0005486759.ooo.test podman[131134]: 2025-10-14 09:14:19.279045391 +0000 UTC m=+0.086300872 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, version=17.1.9, build-date=2025-07-21T15:29:47, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20250721.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4)
Oct 14 09:14:19 np0005486759.ooo.test podman[131134]: 2025-10-14 09:14:19.315520544 +0000 UTC m=+0.122776015 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-07-21T15:29:47, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Oct 14 09:14:19 np0005486759.ooo.test podman[131134]: unhealthy
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:14:19 np0005486759.ooo.test podman[131132]: 2025-10-14 09:14:19.338225539 +0000 UTC m=+0.149399091 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:07:52, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., batch=17.1_20250721.1, 
com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-cron, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.9)
Oct 14 09:14:19 np0005486759.ooo.test podman[131132]: 2025-10-14 09:14:19.349418587 +0000 UTC m=+0.160592109 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20250721.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, build-date=2025-07-21T13:07:52, io.buildah.version=1.33.12, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, distribution-scope=public, tcib_managed=true)
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:14:19 np0005486759.ooo.test podman[131133]: 2025-10-14 09:14:19.259726711 +0000 UTC m=+0.073263407 container health_status 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.33.12, container_name=ceilometer_agent_compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:14:19 np0005486759.ooo.test podman[131133]: 2025-10-14 09:14:19.394415134 +0000 UTC m=+0.207951840 container exec_died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20250721.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-07-21T14:45:33, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.33.12)
Oct 14 09:14:19 np0005486759.ooo.test podman[131133]: unhealthy
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:14:19 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:14:19 np0005486759.ooo.test sudo[131197]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpy1itd_qd/privsep.sock
Oct 14 09:14:19 np0005486759.ooo.test sudo[131197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:19 np0005486759.ooo.test python3.9[131243]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:14:19 np0005486759.ooo.test sudo[131197]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:14:20 np0005486759.ooo.test podman[131247]: 2025-10-14 09:14:20.083562431 +0000 UTC m=+0.068454437 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, release=1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, 
distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, vendor=Red Hat, Inc.)
Oct 14 09:14:20 np0005486759.ooo.test sudo[131284]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp20v1cuc9/privsep.sock
Oct 14 09:14:20 np0005486759.ooo.test sudo[131284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:20 np0005486759.ooo.test podman[131247]: 2025-10-14 09:14:20.333233146 +0000 UTC m=+0.318125142 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vendor=Red Hat, Inc., 
io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.33.12, vcs-type=git, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, build-date=2025-07-21T13:07:59, version=17.1.9)
Oct 14 09:14:20 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:14:20 np0005486759.ooo.test sshd[130196]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:14:20 np0005486759.ooo.test systemd[1]: session-19.scope: Deactivated successfully.
Oct 14 09:14:20 np0005486759.ooo.test systemd[1]: session-19.scope: Consumed 4.514s CPU time.
Oct 14 09:14:20 np0005486759.ooo.test systemd-logind[759]: Session 19 logged out. Waiting for processes to exit.
Oct 14 09:14:20 np0005486759.ooo.test systemd-logind[759]: Removed session 19.
Oct 14 09:14:20 np0005486759.ooo.test sudo[131284]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:21 np0005486759.ooo.test sudo[131309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpe6_rmi1t/privsep.sock
Oct 14 09:14:21 np0005486759.ooo.test sudo[131309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:21 np0005486759.ooo.test sudo[131309]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:21 np0005486759.ooo.test sudo[131320]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp92i4_agl/privsep.sock
Oct 14 09:14:21 np0005486759.ooo.test sudo[131320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:22 np0005486759.ooo.test sudo[131320]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:22 np0005486759.ooo.test sudo[131331]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp82qd08gy/privsep.sock
Oct 14 09:14:22 np0005486759.ooo.test sudo[131331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:23 np0005486759.ooo.test sudo[131331]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:23 np0005486759.ooo.test sudo[131347]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpnz2hiafm/privsep.sock
Oct 14 09:14:23 np0005486759.ooo.test sudo[131347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:14:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:14:23 np0005486759.ooo.test systemd[1]: tmp-crun.ieE12j.mount: Deactivated successfully.
Oct 14 09:14:23 np0005486759.ooo.test podman[131349]: 2025-10-14 09:14:23.892194846 +0000 UTC m=+0.101622379 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, version=17.1.9, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 09:14:23 np0005486759.ooo.test podman[131351]: 2025-10-14 09:14:23.856083024 +0000 UTC m=+0.067529549 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, release=1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-type=git, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:28:44)
Oct 14 09:14:23 np0005486759.ooo.test podman[131351]: 2025-10-14 09:14:23.938266206 +0000 UTC m=+0.149712741 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.9, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, managed_by=tripleo_ansible)
Oct 14 09:14:23 np0005486759.ooo.test podman[131351]: unhealthy
Oct 14 09:14:23 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:14:23 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:14:23 np0005486759.ooo.test podman[131349]: 2025-10-14 09:14:23.958103222 +0000 UTC m=+0.167530715 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:14:23 np0005486759.ooo.test podman[131349]: unhealthy
Oct 14 09:14:23 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:14:23 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:14:24 np0005486759.ooo.test sudo[131347]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:24 np0005486759.ooo.test sudo[131399]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp91vaoqsi/privsep.sock
Oct 14 09:14:24 np0005486759.ooo.test sudo[131399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:24 np0005486759.ooo.test systemd[1]: tmp-crun.PcsOvv.mount: Deactivated successfully.
Oct 14 09:14:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:14:24 np0005486759.ooo.test systemd[1]: tmp-crun.Rvunjf.mount: Deactivated successfully.
Oct 14 09:14:24 np0005486759.ooo.test podman[131402]: 2025-10-14 09:14:24.935847583 +0000 UTC m=+0.074719521 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, batch=17.1_20250721.1, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, build-date=2025-07-21T14:48:37, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:14:25 np0005486759.ooo.test sudo[131399]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:25 np0005486759.ooo.test podman[131402]: 2025-10-14 09:14:25.310267564 +0000 UTC m=+0.449139532 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.33.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Oct 14 09:14:25 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:14:25 np0005486759.ooo.test sudo[131434]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3533guge/privsep.sock
Oct 14 09:14:25 np0005486759.ooo.test sudo[131434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:26 np0005486759.ooo.test sudo[131434]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:26 np0005486759.ooo.test sudo[131445]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6pozxjzy/privsep.sock
Oct 14 09:14:26 np0005486759.ooo.test sudo[131445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:27 np0005486759.ooo.test sudo[131445]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:27 np0005486759.ooo.test sudo[131456]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp6m3vkiqc/privsep.sock
Oct 14 09:14:27 np0005486759.ooo.test sudo[131456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:27 np0005486759.ooo.test sudo[131456]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:28 np0005486759.ooo.test sudo[131467]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpi2r7891m/privsep.sock
Oct 14 09:14:28 np0005486759.ooo.test sudo[131467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:28 np0005486759.ooo.test sudo[131467]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:28 np0005486759.ooo.test sudo[131480]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbyvosxo9/privsep.sock
Oct 14 09:14:28 np0005486759.ooo.test sudo[131480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:29 np0005486759.ooo.test sudo[131480]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:29 np0005486759.ooo.test sudo[131495]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp5ylmi6nr/privsep.sock
Oct 14 09:14:29 np0005486759.ooo.test sudo[131495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:30 np0005486759.ooo.test sudo[131495]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:30 np0005486759.ooo.test sudo[131506]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpodz0xxc5/privsep.sock
Oct 14 09:14:30 np0005486759.ooo.test sudo[131506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:31 np0005486759.ooo.test sudo[131506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:31 np0005486759.ooo.test sudo[131517]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4l5xqev3/privsep.sock
Oct 14 09:14:31 np0005486759.ooo.test sudo[131517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:32 np0005486759.ooo.test sudo[131517]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:32 np0005486759.ooo.test sudo[131528]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc3s85_nw/privsep.sock
Oct 14 09:14:32 np0005486759.ooo.test sudo[131528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:32 np0005486759.ooo.test sudo[131528]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:33 np0005486759.ooo.test sudo[131539]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpd2cqiywa/privsep.sock
Oct 14 09:14:33 np0005486759.ooo.test sudo[131539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:33 np0005486759.ooo.test sudo[131539]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:33 np0005486759.ooo.test sudo[131550]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpwsq201yg/privsep.sock
Oct 14 09:14:33 np0005486759.ooo.test sudo[131550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16767 DF PROTO=TCP SPT=46788 DPT=9100 SEQ=226305877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76BB970000000001030307) 
Oct 14 09:14:34 np0005486759.ooo.test sudo[131550]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:34 np0005486759.ooo.test sudo[131567]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpbtoo6x64/privsep.sock
Oct 14 09:14:34 np0005486759.ooo.test sudo[131567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16768 DF PROTO=TCP SPT=46788 DPT=9100 SEQ=226305877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76BF810000000001030307) 
Oct 14 09:14:35 np0005486759.ooo.test sudo[131567]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:35 np0005486759.ooo.test sudo[131578]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9j4z1hei/privsep.sock
Oct 14 09:14:35 np0005486759.ooo.test sudo[131578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:36 np0005486759.ooo.test sudo[131578]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:36 np0005486759.ooo.test sudo[131589]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp9zospfyt/privsep.sock
Oct 14 09:14:36 np0005486759.ooo.test sudo[131589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:36 np0005486759.ooo.test sshd[131590]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:14:36 np0005486759.ooo.test sshd[131590]: Accepted publickey for zuul from 192.168.122.31 port 38466 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:14:36 np0005486759.ooo.test systemd-logind[759]: New session 20 of user zuul.
Oct 14 09:14:36 np0005486759.ooo.test systemd[1]: Started Session 20 of User zuul.
Oct 14 09:14:36 np0005486759.ooo.test sshd[131590]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:14:37 np0005486759.ooo.test sudo[131589]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16769 DF PROTO=TCP SPT=46788 DPT=9100 SEQ=226305877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76C7810000000001030307) 
Oct 14 09:14:37 np0005486759.ooo.test sudo[131690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpqulemqbwribhmpmjxteeysdwckensb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433276.8594956-21-173572302449926/AnsiballZ_systemd_service.py
Oct 14 09:14:37 np0005486759.ooo.test sudo[131690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:37 np0005486759.ooo.test sudo[131696]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp860qwwbv/privsep.sock
Oct 14 09:14:37 np0005486759.ooo.test sudo[131696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:37 np0005486759.ooo.test python3.9[131694]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:14:37 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:14:37 np0005486759.ooo.test systemd-rc-local-generator[131725]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:14:37 np0005486759.ooo.test systemd-sysv-generator[131728]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:14:37 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:14:38 np0005486759.ooo.test sudo[131690]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:38 np0005486759.ooo.test sudo[131696]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:38 np0005486759.ooo.test sudo[131789]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpn56iqsm4/privsep.sock
Oct 14 09:14:38 np0005486759.ooo.test sudo[131789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:38 np0005486759.ooo.test python3.9[131835]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:14:38 np0005486759.ooo.test network[131852]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:14:38 np0005486759.ooo.test network[131853]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:14:38 np0005486759.ooo.test network[131854]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:14:38 np0005486759.ooo.test sudo[131789]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:39 np0005486759.ooo.test sudo[131868]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp3gwsdc81/privsep.sock
Oct 14 09:14:39 np0005486759.ooo.test sudo[131868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:14:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:14:39 np0005486759.ooo.test podman[131882]: 2025-10-14 09:14:39.579018922 +0000 UTC m=+0.067249979 container health_status b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.33.12, release=2, container_name=collectd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, vendor=Red Hat, Inc., build-date=2025-07-21T13:04:03, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
version=17.1.9, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, batch=17.1_20250721.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Oct 14 09:14:39 np0005486759.ooo.test podman[131882]: 2025-10-14 09:14:39.589292382 +0000 UTC m=+0.077523439 container exec_died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, version=17.1.9, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:04:03, batch=17.1_20250721.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true)
Oct 14 09:14:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:14:39 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Deactivated successfully.
Oct 14 09:14:39 np0005486759.ooo.test podman[131884]: 2025-10-14 09:14:39.633075042 +0000 UTC m=+0.119427691 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, build-date=2025-07-21T13:27:15, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, release=1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9)
Oct 14 09:14:39 np0005486759.ooo.test podman[131884]: 2025-10-14 09:14:39.642326969 +0000 UTC m=+0.128679688 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-07-21T13:27:15, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., version=17.1.9, name=rhosp17/openstack-iscsid, release=1, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64)
Oct 14 09:14:39 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:14:39 np0005486759.ooo.test podman[131917]: 2025-10-14 09:14:39.729530438 +0000 UTC m=+0.114371004 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible)
Oct 14 09:14:39 np0005486759.ooo.test sudo[131868]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:39 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:14:39 np0005486759.ooo.test podman[131917]: 2025-10-14 09:14:39.817415817 +0000 UTC m=+0.202256413 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, architecture=x86_64, container_name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:14:39 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:14:40 np0005486759.ooo.test sudo[131996]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp1ow86t05/privsep.sock
Oct 14 09:14:40 np0005486759.ooo.test sudo[131996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:40 np0005486759.ooo.test sudo[131996]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:40 np0005486759.ooo.test sudo[132033]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpugqw5813/privsep.sock
Oct 14 09:14:40 np0005486759.ooo.test sudo[132033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16770 DF PROTO=TCP SPT=46788 DPT=9100 SEQ=226305877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76D7410000000001030307) 
Oct 14 09:14:41 np0005486759.ooo.test sudo[132033]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:41 np0005486759.ooo.test sudo[132065]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp4ezut5b_/privsep.sock
Oct 14 09:14:41 np0005486759.ooo.test sudo[132065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:42 np0005486759.ooo.test sudo[132065]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54565 DF PROTO=TCP SPT=33102 DPT=9882 SEQ=911579509 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76DAE70000000001030307) 
Oct 14 09:14:42 np0005486759.ooo.test python3.9[132164]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:14:42 np0005486759.ooo.test sudo[132173]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpt7ul_qsi/privsep.sock
Oct 14 09:14:42 np0005486759.ooo.test sudo[132173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:42 np0005486759.ooo.test network[132190]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:14:42 np0005486759.ooo.test network[132191]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:14:42 np0005486759.ooo.test network[132192]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:14:43 np0005486759.ooo.test sudo[132173]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:43 np0005486759.ooo.test sudo[132227]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpehrfwynp/privsep.sock
Oct 14 09:14:43 np0005486759.ooo.test sudo[132227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54566 DF PROTO=TCP SPT=33102 DPT=9882 SEQ=911579509 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76DF010000000001030307) 
Oct 14 09:14:43 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:14:43 np0005486759.ooo.test sudo[132227]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:44 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15116 DF PROTO=TCP SPT=52090 DPT=9105 SEQ=41646384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76E19B0000000001030307) 
Oct 14 09:14:44 np0005486759.ooo.test sudo[132298]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpp06898wl/privsep.sock
Oct 14 09:14:44 np0005486759.ooo.test sudo[132298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:44 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40115 DF PROTO=TCP SPT=49212 DPT=9101 SEQ=942358125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76E3550000000001030307) 
Oct 14 09:14:44 np0005486759.ooo.test sudo[132298]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:45 np0005486759.ooo.test sudo[132344]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp37l04eg1/privsep.sock
Oct 14 09:14:45 np0005486759.ooo.test sudo[132344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:45 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15117 DF PROTO=TCP SPT=52090 DPT=9105 SEQ=41646384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76E5C10000000001030307) 
Oct 14 09:14:45 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54567 DF PROTO=TCP SPT=33102 DPT=9882 SEQ=911579509 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76E7010000000001030307) 
Oct 14 09:14:45 np0005486759.ooo.test sudo[132427]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oobzdlexrchgjlgdmwitgzofxnwswrnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433285.1539843-51-129668749303537/AnsiballZ_systemd_service.py
Oct 14 09:14:45 np0005486759.ooo.test sudo[132427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:45 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40116 DF PROTO=TCP SPT=49212 DPT=9101 SEQ=942358125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76E7410000000001030307) 
Oct 14 09:14:45 np0005486759.ooo.test sudo[132344]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:45 np0005486759.ooo.test python3.9[132429]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:14:45 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:14:45 np0005486759.ooo.test sudo[132442]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpf1iyswq0/privsep.sock
Oct 14 09:14:45 np0005486759.ooo.test sudo[132442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:45 np0005486759.ooo.test systemd-rc-local-generator[132465]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:14:45 np0005486759.ooo.test systemd-sysv-generator[132470]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:14:45 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:14:46 np0005486759.ooo.test systemd[1]: Stopping ceilometer_agent_compute container...
Oct 14 09:14:46 np0005486759.ooo.test systemd[1]: tmp-crun.pFgtSo.mount: Deactivated successfully.
Oct 14 09:14:46 np0005486759.ooo.test sudo[132442]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:46 np0005486759.ooo.test sudo[132505]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpqqij07zy/privsep.sock
Oct 14 09:14:46 np0005486759.ooo.test sudo[132505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15118 DF PROTO=TCP SPT=52090 DPT=9105 SEQ=41646384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76EDC10000000001030307) 
Oct 14 09:14:47 np0005486759.ooo.test sudo[132505]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40117 DF PROTO=TCP SPT=49212 DPT=9101 SEQ=942358125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76EF420000000001030307) 
Oct 14 09:14:47 np0005486759.ooo.test sudo[132516]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp_dgoyb1z/privsep.sock
Oct 14 09:14:47 np0005486759.ooo.test sudo[132516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:48 np0005486759.ooo.test sudo[132516]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:48 np0005486759.ooo.test sudo[132527]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpc340riju/privsep.sock
Oct 14 09:14:48 np0005486759.ooo.test sudo[132527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:49 np0005486759.ooo.test sudo[132527]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:49 np0005486759.ooo.test sudo[132538]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm794cwgy/privsep.sock
Oct 14 09:14:49 np0005486759.ooo.test sudo[132538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:14:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54568 DF PROTO=TCP SPT=33102 DPT=9882 SEQ=911579509 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76F6C10000000001030307) 
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: tmp-crun.1EyDKY.mount: Deactivated successfully.
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:14:49 np0005486759.ooo.test podman[132540]: 2025-10-14 09:14:49.479463801 +0000 UTC m=+0.110752261 container health_status 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20250721.1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T15:29:47, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:14:49 np0005486759.ooo.test podman[132541]: 2025-10-14 09:14:49.448114207 +0000 UTC m=+0.077024683 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vcs-type=git, build-date=2025-07-21T13:07:52, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, batch=17.1_20250721.1, architecture=x86_64, com.redhat.component=openstack-cron-container)
Oct 14 09:14:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5180 DF PROTO=TCP SPT=50170 DPT=9102 SEQ=1481676381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76F7170000000001030307) 
Oct 14 09:14:49 np0005486759.ooo.test podman[132566]: Error: container 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c is not running
Oct 14 09:14:49 np0005486759.ooo.test podman[132540]: 2025-10-14 09:14:49.524472549 +0000 UTC m=+0.155760969 container exec_died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Main process exited, code=exited, status=125/n/a
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed with result 'exit-code'.
Oct 14 09:14:49 np0005486759.ooo.test podman[132540]: unhealthy
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed with result 'exit-code'.
Oct 14 09:14:49 np0005486759.ooo.test podman[132541]: 2025-10-14 09:14:49.582622085 +0000 UTC m=+0.211532551 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, batch=17.1_20250721.1, container_name=logrotate_crond, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, release=1, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-07-21T13:07:52, distribution-scope=public)
Oct 14 09:14:49 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:14:49 np0005486759.ooo.test sudo[132538]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:50 np0005486759.ooo.test sudo[132599]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpoxkfng59/privsep.sock
Oct 14 09:14:50 np0005486759.ooo.test sudo[132599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: tmp-crun.PEzr21.mount: Deactivated successfully.
Oct 14 09:14:50 np0005486759.ooo.test podman[132602]: 2025-10-14 09:14:50.466078198 +0000 UTC m=+0.096841760 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., config_id=tripleo_step1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd)
Oct 14 09:14:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5181 DF PROTO=TCP SPT=50170 DPT=9102 SEQ=1481676381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76FB020000000001030307) 
Oct 14 09:14:50 np0005486759.ooo.test podman[132602]: 2025-10-14 09:14:50.673887243 +0000 UTC m=+0.304650825 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.33.12, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr)
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:14:50 np0005486759.ooo.test sudo[132599]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: libpod-48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.scope: Deactivated successfully.
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: libpod-48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.scope: Consumed 4min 9.054s CPU time.
Oct 14 09:14:50 np0005486759.ooo.test podman[132482]: 2025-10-14 09:14:50.871927824 +0000 UTC m=+4.751366039 container died 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-07-21T14:45:33, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.timer: Deactivated successfully.
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed to open /run/systemd/transient/48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: No such file or directory
Oct 14 09:14:50 np0005486759.ooo.test podman[132482]: 2025-10-14 09:14:50.91556644 +0000 UTC m=+4.795004655 container cleanup 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20250721.1, 
vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, build-date=2025-07-21T14:45:33, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.buildah.version=1.33.12)
Oct 14 09:14:50 np0005486759.ooo.test podman[132482]: ceilometer_agent_compute
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.timer: Failed to open /run/systemd/transient/48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.timer: No such file or directory
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed to open /run/systemd/transient/48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: No such file or directory
Oct 14 09:14:50 np0005486759.ooo.test podman[132638]: 2025-10-14 09:14:50.948648347 +0000 UTC m=+0.070846491 container cleanup 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.33.12, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/agreements, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step4, distribution-scope=public, version=17.1.9, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Oct 14 09:14:50 np0005486759.ooo.test systemd[1]: libpod-conmon-48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.scope: Deactivated successfully.
Oct 14 09:14:51 np0005486759.ooo.test sudo[132668]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp8l56h6xl/privsep.sock
Oct 14 09:14:51 np0005486759.ooo.test sudo[132668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:51 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.timer: Failed to open /run/systemd/transient/48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.timer: No such file or directory
Oct 14 09:14:51 np0005486759.ooo.test systemd[1]: 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: Failed to open /run/systemd/transient/48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c.service: No such file or directory
Oct 14 09:14:51 np0005486759.ooo.test podman[132654]: 2025-10-14 09:14:51.030528281 +0000 UTC m=+0.055314010 container cleanup 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20250721.1, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:14:51 np0005486759.ooo.test podman[132654]: ceilometer_agent_compute
Oct 14 09:14:51 np0005486759.ooo.test systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Oct 14 09:14:51 np0005486759.ooo.test systemd[1]: Stopped ceilometer_agent_compute container.
Oct 14 09:14:51 np0005486759.ooo.test sudo[132427]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:51 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15119 DF PROTO=TCP SPT=52090 DPT=9105 SEQ=41646384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76FD820000000001030307) 
Oct 14 09:14:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0f024985d8551ca70e6842fc9e2330bedf170ac29e8faf8031d1077d2a8fc6f8-merged.mount: Deactivated successfully.
Oct 14 09:14:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c-userdata-shm.mount: Deactivated successfully.
Oct 14 09:14:51 np0005486759.ooo.test sudo[132762]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddqtgqtksvtfugnmnasxvbbqngfvscue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433291.177867-51-257837759516809/AnsiballZ_systemd_service.py
Oct 14 09:14:51 np0005486759.ooo.test sudo[132762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:51 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40118 DF PROTO=TCP SPT=49212 DPT=9101 SEQ=942358125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F76FF020000000001030307) 
Oct 14 09:14:51 np0005486759.ooo.test sudo[132668]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:51 np0005486759.ooo.test python3.9[132764]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:14:51 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:14:51 np0005486759.ooo.test sudo[132776]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmphco5zv88/privsep.sock
Oct 14 09:14:51 np0005486759.ooo.test sudo[132776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:14:51 np0005486759.ooo.test systemd-sysv-generator[132803]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:14:51 np0005486759.ooo.test systemd-rc-local-generator[132797]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: Stopping ceilometer_agent_ipmi container...
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: tmp-crun.5crxFH.mount: Deactivated successfully.
Oct 14 09:14:52 np0005486759.ooo.test sudo[132776]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5182 DF PROTO=TCP SPT=50170 DPT=9102 SEQ=1481676381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7703010000000001030307) 
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: libpod-7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.scope: Deactivated successfully.
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: libpod-7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.scope: Consumed 1h 2min 12.182s CPU time.
Oct 14 09:14:52 np0005486759.ooo.test podman[132815]: 2025-10-14 09:14:52.884407047 +0000 UTC m=+0.667471584 container died 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, batch=17.1_20250721.1, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container)
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.timer: Deactivated successfully.
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed to open /run/systemd/transient/7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: No such file or directory
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: tmp-crun.JVQa4X.mount: Deactivated successfully.
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6-userdata-shm.mount: Deactivated successfully.
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d3e85029358dd8d56349cc32f50b1a52c41f0f3e6ed9b12e81f053986ea14f71-merged.mount: Deactivated successfully.
Oct 14 09:14:52 np0005486759.ooo.test podman[132815]: 2025-10-14 09:14:52.961659626 +0000 UTC m=+0.744724123 container cleanup 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.9, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, container_name=ceilometer_agent_ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-07-21T15:29:47, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public)
Oct 14 09:14:52 np0005486759.ooo.test podman[132815]: ceilometer_agent_ipmi
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.timer: Failed to open /run/systemd/transient/7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.timer: No such file or directory
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed to open /run/systemd/transient/7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: No such file or directory
Oct 14 09:14:52 np0005486759.ooo.test podman[132835]: 2025-10-14 09:14:52.978340574 +0000 UTC m=+0.084259388 container cleanup 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.33.12, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-07-21T15:29:47, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.9, config_id=tripleo_step4, 
vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git)
Oct 14 09:14:52 np0005486759.ooo.test systemd[1]: libpod-conmon-7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.scope: Deactivated successfully.
Oct 14 09:14:53 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.timer: Failed to open /run/systemd/transient/7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.timer: No such file or directory
Oct 14 09:14:53 np0005486759.ooo.test systemd[1]: 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: Failed to open /run/systemd/transient/7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6.service: No such file or directory
Oct 14 09:14:53 np0005486759.ooo.test podman[132851]: 2025-10-14 09:14:53.075329137 +0000 UTC m=+0.068755877 container cleanup 7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=fb6ae8bb9cf127a94f881f2787c60d4d2018020f, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, release=1, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T15:29:47, 
com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-ipmi/images/17.1.9-1, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible)
Oct 14 09:14:53 np0005486759.ooo.test podman[132851]: ceilometer_agent_ipmi
Oct 14 09:14:53 np0005486759.ooo.test systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Oct 14 09:14:53 np0005486759.ooo.test systemd[1]: Stopped ceilometer_agent_ipmi container.
Oct 14 09:14:53 np0005486759.ooo.test sudo[132762]: pam_unix(sudo:session): session closed for user root
Oct 14 09:14:53 np0005486759.ooo.test sudo[132951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehwlunpkcbhcpukftskmvlprnownhiug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433293.2598588-51-181613268128610/AnsiballZ_systemd_service.py
Oct 14 09:14:53 np0005486759.ooo.test sudo[132951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:14:53 np0005486759.ooo.test python3.9[132953]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:14:53 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:14:53 np0005486759.ooo.test systemd-rc-local-generator[132982]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:14:53 np0005486759.ooo.test systemd-sysv-generator[132986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: Stopping collectd container...
Oct 14 09:14:54 np0005486759.ooo.test podman[132994]: 2025-10-14 09:14:54.292062901 +0000 UTC m=+0.077884380 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, batch=17.1_20250721.1, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, version=17.1.9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: tmp-crun.Uitmux.mount: Deactivated successfully.
Oct 14 09:14:54 np0005486759.ooo.test podman[132993]: 2025-10-14 09:14:54.400999225 +0000 UTC m=+0.187248078 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 09:14:54 np0005486759.ooo.test podman[132994]: 2025-10-14 09:14:54.43270704 +0000 UTC m=+0.218528799 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, version=17.1.9, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, maintainer=OpenStack TripleO Team, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:14:54 np0005486759.ooo.test podman[132994]: unhealthy
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:14:54 np0005486759.ooo.test podman[132993]: 2025-10-14 09:14:54.445425135 +0000 UTC m=+0.231674058 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, version=17.1.9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1)
Oct 14 09:14:54 np0005486759.ooo.test podman[132993]: unhealthy
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:14:54 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:14:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:14:55 np0005486759.ooo.test podman[133049]: 2025-10-14 09:14:55.449528816 +0000 UTC m=+0.073361161 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, release=1, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, version=17.1.9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.33.12)
Oct 14 09:14:55 np0005486759.ooo.test podman[133049]: 2025-10-14 09:14:55.860280444 +0000 UTC m=+0.484112899 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, version=17.1.9, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true)
Oct 14 09:14:55 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:14:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5183 DF PROTO=TCP SPT=50170 DPT=9102 SEQ=1481676381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7712C20000000001030307) 
Oct 14 09:15:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20523 DF PROTO=TCP SPT=40236 DPT=9100 SEQ=3871071855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7730C70000000001030307) 
Oct 14 09:15:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20524 DF PROTO=TCP SPT=40236 DPT=9100 SEQ=3871071855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7734C10000000001030307) 
Oct 14 09:15:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20525 DF PROTO=TCP SPT=40236 DPT=9100 SEQ=3871071855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F773CC10000000001030307) 
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: tmp-crun.gbv20u.mount: Deactivated successfully.
Oct 14 09:15:10 np0005486759.ooo.test podman[133074]: 2025-10-14 09:15:10.205101223 +0000 UTC m=+0.087112917 container health_status 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-type=git, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, 
maintainer=OpenStack TripleO Team, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, config_id=tripleo_step3, distribution-scope=public, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2)
Oct 14 09:15:10 np0005486759.ooo.test podman[133074]: 2025-10-14 09:15:10.215298099 +0000 UTC m=+0.097309773 container exec_died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-07-21T13:27:15, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, name=rhosp17/openstack-iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.9, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, release=1, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20250721.1)
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Deactivated successfully.
Oct 14 09:15:10 np0005486759.ooo.test podman[133075]: 2025-10-14 09:15:10.256918372 +0000 UTC m=+0.130538065 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, release=1)
Oct 14 09:15:10 np0005486759.ooo.test podman[133076]: Error: container b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 is not running
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Main process exited, code=exited, status=125/n/a
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Failed with result 'exit-code'.
Oct 14 09:15:10 np0005486759.ooo.test podman[133075]: 2025-10-14 09:15:10.328849887 +0000 UTC m=+0.202469630 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, release=1)
Oct 14 09:15:10 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:15:11 np0005486759.ooo.test systemd[1]: tmp-crun.1eZUc0.mount: Deactivated successfully.
Oct 14 09:15:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20526 DF PROTO=TCP SPT=40236 DPT=9100 SEQ=3871071855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F774C820000000001030307) 
Oct 14 09:15:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7530 DF PROTO=TCP SPT=44724 DPT=9882 SEQ=3285435344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7750180000000001030307) 
Oct 14 09:15:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7531 DF PROTO=TCP SPT=44724 DPT=9882 SEQ=3285435344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7754020000000001030307) 
Oct 14 09:15:14 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30625 DF PROTO=TCP SPT=44626 DPT=9105 SEQ=898723188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7756CC0000000001030307) 
Oct 14 09:15:14 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28047 DF PROTO=TCP SPT=40092 DPT=9101 SEQ=2290872425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7758860000000001030307) 
Oct 14 09:15:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30627 DF PROTO=TCP SPT=44626 DPT=9105 SEQ=898723188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7762C10000000001030307) 
Oct 14 09:15:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7533 DF PROTO=TCP SPT=44724 DPT=9882 SEQ=3285435344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F776BC10000000001030307) 
Oct 14 09:15:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:15:19 np0005486759.ooo.test podman[133126]: 2025-10-14 09:15:19.959259366 +0000 UTC m=+0.077321893 container health_status 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, release=1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, name=rhosp17/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc.)
Oct 14 09:15:19 np0005486759.ooo.test podman[133126]: 2025-10-14 09:15:19.968214904 +0000 UTC m=+0.086277381 container exec_died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, version=17.1.9, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, managed_by=tripleo_ansible, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, build-date=2025-07-21T13:07:52, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Oct 14 09:15:19 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Deactivated successfully.
Oct 14 09:15:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:15:21 np0005486759.ooo.test systemd[1]: tmp-crun.uun6iq.mount: Deactivated successfully.
Oct 14 09:15:21 np0005486759.ooo.test podman[133146]: 2025-10-14 09:15:21.441509748 +0000 UTC m=+0.071423190 container health_status fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, name=rhosp17/openstack-qdrouterd, version=17.1.9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, container_name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, tcib_managed=true)
Oct 14 09:15:21 np0005486759.ooo.test podman[133146]: 2025-10-14 09:15:21.623618945 +0000 UTC m=+0.253532357 container exec_died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20250721.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, build-date=2025-07-21T13:07:59, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, distribution-scope=public)
Oct 14 09:15:21 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Deactivated successfully.
Oct 14 09:15:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26982 DF PROTO=TCP SPT=58714 DPT=9102 SEQ=2204221430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7778420000000001030307) 
Oct 14 09:15:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:15:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:15:24 np0005486759.ooo.test podman[133176]: 2025-10-14 09:15:24.689462176 +0000 UTC m=+0.060119328 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2025-07-21T13:28:44, container_name=ovn_controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, version=17.1.9, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.33.12, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc.)
Oct 14 09:15:24 np0005486759.ooo.test podman[133176]: 2025-10-14 09:15:24.733381971 +0000 UTC m=+0.104039113 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., release=1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, io.buildah.version=1.33.12)
Oct 14 09:15:24 np0005486759.ooo.test podman[133176]: unhealthy
Oct 14 09:15:24 np0005486759.ooo.test systemd[1]: tmp-crun.n8SoDZ.mount: Deactivated successfully.
Oct 14 09:15:24 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:15:24 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:15:24 np0005486759.ooo.test podman[133175]: 2025-10-14 09:15:24.757729877 +0000 UTC m=+0.128539013 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.33.12, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=17.1.9, container_name=ovn_metadata_agent, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1)
Oct 14 09:15:24 np0005486759.ooo.test podman[133175]: 2025-10-14 09:15:24.771277417 +0000 UTC m=+0.142086483 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.33.12, build-date=2025-07-21T16:28:53, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:15:24 np0005486759.ooo.test podman[133175]: unhealthy
Oct 14 09:15:24 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:15:24 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:15:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:15:26 np0005486759.ooo.test podman[133212]: 2025-10-14 09:15:26.441651693 +0000 UTC m=+0.070206822 container health_status 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-07-21T14:48:37, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1)
Oct 14 09:15:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26983 DF PROTO=TCP SPT=58714 DPT=9102 SEQ=2204221430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7788020000000001030307) 
Oct 14 09:15:26 np0005486759.ooo.test podman[133212]: 2025-10-14 09:15:26.817553799 +0000 UTC m=+0.446108898 container exec_died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20250721.1, build-date=2025-07-21T14:48:37, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, vendor=Red Hat, Inc.)
Oct 14 09:15:26 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Deactivated successfully.
Oct 14 09:15:28 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:15:28 np0005486759.ooo.test recover_tripleo_nova_virtqemud[133236]: 47951
Oct 14 09:15:28 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:15:28 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:15:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8866 DF PROTO=TCP SPT=60352 DPT=9100 SEQ=4158542402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77A5F80000000001030307) 
Oct 14 09:15:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8867 DF PROTO=TCP SPT=60352 DPT=9100 SEQ=4158542402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77AA020000000001030307) 
Oct 14 09:15:36 np0005486759.ooo.test podman[132996]: time="2025-10-14T09:15:36Z" level=warning msg="StopSignal SIGTERM failed to stop container collectd in 42 seconds, resorting to SIGKILL"
Oct 14 09:15:36 np0005486759.ooo.test podman[132996]: 2025-10-14 09:15:36.351964696 +0000 UTC m=+42.136368051 container stop b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, version=17.1.9, container_name=collectd, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, build-date=2025-07-21T13:04:03, release=2, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: libpod-b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.scope: Deactivated successfully.
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: libpod-b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.scope: Consumed 1.851s CPU time.
Oct 14 09:15:36 np0005486759.ooo.test podman[132996]: 2025-10-14 09:15:36.385178968 +0000 UTC m=+42.169582333 container died b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=2, config_id=tripleo_step3, io.buildah.version=1.33.12, version=17.1.9, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, tcib_managed=true, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-07-21T13:04:03, container_name=collectd, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2)
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.timer: Deactivated successfully.
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Failed to open /run/systemd/transient/b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: No such file or directory
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26-userdata-shm.mount: Deactivated successfully.
Oct 14 09:15:36 np0005486759.ooo.test podman[132996]: 2025-10-14 09:15:36.428264616 +0000 UTC m=+42.212667961 container cleanup b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, release=2, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T13:04:03, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.33.12)
Oct 14 09:15:36 np0005486759.ooo.test podman[132996]: collectd
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.timer: Failed to open /run/systemd/transient/b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.timer: No such file or directory
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Failed to open /run/systemd/transient/b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: No such file or directory
Oct 14 09:15:36 np0005486759.ooo.test podman[133238]: 2025-10-14 09:15:36.459310111 +0000 UTC m=+0.092684711 container cleanup b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=2, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, batch=17.1_20250721.1, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, container_name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack 
osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, version=17.1.9, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-07-21T13:04:03)
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: libpod-conmon-b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.scope: Deactivated successfully.
Oct 14 09:15:36 np0005486759.ooo.test podman[133269]: error opening file `/run/crun/b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26/status`: No such file or directory
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.timer: Failed to open /run/systemd/transient/b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.timer: No such file or directory
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: Failed to open /run/systemd/transient/b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26.service: No such file or directory
Oct 14 09:15:36 np0005486759.ooo.test podman[133257]: 2025-10-14 09:15:36.532115262 +0000 UTC m=+0.051930374 container cleanup b6fde1450ca47a2b8585776ca6ec1df3cc516d8d33cc9f34f03fbc4ba771fb26 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=2, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-collectd/images/17.1.9-2, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20250721.1, build-date=2025-07-21T13:04:03, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.33.12, version=17.1.9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '324199e84b6ced954fd0cecf75a965ca'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-ref=1c67cc222531545f43af554407dce9103c5ddf0b, container_name=collectd)
Oct 14 09:15:36 np0005486759.ooo.test podman[133257]: collectd
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: tripleo_collectd.service: Deactivated successfully.
Oct 14 09:15:36 np0005486759.ooo.test systemd[1]: Stopped collectd container.
Oct 14 09:15:36 np0005486759.ooo.test sudo[132951]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:36 np0005486759.ooo.test sudo[133360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufldbxldchzsgbgltyengbgksvhraoaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433336.6701727-51-161576400698727/AnsiballZ_systemd_service.py
Oct 14 09:15:36 np0005486759.ooo.test sudo[133360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:37 np0005486759.ooo.test python3.9[133362]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:15:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8868 DF PROTO=TCP SPT=60352 DPT=9100 SEQ=4158542402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77B2010000000001030307) 
Oct 14 09:15:37 np0005486759.ooo.test systemd-rc-local-generator[133387]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:15:37 np0005486759.ooo.test systemd-sysv-generator[133391]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6160dcf8522bce4a4dbb2d001d64131000991ae258c303cf32d4eb798c35afe5-merged.mount: Deactivated successfully.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: Stopping iscsid container...
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: libpod-6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.scope: Deactivated successfully.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: libpod-6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.scope: Consumed 1.040s CPU time.
Oct 14 09:15:37 np0005486759.ooo.test podman[133402]: 2025-10-14 09:15:37.783474991 +0000 UTC m=+0.078609122 container died 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.33.12, container_name=iscsid, version=17.1.9, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, io.openshift.expose-services=)
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.timer: Deactivated successfully.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Failed to open /run/systemd/transient/6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: No such file or directory
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: tmp-crun.ZyZ7E5.mount: Deactivated successfully.
Oct 14 09:15:37 np0005486759.ooo.test podman[133402]: 2025-10-14 09:15:37.830154692 +0000 UTC m=+0.125288823 container cleanup 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, version=17.1.9, config_id=tripleo_step3, release=1, maintainer=OpenStack TripleO Team, build-date=2025-07-21T13:27:15, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Oct 14 09:15:37 np0005486759.ooo.test podman[133402]: iscsid
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.timer: Failed to open /run/systemd/transient/6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.timer: No such file or directory
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Failed to open /run/systemd/transient/6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: No such file or directory
Oct 14 09:15:37 np0005486759.ooo.test podman[133417]: 2025-10-14 09:15:37.865437048 +0000 UTC m=+0.069615543 container cleanup 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.33.12, vendor=Red Hat, Inc., release=1, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20250721.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-07-21T13:27:15, name=rhosp17/openstack-iscsid, version=17.1.9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1)
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: libpod-conmon-6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.scope: Deactivated successfully.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.timer: Failed to open /run/systemd/transient/6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.timer: No such file or directory
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: Failed to open /run/systemd/transient/6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301.service: No such file or directory
Oct 14 09:15:37 np0005486759.ooo.test podman[133429]: 2025-10-14 09:15:37.965499016 +0000 UTC m=+0.072894026 container cleanup 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-type=git, io.buildah.version=1.33.12, config_id=tripleo_step3, version=17.1.9, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=iscsid, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, architecture=x86_64, build-date=2025-07-21T13:27:15, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.openshift.expose-services=)
Oct 14 09:15:37 np0005486759.ooo.test podman[133429]: iscsid
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Oct 14 09:15:37 np0005486759.ooo.test systemd[1]: Stopped iscsid container.
Oct 14 09:15:38 np0005486759.ooo.test sudo[133360]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:38 np0005486759.ooo.test sudo[133531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiqqhnfwtvidbigiyxdqpdnmuncqadsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433338.1346862-51-136098192982454/AnsiballZ_systemd_service.py
Oct 14 09:15:38 np0005486759.ooo.test sudo[133531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46260fec5bdd83621572ffa7df8d8d0e559aaff55a250b87810024aca47c9813-merged.mount: Deactivated successfully.
Oct 14 09:15:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301-userdata-shm.mount: Deactivated successfully.
Oct 14 09:15:38 np0005486759.ooo.test python3.9[133533]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:38 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:15:38 np0005486759.ooo.test systemd-rc-local-generator[133564]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:15:38 np0005486759.ooo.test systemd-sysv-generator[133567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:15:38 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: Stopping logrotate_crond container...
Oct 14 09:15:39 np0005486759.ooo.test crond[53838]: (CRON) INFO (Shutting down)
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: libpod-0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.scope: Deactivated successfully.
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: libpod-0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.scope: Consumed 1.244s CPU time.
Oct 14 09:15:39 np0005486759.ooo.test podman[133575]: 2025-10-14 09:15:39.17193064 +0000 UTC m=+0.056730753 container died 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.9, name=rhosp17/openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, container_name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20250721.1, build-date=2025-07-21T13:07:52, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.timer: Deactivated successfully.
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Failed to open /run/systemd/transient/0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: No such file or directory
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9-userdata-shm.mount: Deactivated successfully.
Oct 14 09:15:39 np0005486759.ooo.test podman[133575]: 2025-10-14 09:15:39.219341203 +0000 UTC m=+0.104141286 container cleanup 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.33.12, version=17.1.9, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, 
summary=Red Hat OpenStack Platform 17.1 cron, release=1, vcs-type=git, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c)
Oct 14 09:15:39 np0005486759.ooo.test podman[133575]: logrotate_crond
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.timer: Failed to open /run/systemd/transient/0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.timer: No such file or directory
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Failed to open /run/systemd/transient/0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: No such file or directory
Oct 14 09:15:39 np0005486759.ooo.test podman[133589]: 2025-10-14 09:15:39.296646974 +0000 UTC m=+0.115902641 container cleanup 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-07-21T13:07:52, config_id=tripleo_step4, io.buildah.version=1.33.12, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, batch=17.1_20250721.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64)
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: libpod-conmon-0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.scope: Deactivated successfully.
Oct 14 09:15:39 np0005486759.ooo.test podman[133619]: error opening file `/run/crun/0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9/status`: No such file or directory
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.timer: Failed to open /run/systemd/transient/0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.timer: No such file or directory
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: Failed to open /run/systemd/transient/0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9.service: No such file or directory
Oct 14 09:15:39 np0005486759.ooo.test podman[133607]: 2025-10-14 09:15:39.391697676 +0000 UTC m=+0.060163979 container cleanup 0a6068f62063b2114debbb1e033b719c0ad06d080f9b0b96d4ac889b9cb772b9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=1cbdeb2f9fe67da66c8007dc1c7f4220cefddf6c, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20250721.1, release=1, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=logrotate_crond, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-07-21T13:07:52, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-cron/images/17.1.9-1, version=17.1.9, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron)
Oct 14 09:15:39 np0005486759.ooo.test podman[133607]: logrotate_crond
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: Stopped logrotate_crond container.
Oct 14 09:15:39 np0005486759.ooo.test sudo[133531]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ada07e432d0e43eebd550951648a1927a38ab08f9b982361ae15057deb14876d-merged.mount: Deactivated successfully.
Oct 14 09:15:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:15:40 np0005486759.ooo.test podman[133681]: 2025-10-14 09:15:40.429580485 +0000 UTC m=+0.064438872 container health_status aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20250721.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, release=1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, config_id=tripleo_step5, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_compute)
Oct 14 09:15:40 np0005486759.ooo.test podman[133681]: 2025-10-14 09:15:40.454795119 +0000 UTC m=+0.089653466 container exec_died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, container_name=nova_compute, version=17.1.9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, release=1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, build-date=2025-07-21T14:48:37, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Oct 14 09:15:40 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Deactivated successfully.
Oct 14 09:15:40 np0005486759.ooo.test sudo[133735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lecjmlhpjiayeeduznxdsouqksybtyrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433340.1114168-51-80466270971118/AnsiballZ_systemd_service.py
Oct 14 09:15:40 np0005486759.ooo.test sudo[133735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:40 np0005486759.ooo.test python3.9[133737]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:40 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:15:40 np0005486759.ooo.test systemd-sysv-generator[133764]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:15:40 np0005486759.ooo.test systemd-rc-local-generator[133761]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: Stopping metrics_qdr container...
Oct 14 09:15:41 np0005486759.ooo.test kernel: qdrouterd[42562]: segfault at 0 ip 00007f65d6f4e7cb sp 00007ffe80353d30 error 4 in libc.so.6[7f65d6eeb000+175000]
Oct 14 09:15:41 np0005486759.ooo.test kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: Created slice Slice /system/systemd-coredump.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: Started Process Core Dump (PID 133791/UID 0).
Oct 14 09:15:41 np0005486759.ooo.test systemd-coredump[133792]: Resource limits disable core dumping for process 42562 (qdrouterd).
Oct 14 09:15:41 np0005486759.ooo.test systemd-coredump[133792]: Process 42562 (qdrouterd) of user 42465 dumped core.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: systemd-coredump@0-133791-0.service: Deactivated successfully.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: libpod-fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.scope: Deactivated successfully.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: libpod-fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.scope: Consumed 25.921s CPU time.
Oct 14 09:15:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8869 DF PROTO=TCP SPT=60352 DPT=9100 SEQ=4158542402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77C1C20000000001030307) 
Oct 14 09:15:41 np0005486759.ooo.test podman[133778]: 2025-10-14 09:15:41.395098376 +0000 UTC m=+0.193111099 container died fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, vcs-type=git, build-date=2025-07-21T13:07:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-qdrouterd-container)
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.timer: Deactivated successfully.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Failed to open /run/systemd/transient/fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: No such file or directory
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e-userdata-shm.mount: Deactivated successfully.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19103ee73e807f98da7458e81c363a7b60ad569c451bd8f50c0daf39d2101319-merged.mount: Deactivated successfully.
Oct 14 09:15:41 np0005486759.ooo.test podman[133778]: 2025-10-14 09:15:41.428517464 +0000 UTC m=+0.226530127 container cleanup fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.9, build-date=2025-07-21T13:07:59, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-type=git, 
batch=17.1_20250721.1, architecture=x86_64, release=1, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1)
Oct 14 09:15:41 np0005486759.ooo.test podman[133778]: metrics_qdr
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.timer: Failed to open /run/systemd/transient/fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.timer: No such file or directory
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Failed to open /run/systemd/transient/fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: No such file or directory
Oct 14 09:15:41 np0005486759.ooo.test podman[133796]: 2025-10-14 09:15:41.461087276 +0000 UTC m=+0.057639312 container cleanup fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, release=1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20250721.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, version=17.1.9, 
build-date=2025-07-21T13:07:59, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=metrics_qdr, managed_by=tripleo_ansible, architecture=x86_64)
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: libpod-conmon-fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.scope: Deactivated successfully.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.timer: Failed to open /run/systemd/transient/fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.timer: No such file or directory
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: Failed to open /run/systemd/transient/fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e.service: No such file or directory
Oct 14 09:15:41 np0005486759.ooo.test podman[133808]: 2025-10-14 09:15:41.536382185 +0000 UTC m=+0.039799737 container cleanup fa97b98c7788d63d1ebd0dde02b2a140ef99b6fdf8dd0de34a87e4830fa9a85e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1, config_id=tripleo_step1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2887d8c13a95df5ab0d7c0a262884982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-qdrouterd/images/17.1.9-1, 
build-date=2025-07-21T13:07:59, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=4a9cf7084a7631a8cf28014f76f8f9d6da5b1fed, version=17.1.9, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1, io.openshift.expose-services=, batch=17.1_20250721.1)
Oct 14 09:15:41 np0005486759.ooo.test podman[133808]: metrics_qdr
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Oct 14 09:15:41 np0005486759.ooo.test systemd[1]: Stopped metrics_qdr container.
Oct 14 09:15:41 np0005486759.ooo.test sudo[133735]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:41 np0005486759.ooo.test sudo[133910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxbjzmjfgsjhuzoeabccqwcsxlerauue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433341.6732898-51-253047232108712/AnsiballZ_systemd_service.py
Oct 14 09:15:41 np0005486759.ooo.test sudo[133910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:42 np0005486759.ooo.test python3.9[133912]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16721 DF PROTO=TCP SPT=59386 DPT=9882 SEQ=378290367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77C5480000000001030307) 
Oct 14 09:15:42 np0005486759.ooo.test sudo[133910]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:43 np0005486759.ooo.test sudo[134003]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftpabwfhzxrewsesqiycohpihgthewym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433342.9135842-51-150938187282382/AnsiballZ_systemd_service.py
Oct 14 09:15:43 np0005486759.ooo.test sudo[134003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16722 DF PROTO=TCP SPT=59386 DPT=9882 SEQ=378290367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77C9410000000001030307) 
Oct 14 09:15:43 np0005486759.ooo.test python3.9[134005]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:43 np0005486759.ooo.test sudo[134003]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:43 np0005486759.ooo.test sudo[134096]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqlksxnxzhodntqtmxlturltvbqavjly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433343.6181166-51-188757991779123/AnsiballZ_systemd_service.py
Oct 14 09:15:43 np0005486759.ooo.test sudo[134096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:44 np0005486759.ooo.test python3.9[134098]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:44 np0005486759.ooo.test sudo[134096]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:44 np0005486759.ooo.test sudo[134189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqakjasrollxahkmfgkqnmwasjsvpizq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433344.3178852-51-48116032071641/AnsiballZ_systemd_service.py
Oct 14 09:15:44 np0005486759.ooo.test sudo[134189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:44 np0005486759.ooo.test python3.9[134191]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:44 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:15:45 np0005486759.ooo.test systemd-sysv-generator[134220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:15:45 np0005486759.ooo.test systemd-rc-local-generator[134216]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:15:45 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:15:45 np0005486759.ooo.test systemd[1]: Stopping nova_compute container...
Oct 14 09:15:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49339 DF PROTO=TCP SPT=41468 DPT=9105 SEQ=958177309 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77D8020000000001030307) 
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: libpod-aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.scope: Deactivated successfully.
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: libpod-aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.scope: Consumed 18.026s CPU time.
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: session-c11.scope: Deactivated successfully.
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: session-c12.scope: Deactivated successfully.
Oct 14 09:15:47 np0005486759.ooo.test podman[134232]: 2025-10-14 09:15:47.328293804 +0000 UTC m=+2.086874203 container died aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, version=17.1.9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-07-21T14:48:37, config_id=tripleo_step5, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.buildah.version=1.33.12, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container)
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.timer: Deactivated successfully.
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Failed to open /run/systemd/transient/aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: No such file or directory
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-dda03ea32496994d88a46cb581656ffbbab2c391369247a306314cbff2d505cd-merged.mount: Deactivated successfully.
Oct 14 09:15:47 np0005486759.ooo.test podman[134232]: 2025-10-14 09:15:47.383444637 +0000 UTC m=+2.142024956 container cleanup aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, version=17.1.9, container_name=nova_compute, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 09:15:47 np0005486759.ooo.test podman[134232]: nova_compute
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.timer: Failed to open /run/systemd/transient/aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.timer: No such file or directory
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Failed to open /run/systemd/transient/aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: No such file or directory
Oct 14 09:15:47 np0005486759.ooo.test podman[134246]: 2025-10-14 09:15:47.458359615 +0000 UTC m=+0.119314928 container cleanup aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-07-21T14:48:37, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.9, tcib_managed=true, release=1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d)
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: libpod-conmon-aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.scope: Deactivated successfully.
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.timer: Failed to open /run/systemd/transient/aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.timer: No such file or directory
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: Failed to open /run/systemd/transient/aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb.service: No such file or directory
Oct 14 09:15:47 np0005486759.ooo.test podman[134260]: 2025-10-14 09:15:47.543443657 +0000 UTC m=+0.050959984 container cleanup aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-07-21T14:48:37, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.33.12, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, release=1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.9, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute)
Oct 14 09:15:47 np0005486759.ooo.test podman[134260]: nova_compute
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Oct 14 09:15:47 np0005486759.ooo.test systemd[1]: Stopped nova_compute container.
Oct 14 09:15:47 np0005486759.ooo.test sudo[134189]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:47 np0005486759.ooo.test sudo[134361]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhumxjzwbbexnlmimnjqgwlaompaonfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433347.6818097-51-1113759860915/AnsiballZ_systemd_service.py
Oct 14 09:15:47 np0005486759.ooo.test sudo[134361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:48 np0005486759.ooo.test python3.9[134363]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:15:48 np0005486759.ooo.test systemd-sysv-generator[134390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:15:48 np0005486759.ooo.test systemd-rc-local-generator[134386]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: Stopping nova_migration_target container...
Oct 14 09:15:48 np0005486759.ooo.test sshd[53907]: Received signal 15; terminating.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: libpod-44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.scope: Deactivated successfully.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: libpod-44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.scope: Consumed 39.737s CPU time.
Oct 14 09:15:48 np0005486759.ooo.test podman[134404]: 2025-10-14 09:15:48.71443451 +0000 UTC m=+0.065539086 container died 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1, distribution-scope=public, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, 
architecture=x86_64, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git)
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.timer: Deactivated successfully.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Failed to open /run/systemd/transient/44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: No such file or directory
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: tmp-crun.kl8PDJ.mount: Deactivated successfully.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929-userdata-shm.mount: Deactivated successfully.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-137c1921cf29a18f5788ee7ca89cb32d77b40e4f2bc3359cd1d75d04c15761c5-merged.mount: Deactivated successfully.
Oct 14 09:15:48 np0005486759.ooo.test podman[134404]: 2025-10-14 09:15:48.767699735 +0000 UTC m=+0.118804331 container cleanup 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, container_name=nova_migration_target, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, config_id=tripleo_step4, distribution-scope=public, 
release=1, name=rhosp17/openstack-nova-compute, batch=17.1_20250721.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, build-date=2025-07-21T14:48:37, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Oct 14 09:15:48 np0005486759.ooo.test podman[134404]: nova_migration_target
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.timer: Failed to open /run/systemd/transient/44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.timer: No such file or directory
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Failed to open /run/systemd/transient/44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: No such file or directory
Oct 14 09:15:48 np0005486759.ooo.test podman[134416]: 2025-10-14 09:15:48.835201571 +0000 UTC m=+0.115679924 container cleanup 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.9, batch=17.1_20250721.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., release=1, vcs-type=git, build-date=2025-07-21T14:48:37, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, io.buildah.version=1.33.12)
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: libpod-conmon-44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.scope: Deactivated successfully.
Oct 14 09:15:48 np0005486759.ooo.test systemd-journald[35787]: Data hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.0 (53727 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Oct 14 09:15:48 np0005486759.ooo.test systemd-journald[35787]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 09:15:48 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.timer: Failed to open /run/systemd/transient/44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.timer: No such file or directory
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: Failed to open /run/systemd/transient/44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929.service: No such file or directory
Oct 14 09:15:48 np0005486759.ooo.test podman[134430]: 2025-10-14 09:15:48.927463397 +0000 UTC m=+0.065553557 container cleanup 44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.33.12, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, build-date=2025-07-21T14:48:37, vendor=Red Hat, Inc., release=1)
Oct 14 09:15:48 np0005486759.ooo.test podman[134430]: nova_migration_target
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Oct 14 09:15:48 np0005486759.ooo.test systemd[1]: Stopped nova_migration_target container.
Oct 14 09:15:48 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:15:48 np0005486759.ooo.test sudo[134361]: pam_unix(sudo:session): session closed for user root
Oct 14 09:15:49 np0005486759.ooo.test sudo[134533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvcmerlzywbxzfbmyhvhkyhdcjwzudsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433349.0588298-51-134722273987479/AnsiballZ_systemd_service.py
Oct 14 09:15:49 np0005486759.ooo.test sudo[134533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:15:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16724 DF PROTO=TCP SPT=59386 DPT=9882 SEQ=378290367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77E1010000000001030307) 
Oct 14 09:15:49 np0005486759.ooo.test python3.9[134535]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:15:49 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:15:49 np0005486759.ooo.test systemd-rc-local-generator[134560]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:15:49 np0005486759.ooo.test systemd-sysv-generator[134563]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:15:49 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:15:50 np0005486759.ooo.test systemd[1]: Stopping nova_virtlogd_wrapper container...
Oct 14 09:15:50 np0005486759.ooo.test systemd[1]: libpod-87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef.scope: Deactivated successfully.
Oct 14 09:15:50 np0005486759.ooo.test podman[134575]: 2025-10-14 09:15:50.100267437 +0000 UTC m=+0.062414869 container died 87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:56:59, io.openshift.expose-services=, distribution-scope=public, container_name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., batch=17.1_20250721.1, com.redhat.component=openstack-nova-libvirt-container, version=17.1.9, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, release=2, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']})
Oct 14 09:15:50 np0005486759.ooo.test systemd[1]: tmp-crun.PhFAPA.mount: Deactivated successfully.
Oct 14 09:15:50 np0005486759.ooo.test podman[134575]: 2025-10-14 09:15:50.140888429 +0000 UTC m=+0.103035831 container cleanup 87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, version=17.1.9, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, architecture=x86_64, build-date=2025-07-21T14:56:59, release=2, container_name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team)
Oct 14 09:15:50 np0005486759.ooo.test podman[134575]: nova_virtlogd_wrapper
Oct 14 09:15:50 np0005486759.ooo.test podman[134588]: 2025-10-14 09:15:50.161251282 +0000 UTC m=+0.052712329 container cleanup 87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.33.12, vcs-type=git, container_name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., release=2, version=17.1.9, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:56:59, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2)
Oct 14 09:15:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb-merged.mount: Deactivated successfully.
Oct 14 09:15:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef-userdata-shm.mount: Deactivated successfully.
Oct 14 09:15:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47593 DF PROTO=TCP SPT=36232 DPT=9102 SEQ=2129856686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77ED810000000001030307) 
Oct 14 09:15:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:15:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:15:55 np0005486759.ooo.test podman[134604]: 2025-10-14 09:15:55.209804189 +0000 UTC m=+0.089436340 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20250721.1, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.33.12)
Oct 14 09:15:55 np0005486759.ooo.test podman[134604]: 2025-10-14 09:15:55.247973304 +0000 UTC m=+0.127605435 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, 
batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.9, distribution-scope=public, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team)
Oct 14 09:15:55 np0005486759.ooo.test systemd[1]: tmp-crun.ixFkI3.mount: Deactivated successfully.
Oct 14 09:15:55 np0005486759.ooo.test podman[134604]: unhealthy
Oct 14 09:15:55 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:15:55 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:15:55 np0005486759.ooo.test podman[134605]: 2025-10-14 09:15:55.275464658 +0000 UTC m=+0.152105176 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20250721.1, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, container_name=ovn_controller, release=1, build-date=2025-07-21T13:28:44, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:15:55 np0005486759.ooo.test podman[134605]: 2025-10-14 09:15:55.288339208 +0000 UTC m=+0.164979726 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1, tcib_managed=true, build-date=2025-07-21T13:28:44, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., batch=17.1_20250721.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.33.12)
Oct 14 09:15:55 np0005486759.ooo.test podman[134605]: unhealthy
Oct 14 09:15:55 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:15:55 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:15:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47594 DF PROTO=TCP SPT=36232 DPT=9102 SEQ=2129856686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F77FD410000000001030307) 
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 0...
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Activating special unit Exit the Session...
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Removed slice User Background Tasks Slice.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Stopped target Main User Target.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Stopped target Basic System.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Stopped target Paths.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Stopped target Sockets.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Stopped target Timers.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Closed D-Bus User Message Bus Socket.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Stopped Create User's Volatile Files and Directories.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Removed slice User Application Slice.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Reached target Shutdown.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Finished Exit the Session.
Oct 14 09:15:57 np0005486759.ooo.test systemd[93754]: Reached target Exit the Session.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: user@0.service: Deactivated successfully.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 0.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: user@0.service: Consumed 3.339s CPU time, no IO.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 0.
Oct 14 09:15:57 np0005486759.ooo.test systemd[1]: user-0.slice: Consumed 5.083s CPU time.
Oct 14 09:16:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20012 DF PROTO=TCP SPT=46566 DPT=9100 SEQ=1860859516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F781B280000000001030307) 
Oct 14 09:16:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20013 DF PROTO=TCP SPT=46566 DPT=9100 SEQ=1860859516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F781F420000000001030307) 
Oct 14 09:16:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20014 DF PROTO=TCP SPT=46566 DPT=9100 SEQ=1860859516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7827410000000001030307) 
Oct 14 09:16:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20015 DF PROTO=TCP SPT=46566 DPT=9100 SEQ=1860859516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7837010000000001030307) 
Oct 14 09:16:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52892 DF PROTO=TCP SPT=54590 DPT=9882 SEQ=4187923453 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F783A770000000001030307) 
Oct 14 09:16:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52893 DF PROTO=TCP SPT=54590 DPT=9882 SEQ=4187923453 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F783E810000000001030307) 
Oct 14 09:16:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41718 DF PROTO=TCP SPT=40562 DPT=9105 SEQ=186599023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F784D410000000001030307) 
Oct 14 09:16:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52895 DF PROTO=TCP SPT=54590 DPT=9882 SEQ=4187923453 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7856410000000001030307) 
Oct 14 09:16:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35879 DF PROTO=TCP SPT=33442 DPT=9102 SEQ=611250407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7862C10000000001030307) 
Oct 14 09:16:24 np0005486759.ooo.test sshd[134647]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:16:25 np0005486759.ooo.test sshd[134647]: Connection closed by authenticating user root 199.195.254.152 port 47394 [preauth]
Oct 14 09:16:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:16:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:16:25 np0005486759.ooo.test systemd[1]: tmp-crun.poA8Td.mount: Deactivated successfully.
Oct 14 09:16:25 np0005486759.ooo.test podman[134649]: 2025-10-14 09:16:25.479266813 +0000 UTC m=+0.099222052 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, release=1, version=17.1.9)
Oct 14 09:16:25 np0005486759.ooo.test podman[134650]: 2025-10-14 09:16:25.521117904 +0000 UTC m=+0.136671637 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.33.12, managed_by=tripleo_ansible, version=17.1.9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, build-date=2025-07-21T13:28:44, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, release=1, batch=17.1_20250721.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git)
Oct 14 09:16:25 np0005486759.ooo.test podman[134650]: 2025-10-14 09:16:25.540378392 +0000 UTC m=+0.155932145 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-07-21T13:28:44, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, vcs-type=git, io.buildah.version=1.33.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.9, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:16:25 np0005486759.ooo.test podman[134650]: unhealthy
Oct 14 09:16:25 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:16:25 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:16:25 np0005486759.ooo.test podman[134649]: 2025-10-14 09:16:25.593162392 +0000 UTC m=+0.213117661 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, release=1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, version=17.1.9, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Oct 14 09:16:25 np0005486759.ooo.test podman[134649]: unhealthy
Oct 14 09:16:25 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:16:25 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:16:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35880 DF PROTO=TCP SPT=33442 DPT=9102 SEQ=611250407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7872820000000001030307) 
Oct 14 09:16:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27219 DF PROTO=TCP SPT=33420 DPT=9100 SEQ=3297079265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7890580000000001030307) 
Oct 14 09:16:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27220 DF PROTO=TCP SPT=33420 DPT=9100 SEQ=3297079265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7894410000000001030307) 
Oct 14 09:16:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27221 DF PROTO=TCP SPT=33420 DPT=9100 SEQ=3297079265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F789C410000000001030307) 
Oct 14 09:16:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27222 DF PROTO=TCP SPT=33420 DPT=9100 SEQ=3297079265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F78AC010000000001030307) 
Oct 14 09:16:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65094 DF PROTO=TCP SPT=60982 DPT=9882 SEQ=3472679171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F78AFA80000000001030307) 
Oct 14 09:16:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65095 DF PROTO=TCP SPT=60982 DPT=9882 SEQ=3472679171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F78B3C10000000001030307) 
Oct 14 09:16:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47377 DF PROTO=TCP SPT=42924 DPT=9105 SEQ=1254262273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F78C2810000000001030307) 
Oct 14 09:16:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65097 DF PROTO=TCP SPT=60982 DPT=9882 SEQ=3472679171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F78CB810000000001030307) 
Oct 14 09:16:51 np0005486759.ooo.test systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Oct 14 09:16:51 np0005486759.ooo.test recover_tripleo_nova_virtqemud[134690]: 47951
Oct 14 09:16:51 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Oct 14 09:16:51 np0005486759.ooo.test systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Oct 14 09:16:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19355 DF PROTO=TCP SPT=54016 DPT=9102 SEQ=3340854708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F78D7C10000000001030307) 
Oct 14 09:16:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:16:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:16:55 np0005486759.ooo.test podman[134691]: 2025-10-14 09:16:55.698007991 +0000 UTC m=+0.077948112 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-07-21T13:28:44, version=17.1.9, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Oct 14 09:16:55 np0005486759.ooo.test podman[134691]: 2025-10-14 09:16:55.712407008 +0000 UTC m=+0.092347159 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.33.12, version=17.1.9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, config_id=tripleo_step4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, distribution-scope=public, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-07-21T13:28:44, batch=17.1_20250721.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:16:55 np0005486759.ooo.test podman[134691]: unhealthy
Oct 14 09:16:55 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:16:55 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:16:55 np0005486759.ooo.test podman[134692]: 2025-10-14 09:16:55.755094514 +0000 UTC m=+0.128170681 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, tcib_managed=true, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Oct 14 09:16:55 np0005486759.ooo.test podman[134692]: 2025-10-14 09:16:55.791928889 +0000 UTC m=+0.165005106 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, batch=17.1_20250721.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Oct 14 09:16:55 np0005486759.ooo.test podman[134692]: unhealthy
Oct 14 09:16:55 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:16:55 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:16:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19356 DF PROTO=TCP SPT=54016 DPT=9102 SEQ=3340854708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F78E7810000000001030307) 
Oct 14 09:17:00 np0005486759.ooo.test sshd[134729]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:17:00 np0005486759.ooo.test sshd[134729]: Invalid user  from 64.62.156.56 port 48427
Oct 14 09:17:04 np0005486759.ooo.test sshd[134729]: Connection closed by invalid user  64.62.156.56 port 48427 [preauth]
Oct 14 09:17:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60676 DF PROTO=TCP SPT=35672 DPT=9100 SEQ=3991269445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7905870000000001030307) 
Oct 14 09:17:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60677 DF PROTO=TCP SPT=35672 DPT=9100 SEQ=3991269445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7909810000000001030307) 
Oct 14 09:17:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60678 DF PROTO=TCP SPT=35672 DPT=9100 SEQ=3991269445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7911810000000001030307) 
Oct 14 09:17:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60679 DF PROTO=TCP SPT=35672 DPT=9100 SEQ=3991269445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7921410000000001030307) 
Oct 14 09:17:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49191 DF PROTO=TCP SPT=43070 DPT=9882 SEQ=3273838588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7924D80000000001030307) 
Oct 14 09:17:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49192 DF PROTO=TCP SPT=43070 DPT=9882 SEQ=3273838588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7928C10000000001030307) 
Oct 14 09:17:14 np0005486759.ooo.test systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Oct 14 09:17:14 np0005486759.ooo.test systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 47186 (conmon) with signal SIGKILL.
Oct 14 09:17:14 np0005486759.ooo.test systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Oct 14 09:17:14 np0005486759.ooo.test systemd[1]: libpod-conmon-87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef.scope: Deactivated successfully.
Oct 14 09:17:14 np0005486759.ooo.test podman[134744]: error opening file `/run/crun/87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef/status`: No such file or directory
Oct 14 09:17:14 np0005486759.ooo.test podman[134731]: 2025-10-14 09:17:14.439119159 +0000 UTC m=+0.069699226 container cleanup 87879d57734056982cda987ed75262ab23bc4a0ebcf634cbcfbe8091be0f6fef (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, version=17.1.9, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, build-date=2025-07-21T14:56:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Oct 14 09:17:14 np0005486759.ooo.test podman[134731]: nova_virtlogd_wrapper
Oct 14 09:17:14 np0005486759.ooo.test systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Oct 14 09:17:14 np0005486759.ooo.test systemd[1]: Stopped nova_virtlogd_wrapper container.
Oct 14 09:17:14 np0005486759.ooo.test sudo[134533]: pam_unix(sudo:session): session closed for user root
Oct 14 09:17:14 np0005486759.ooo.test sudo[134835]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyqxnubutqppnvvasmrjljieimrepmdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433434.5886168-51-160552282885690/AnsiballZ_systemd_service.py
Oct 14 09:17:14 np0005486759.ooo.test sudo[134835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:17:15 np0005486759.ooo.test python3.9[134837]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:17:15 np0005486759.ooo.test systemd-rc-local-generator[134864]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:17:15 np0005486759.ooo.test systemd-sysv-generator[134868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: Stopping nova_virtnodedevd container...
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: libpod-609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42.scope: Deactivated successfully.
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: libpod-609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42.scope: Consumed 2.133s CPU time.
Oct 14 09:17:15 np0005486759.ooo.test podman[134878]: 2025-10-14 09:17:15.63158296 +0000 UTC m=+0.078065466 container died 609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, tcib_managed=true, release=2, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, name=rhosp17/openstack-nova-libvirt)
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42-userdata-shm.mount: Deactivated successfully.
Oct 14 09:17:15 np0005486759.ooo.test podman[134878]: 2025-10-14 09:17:15.669032153 +0000 UTC m=+0.115514659 container cleanup 609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.33.12, version=17.1.9, batch=17.1_20250721.1, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=nova_virtnodedevd, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, release=2, tcib_managed=true)
Oct 14 09:17:15 np0005486759.ooo.test podman[134878]: nova_virtnodedevd
Oct 14 09:17:15 np0005486759.ooo.test podman[134893]: 2025-10-14 09:17:15.712685979 +0000 UTC m=+0.068275972 container cleanup 609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, release=2, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, distribution-scope=public, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, container_name=nova_virtnodedevd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: libpod-conmon-609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42.scope: Deactivated successfully.
Oct 14 09:17:15 np0005486759.ooo.test podman[134920]: error opening file `/run/crun/609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42/status`: No such file or directory
Oct 14 09:17:15 np0005486759.ooo.test podman[134908]: 2025-10-14 09:17:15.82122176 +0000 UTC m=+0.073788363 container cleanup 609ca321cc5ea2eaa491b9ba06c1c68e8298a1828b5dbe425d5fb7a4acd78e42 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, release=2, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, container_name=nova_virtnodedevd, io.buildah.version=1.33.12, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, build-date=2025-07-21T14:56:59, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.9)
Oct 14 09:17:15 np0005486759.ooo.test podman[134908]: nova_virtnodedevd
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Oct 14 09:17:15 np0005486759.ooo.test systemd[1]: Stopped nova_virtnodedevd container.
Oct 14 09:17:15 np0005486759.ooo.test sudo[134835]: pam_unix(sudo:session): session closed for user root
Oct 14 09:17:16 np0005486759.ooo.test sudo[135012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtqpecwizmhevwyllprndmiugsexlvrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433435.9909198-51-56564704775977/AnsiballZ_systemd_service.py
Oct 14 09:17:16 np0005486759.ooo.test sudo[135012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:17:16 np0005486759.ooo.test python3.9[135014]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:17:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25-merged.mount: Deactivated successfully.
Oct 14 09:17:16 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:17:16 np0005486759.ooo.test systemd-rc-local-generator[135036]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:17:16 np0005486759.ooo.test systemd-sysv-generator[135040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:17:16 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:17:16 np0005486759.ooo.test systemd[1]: Stopping nova_virtproxyd container...
Oct 14 09:17:17 np0005486759.ooo.test systemd[1]: libpod-3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3.scope: Deactivated successfully.
Oct 14 09:17:17 np0005486759.ooo.test podman[135054]: 2025-10-14 09:17:17.028133169 +0000 UTC m=+0.076250459 container died 3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vendor=Red Hat, Inc., release=2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, container_name=nova_virtproxyd, tcib_managed=true, build-date=2025-07-21T14:56:59, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, batch=17.1_20250721.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.33.12, com.redhat.component=openstack-nova-libvirt-container)
Oct 14 09:17:17 np0005486759.ooo.test systemd[1]: tmp-crun.k77jok.mount: Deactivated successfully.
Oct 14 09:17:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48164 DF PROTO=TCP SPT=34400 DPT=9105 SEQ=848380632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7937810000000001030307) 
Oct 14 09:17:17 np0005486759.ooo.test podman[135054]: 2025-10-14 09:17:17.07226097 +0000 UTC m=+0.120378150 container cleanup 3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, version=17.1.9, release=2, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.33.12, container_name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container)
Oct 14 09:17:17 np0005486759.ooo.test podman[135054]: nova_virtproxyd
Oct 14 09:17:17 np0005486759.ooo.test podman[135067]: 2025-10-14 09:17:17.100709504 +0000 UTC m=+0.067986433 container cleanup 3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, vcs-type=git, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, config_id=tripleo_step3, build-date=2025-07-21T14:56:59, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, container_name=nova_virtproxyd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=2, batch=17.1_20250721.1)
Oct 14 09:17:17 np0005486759.ooo.test systemd[1]: libpod-conmon-3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3.scope: Deactivated successfully.
Oct 14 09:17:17 np0005486759.ooo.test podman[135094]: error opening file `/run/crun/3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3/status`: No such file or directory
Oct 14 09:17:17 np0005486759.ooo.test podman[135082]: 2025-10-14 09:17:17.181072721 +0000 UTC m=+0.054188385 container cleanup 3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, batch=17.1_20250721.1, tcib_managed=true, architecture=x86_64, version=17.1.9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, release=2, build-date=2025-07-21T14:56:59, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtproxyd, managed_by=tripleo_ansible, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, name=rhosp17/openstack-nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0)
Oct 14 09:17:17 np0005486759.ooo.test podman[135082]: nova_virtproxyd
Oct 14 09:17:17 np0005486759.ooo.test systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Oct 14 09:17:17 np0005486759.ooo.test systemd[1]: Stopped nova_virtproxyd container.
Oct 14 09:17:17 np0005486759.ooo.test sudo[135012]: pam_unix(sudo:session): session closed for user root
Oct 14 09:17:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4-merged.mount: Deactivated successfully.
Oct 14 09:17:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ef418e48aca7a283e92939f6ba65a015f30ce01d827b993cee6d2ffb1ebcff3-userdata-shm.mount: Deactivated successfully.
Oct 14 09:17:17 np0005486759.ooo.test sudo[135185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suemfthszterwywuqtklhwyccgbynauk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433437.3613174-51-175661170894900/AnsiballZ_systemd_service.py
Oct 14 09:17:17 np0005486759.ooo.test sudo[135185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:17:17 np0005486759.ooo.test python3.9[135187]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:17:18 np0005486759.ooo.test systemd-rc-local-generator[135212]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:17:18 np0005486759.ooo.test systemd-sysv-generator[135217]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: Stopping nova_virtqemud container...
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: tmp-crun.d1ejRk.mount: Deactivated successfully.
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: libpod-2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49.scope: Deactivated successfully.
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: libpod-2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49.scope: Consumed 5.529s CPU time.
Oct 14 09:17:18 np0005486759.ooo.test podman[135228]: 2025-10-14 09:17:18.417596339 +0000 UTC m=+0.065095693 container died 2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, container_name=nova_virtqemud, batch=17.1_20250721.1, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, version=17.1.9, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, build-date=2025-07-21T14:56:59, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., release=2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64)
Oct 14 09:17:18 np0005486759.ooo.test podman[135228]: 2025-10-14 09:17:18.440735337 +0000 UTC m=+0.088234691 container cleanup 2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20250721.1, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_virtqemud, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.9, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, release=2, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc.)
Oct 14 09:17:18 np0005486759.ooo.test podman[135228]: nova_virtqemud
Oct 14 09:17:18 np0005486759.ooo.test podman[135242]: 2025-10-14 09:17:18.492709582 +0000 UTC m=+0.061135990 container cleanup 2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, version=17.1.9, tcib_managed=true, architecture=x86_64, io.buildah.version=1.33.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, container_name=nova_virtqemud, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, config_id=tripleo_step3, vcs-type=git, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, release=2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2)
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6-merged.mount: Deactivated successfully.
Oct 14 09:17:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49-userdata-shm.mount: Deactivated successfully.
Oct 14 09:17:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49194 DF PROTO=TCP SPT=43070 DPT=9882 SEQ=3273838588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7940810000000001030307) 
Oct 14 09:17:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52815 DF PROTO=TCP SPT=45112 DPT=9102 SEQ=522925166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F794D010000000001030307) 
Oct 14 09:17:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:17:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:17:25 np0005486759.ooo.test podman[135257]: 2025-10-14 09:17:25.943207001 +0000 UTC m=+0.070449839 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, release=1, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9)
Oct 14 09:17:25 np0005486759.ooo.test podman[135257]: 2025-10-14 09:17:25.957097632 +0000 UTC m=+0.084340450 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., release=1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/agreements, tcib_managed=true, version=17.1.9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:17:25 np0005486759.ooo.test podman[135257]: unhealthy
Oct 14 09:17:25 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:17:25 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:17:26 np0005486759.ooo.test systemd[1]: tmp-crun.XTObwF.mount: Deactivated successfully.
Oct 14 09:17:26 np0005486759.ooo.test podman[135258]: 2025-10-14 09:17:26.00784375 +0000 UTC m=+0.132525109 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, version=17.1.9, release=1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Oct 14 09:17:26 np0005486759.ooo.test podman[135258]: 2025-10-14 09:17:26.024273089 +0000 UTC m=+0.148954408 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, distribution-scope=public, release=1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-07-21T13:28:44, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container)
Oct 14 09:17:26 np0005486759.ooo.test podman[135258]: unhealthy
Oct 14 09:17:26 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:17:26 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:17:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52816 DF PROTO=TCP SPT=45112 DPT=9102 SEQ=522925166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F795CC10000000001030307) 
Oct 14 09:17:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45436 DF PROTO=TCP SPT=36418 DPT=9100 SEQ=4056704815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F797AB80000000001030307) 
Oct 14 09:17:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45437 DF PROTO=TCP SPT=36418 DPT=9100 SEQ=4056704815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F797EC20000000001030307) 
Oct 14 09:17:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45438 DF PROTO=TCP SPT=36418 DPT=9100 SEQ=4056704815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7986C10000000001030307) 
Oct 14 09:17:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45439 DF PROTO=TCP SPT=36418 DPT=9100 SEQ=4056704815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7996810000000001030307) 
Oct 14 09:17:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61252 DF PROTO=TCP SPT=47658 DPT=9882 SEQ=1709102679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F799A080000000001030307) 
Oct 14 09:17:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61253 DF PROTO=TCP SPT=47658 DPT=9882 SEQ=1709102679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F799E020000000001030307) 
Oct 14 09:17:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28598 DF PROTO=TCP SPT=49574 DPT=9105 SEQ=4244660272 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F79ACC10000000001030307) 
Oct 14 09:17:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61255 DF PROTO=TCP SPT=47658 DPT=9882 SEQ=1709102679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F79B5C10000000001030307) 
Oct 14 09:17:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56128 DF PROTO=TCP SPT=54456 DPT=9102 SEQ=533501826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F79C2410000000001030307) 
Oct 14 09:17:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:17:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:17:56 np0005486759.ooo.test podman[135295]: 2025-10-14 09:17:56.188502364 +0000 UTC m=+0.069510141 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, release=1, version=17.1.9, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.33.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:17:56 np0005486759.ooo.test podman[135296]: 2025-10-14 09:17:56.251011165 +0000 UTC m=+0.127021276 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.33.12, batch=17.1_20250721.1, vendor=Red Hat, Inc., version=17.1.9, build-date=2025-07-21T13:28:44, vcs-type=git, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245)
Oct 14 09:17:56 np0005486759.ooo.test podman[135296]: 2025-10-14 09:17:56.266313741 +0000 UTC m=+0.142323852 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.9, io.buildah.version=1.33.12, container_name=ovn_controller, release=1, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, batch=17.1_20250721.1, vcs-type=git)
Oct 14 09:17:56 np0005486759.ooo.test podman[135296]: unhealthy
Oct 14 09:17:56 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:17:56 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:17:56 np0005486759.ooo.test podman[135295]: 2025-10-14 09:17:56.278280782 +0000 UTC m=+0.159288629 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, container_name=ovn_metadata_agent, io.openshift.expose-services=, distribution-scope=public, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20250721.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:17:56 np0005486759.ooo.test podman[135295]: unhealthy
Oct 14 09:17:56 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:17:56 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:17:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56129 DF PROTO=TCP SPT=54456 DPT=9102 SEQ=533501826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F79D2020000000001030307) 
Oct 14 09:18:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57859 DF PROTO=TCP SPT=56250 DPT=9100 SEQ=2636615426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F79EFE80000000001030307) 
Oct 14 09:18:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57860 DF PROTO=TCP SPT=56250 DPT=9100 SEQ=2636615426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F79F4010000000001030307) 
Oct 14 09:18:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57861 DF PROTO=TCP SPT=56250 DPT=9100 SEQ=2636615426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F79FC020000000001030307) 
Oct 14 09:18:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57862 DF PROTO=TCP SPT=56250 DPT=9100 SEQ=2636615426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A0BC10000000001030307) 
Oct 14 09:18:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47534 DF PROTO=TCP SPT=51668 DPT=9882 SEQ=3218741965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A0F380000000001030307) 
Oct 14 09:18:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47535 DF PROTO=TCP SPT=51668 DPT=9882 SEQ=3218741965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A13410000000001030307) 
Oct 14 09:18:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19505 DF PROTO=TCP SPT=37154 DPT=9105 SEQ=1907151945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A22010000000001030307) 
Oct 14 09:18:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47537 DF PROTO=TCP SPT=51668 DPT=9882 SEQ=3218741965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A2B010000000001030307) 
Oct 14 09:18:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6185 DF PROTO=TCP SPT=59594 DPT=9102 SEQ=1585923503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A37820000000001030307) 
Oct 14 09:18:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:18:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:18:26 np0005486759.ooo.test systemd[1]: tmp-crun.MJlZPB.mount: Deactivated successfully.
Oct 14 09:18:26 np0005486759.ooo.test podman[135335]: 2025-10-14 09:18:26.472944153 +0000 UTC m=+0.090182682 container health_status c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Oct 14 09:18:26 np0005486759.ooo.test podman[135335]: 2025-10-14 09:18:26.515493195 +0000 UTC m=+0.132731784 container exec_died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, name=rhosp17/openstack-ovn-controller, distribution-scope=public, tcib_managed=true, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, release=1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, container_name=ovn_controller, version=17.1.9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, build-date=2025-07-21T13:28:44, managed_by=tripleo_ansible)
Oct 14 09:18:26 np0005486759.ooo.test podman[135335]: unhealthy
Oct 14 09:18:26 np0005486759.ooo.test podman[135334]: 2025-10-14 09:18:26.52434537 +0000 UTC m=+0.139503945 container health_status 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, version=17.1.9, container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, build-date=2025-07-21T16:28:53, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3)
Oct 14 09:18:26 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:18:26 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed with result 'exit-code'.
Oct 14 09:18:26 np0005486759.ooo.test podman[135334]: 2025-10-14 09:18:26.541247235 +0000 UTC m=+0.156405790 container exec_died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, vcs-type=git, version=17.1.9, 
container_name=ovn_metadata_agent, release=1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.license_terms=https://www.redhat.com/agreements, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:18:26 np0005486759.ooo.test podman[135334]: unhealthy
Oct 14 09:18:26 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:18:26 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed with result 'exit-code'.
Oct 14 09:18:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6186 DF PROTO=TCP SPT=59594 DPT=9102 SEQ=1585923503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A47410000000001030307) 
Oct 14 09:18:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30996 DF PROTO=TCP SPT=41812 DPT=9100 SEQ=2504107894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A65170000000001030307) 
Oct 14 09:18:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30997 DF PROTO=TCP SPT=41812 DPT=9100 SEQ=2504107894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A69010000000001030307) 
Oct 14 09:18:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30998 DF PROTO=TCP SPT=41812 DPT=9100 SEQ=2504107894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A71010000000001030307) 
Oct 14 09:18:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30999 DF PROTO=TCP SPT=41812 DPT=9100 SEQ=2504107894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A80C10000000001030307) 
Oct 14 09:18:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9627 DF PROTO=TCP SPT=47512 DPT=9882 SEQ=2883396386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A84680000000001030307) 
Oct 14 09:18:42 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Oct 14 09:18:42 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud.service: Killing process 47947 (conmon) with signal SIGKILL.
Oct 14 09:18:42 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Oct 14 09:18:42 np0005486759.ooo.test systemd[1]: libpod-conmon-2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49.scope: Deactivated successfully.
Oct 14 09:18:42 np0005486759.ooo.test podman[135382]: error opening file `/run/crun/2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49/status`: No such file or directory
Oct 14 09:18:42 np0005486759.ooo.test podman[135371]: 2025-10-14 09:18:42.702851178 +0000 UTC m=+0.076454036 container cleanup 2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.9, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=2, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, io.buildah.version=1.33.12, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, architecture=x86_64, container_name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']})
Oct 14 09:18:42 np0005486759.ooo.test podman[135371]: nova_virtqemud
Oct 14 09:18:42 np0005486759.ooo.test systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Oct 14 09:18:42 np0005486759.ooo.test systemd[1]: Stopped nova_virtqemud container.
Oct 14 09:18:42 np0005486759.ooo.test sudo[135185]: pam_unix(sudo:session): session closed for user root
Oct 14 09:18:43 np0005486759.ooo.test sudo[135473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfcokqtvayvffcnypclnsxbrpoquezgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433522.9117537-51-161286849316254/AnsiballZ_systemd_service.py
Oct 14 09:18:43 np0005486759.ooo.test sudo[135473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:18:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9628 DF PROTO=TCP SPT=47512 DPT=9882 SEQ=2883396386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A88810000000001030307) 
Oct 14 09:18:43 np0005486759.ooo.test python3.9[135475]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:18:43 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:18:43 np0005486759.ooo.test systemd-rc-local-generator[135500]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:18:43 np0005486759.ooo.test systemd-sysv-generator[135507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:18:43 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:18:43 np0005486759.ooo.test sudo[135473]: pam_unix(sudo:session): session closed for user root
Oct 14 09:18:45 np0005486759.ooo.test sudo[135603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouihumnmacjezgmgsouxhqwfpbhlthad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433524.9453409-51-266577532547905/AnsiballZ_systemd_service.py
Oct 14 09:18:45 np0005486759.ooo.test sudo[135603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:18:45 np0005486759.ooo.test python3.9[135605]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:18:45 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:18:45 np0005486759.ooo.test systemd-rc-local-generator[135628]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:18:45 np0005486759.ooo.test systemd-sysv-generator[135634]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:18:45 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:18:45 np0005486759.ooo.test systemd[1]: Stopping nova_virtsecretd container...
Oct 14 09:18:45 np0005486759.ooo.test systemd[1]: libpod-fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461.scope: Deactivated successfully.
Oct 14 09:18:45 np0005486759.ooo.test podman[135646]: 2025-10-14 09:18:45.990524781 +0000 UTC m=+0.079503721 container died fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, batch=17.1_20250721.1, io.openshift.expose-services=, architecture=x86_64, container_name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step3, vcs-type=git, release=2, version=17.1.9, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, io.buildah.version=1.33.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public)
Oct 14 09:18:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461-userdata-shm.mount: Deactivated successfully.
Oct 14 09:18:46 np0005486759.ooo.test podman[135646]: 2025-10-14 09:18:46.032033629 +0000 UTC m=+0.121012559 container cleanup fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, version=17.1.9, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, io.buildah.version=1.33.12, container_name=nova_virtsecretd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, batch=17.1_20250721.1, build-date=2025-07-21T14:56:59, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-libvirt, release=2, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 09:18:46 np0005486759.ooo.test podman[135646]: nova_virtsecretd
Oct 14 09:18:46 np0005486759.ooo.test podman[135661]: 2025-10-14 09:18:46.074069425 +0000 UTC m=+0.070852861 container cleanup fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-nova-libvirt, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_virtsecretd, tcib_managed=true, release=2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, batch=17.1_20250721.1, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 09:18:46 np0005486759.ooo.test systemd[1]: libpod-conmon-fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461.scope: Deactivated successfully.
Oct 14 09:18:46 np0005486759.ooo.test podman[135689]: error opening file `/run/crun/fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461/status`: No such file or directory
Oct 14 09:18:46 np0005486759.ooo.test podman[135678]: 2025-10-14 09:18:46.170812261 +0000 UTC m=+0.069503880 container cleanup fe9f45d1312a5670519c39fec3a3f0b5244cd36990cb4c212c1e0a25d4267461 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, batch=17.1_20250721.1, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.9, config_id=tripleo_step3, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-07-21T14:56:59, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, container_name=nova_virtsecretd, release=2, io.openshift.tags=rhosp osp openstack osp-17.1)
Oct 14 09:18:46 np0005486759.ooo.test podman[135678]: nova_virtsecretd
Oct 14 09:18:46 np0005486759.ooo.test systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Oct 14 09:18:46 np0005486759.ooo.test systemd[1]: Stopped nova_virtsecretd container.
Oct 14 09:18:46 np0005486759.ooo.test sudo[135603]: pam_unix(sudo:session): session closed for user root
Oct 14 09:18:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5-merged.mount: Deactivated successfully.
Oct 14 09:18:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59406 DF PROTO=TCP SPT=51064 DPT=9105 SEQ=1935773937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7A97410000000001030307) 
Oct 14 09:18:47 np0005486759.ooo.test sudo[135780]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgidomayalynowpoechcbklegsykgfve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433527.3354623-51-123582175847016/AnsiballZ_systemd_service.py
Oct 14 09:18:47 np0005486759.ooo.test sudo[135780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:18:47 np0005486759.ooo.test python3.9[135782]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:18:47 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:18:48 np0005486759.ooo.test systemd-rc-local-generator[135807]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:18:48 np0005486759.ooo.test systemd-sysv-generator[135814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:18:48 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:18:48 np0005486759.ooo.test systemd[1]: Stopping nova_virtstoraged container...
Oct 14 09:18:48 np0005486759.ooo.test systemd[1]: libpod-17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92.scope: Deactivated successfully.
Oct 14 09:18:48 np0005486759.ooo.test podman[135823]: 2025-10-14 09:18:48.397583682 +0000 UTC m=+0.076367972 container died 17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20250721.1, maintainer=OpenStack TripleO Team, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtstoraged, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=2, tcib_managed=true, version=17.1.9, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible)
Oct 14 09:18:48 np0005486759.ooo.test systemd[1]: tmp-crun.GBmSVG.mount: Deactivated successfully.
Oct 14 09:18:48 np0005486759.ooo.test podman[135823]: 2025-10-14 09:18:48.437181586 +0000 UTC m=+0.115965846 container cleanup 17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T14:56:59, version=17.1.9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20250721.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.33.12, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_virtstoraged, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, release=2)
Oct 14 09:18:48 np0005486759.ooo.test podman[135823]: nova_virtstoraged
Oct 14 09:18:48 np0005486759.ooo.test podman[135836]: 2025-10-14 09:18:48.486488337 +0000 UTC m=+0.080628202 container cleanup 17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., batch=17.1_20250721.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, release=2, container_name=nova_virtstoraged, managed_by=tripleo_ansible, tcib_managed=true, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-07-21T14:56:59, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, vcs-type=git)
Oct 14 09:18:48 np0005486759.ooo.test systemd[1]: libpod-conmon-17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92.scope: Deactivated successfully.
Oct 14 09:18:48 np0005486759.ooo.test podman[135865]: error opening file `/run/crun/17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92/status`: No such file or directory
Oct 14 09:18:48 np0005486759.ooo.test podman[135853]: 2025-10-14 09:18:48.576165316 +0000 UTC m=+0.060896698 container cleanup 17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b29b30662a12a8864f5ea0f40846b2cc'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.33.12, architecture=x86_64, config_id=tripleo_step3, build-date=2025-07-21T14:56:59, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20250721.1, container_name=nova_virtstoraged, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 09:18:48 np0005486759.ooo.test podman[135853]: nova_virtstoraged
Oct 14 09:18:48 np0005486759.ooo.test systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Oct 14 09:18:48 np0005486759.ooo.test systemd[1]: Stopped nova_virtstoraged container.
Oct 14 09:18:48 np0005486759.ooo.test sudo[135780]: pam_unix(sudo:session): session closed for user root
Oct 14 09:18:49 np0005486759.ooo.test sudo[135956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmgekjnmuwkvaukueteognvakrovyqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433528.7479832-51-30902026578812/AnsiballZ_systemd_service.py
Oct 14 09:18:49 np0005486759.ooo.test sudo[135956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149-merged.mount: Deactivated successfully.
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92-userdata-shm.mount: Deactivated successfully.
Oct 14 09:18:49 np0005486759.ooo.test python3.9[135958]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:18:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9630 DF PROTO=TCP SPT=47512 DPT=9882 SEQ=2883396386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7AA0410000000001030307) 
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:18:49 np0005486759.ooo.test systemd-sysv-generator[135986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:18:49 np0005486759.ooo.test systemd-rc-local-generator[135981]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: Stopping ovn_controller container...
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: libpod-c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.scope: Deactivated successfully.
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: libpod-c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.scope: Consumed 2.709s CPU time.
Oct 14 09:18:49 np0005486759.ooo.test podman[135999]: 2025-10-14 09:18:49.859150963 +0000 UTC m=+0.075896578 container died c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, architecture=x86_64, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.9, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1)
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.timer: Deactivated successfully.
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed to open /run/systemd/transient/c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: No such file or directory
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6-userdata-shm.mount: Deactivated successfully.
Oct 14 09:18:49 np0005486759.ooo.test podman[135999]: 2025-10-14 09:18:49.896700904 +0000 UTC m=+0.113446499 container cleanup c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, build-date=2025-07-21T13:28:44, tcib_managed=true, managed_by=tripleo_ansible, release=1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, config_id=tripleo_step4)
Oct 14 09:18:49 np0005486759.ooo.test podman[135999]: ovn_controller
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.timer: Failed to open /run/systemd/transient/c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.timer: No such file or directory
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed to open /run/systemd/transient/c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: No such file or directory
Oct 14 09:18:49 np0005486759.ooo.test podman[136012]: 2025-10-14 09:18:49.939518186 +0000 UTC m=+0.066534881 container cleanup c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.9, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, release=1, io.openshift.tags=rhosp osp openstack osp-17.1, tcib_managed=true, batch=17.1_20250721.1, build-date=2025-07-21T13:28:44, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, name=rhosp17/openstack-ovn-controller, vcs-type=git)
Oct 14 09:18:49 np0005486759.ooo.test systemd[1]: libpod-conmon-c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.scope: Deactivated successfully.
Oct 14 09:18:50 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.timer: Failed to open /run/systemd/transient/c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.timer: No such file or directory
Oct 14 09:18:50 np0005486759.ooo.test systemd[1]: c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: Failed to open /run/systemd/transient/c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6.service: No such file or directory
Oct 14 09:18:50 np0005486759.ooo.test podman[136024]: 2025-10-14 09:18:50.005556681 +0000 UTC m=+0.040296787 container cleanup c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20250721.1, release=1, build-date=2025-07-21T13:28:44, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.9, architecture=x86_64, tcib_managed=true, io.buildah.version=1.33.12, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Oct 14 09:18:50 np0005486759.ooo.test podman[136024]: ovn_controller
Oct 14 09:18:50 np0005486759.ooo.test systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Oct 14 09:18:50 np0005486759.ooo.test systemd[1]: Stopped ovn_controller container.
Oct 14 09:18:50 np0005486759.ooo.test sudo[135956]: pam_unix(sudo:session): session closed for user root
Oct 14 09:18:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a90b1721303d95e8939b4592e60f18afa0e38ea17042ea3b2a8546358a562b24-merged.mount: Deactivated successfully.
Oct 14 09:18:50 np0005486759.ooo.test sudo[136125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxefitvnbacnloyrbujxfvylpqabdygj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433530.1786687-51-199351307311774/AnsiballZ_systemd_service.py
Oct 14 09:18:50 np0005486759.ooo.test sudo[136125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:18:50 np0005486759.ooo.test python3.9[136127]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:18:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:18:50 np0005486759.ooo.test systemd-rc-local-generator[136154]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:18:50 np0005486759.ooo.test systemd-sysv-generator[136157]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: Stopping ovn_metadata_agent container...
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: libpod-46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.scope: Deactivated successfully.
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: libpod-46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.scope: Consumed 10.989s CPU time.
Oct 14 09:18:51 np0005486759.ooo.test podman[136168]: 2025-10-14 09:18:51.304319721 +0000 UTC m=+0.149885525 container died 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.33.12, io.openshift.tags=rhosp osp openstack osp-17.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, distribution-scope=public, batch=17.1_20250721.1, build-date=2025-07-21T16:28:53, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=)
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.timer: Deactivated successfully.
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed to open /run/systemd/transient/46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: No such file or directory
Oct 14 09:18:51 np0005486759.ooo.test podman[136168]: 2025-10-14 09:18:51.378535197 +0000 UTC m=+0.224100971 container cleanup 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, build-date=2025-07-21T16:28:53, vendor=Red Hat, Inc., vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Oct 14 09:18:51 np0005486759.ooo.test podman[136168]: ovn_metadata_agent
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b2f3843305e98ca43fa4870afdb54b5c368e3c8047a8e821c108bd75c91b0d14-merged.mount: Deactivated successfully.
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307-userdata-shm.mount: Deactivated successfully.
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.timer: Failed to open /run/systemd/transient/46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.timer: No such file or directory
Oct 14 09:18:51 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed to open /run/systemd/transient/46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: No such file or directory
Oct 14 09:18:51 np0005486759.ooo.test podman[136183]: 2025-10-14 09:18:51.504396125 +0000 UTC m=+0.185113246 container cleanup 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, batch=17.1_20250721.1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.openshift.expose-services=, tcib_managed=true)
Oct 14 09:18:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45565 DF PROTO=TCP SPT=34884 DPT=9102 SEQ=1314366297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7AAC810000000001030307) 
Oct 14 09:18:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45566 DF PROTO=TCP SPT=34884 DPT=9102 SEQ=1314366297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7ABC410000000001030307) 
Oct 14 09:19:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8174 DF PROTO=TCP SPT=38102 DPT=9100 SEQ=1964680495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7ADA480000000001030307) 
Oct 14 09:19:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8175 DF PROTO=TCP SPT=38102 DPT=9100 SEQ=1964680495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7ADE420000000001030307) 
Oct 14 09:19:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8176 DF PROTO=TCP SPT=38102 DPT=9100 SEQ=1964680495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7AE6420000000001030307) 
Oct 14 09:19:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8177 DF PROTO=TCP SPT=38102 DPT=9100 SEQ=1964680495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7AF6010000000001030307) 
Oct 14 09:19:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50018 DF PROTO=TCP SPT=49650 DPT=9882 SEQ=1333606372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7AF9980000000001030307) 
Oct 14 09:19:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50019 DF PROTO=TCP SPT=49650 DPT=9882 SEQ=1333606372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7AFD810000000001030307) 
Oct 14 09:19:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42485 DF PROTO=TCP SPT=40210 DPT=9105 SEQ=972709724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B0C410000000001030307) 
Oct 14 09:19:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50021 DF PROTO=TCP SPT=49650 DPT=9882 SEQ=1333606372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B15410000000001030307) 
Oct 14 09:19:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5140 DF PROTO=TCP SPT=36248 DPT=9102 SEQ=180441134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B21C20000000001030307) 
Oct 14 09:19:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5141 DF PROTO=TCP SPT=36248 DPT=9102 SEQ=180441134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B31810000000001030307) 
Oct 14 09:19:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9499 DF PROTO=TCP SPT=52262 DPT=9100 SEQ=3988154638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B4F780000000001030307) 
Oct 14 09:19:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9500 DF PROTO=TCP SPT=52262 DPT=9100 SEQ=3988154638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B53810000000001030307) 
Oct 14 09:19:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9501 DF PROTO=TCP SPT=52262 DPT=9100 SEQ=3988154638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B5B810000000001030307) 
Oct 14 09:19:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9502 DF PROTO=TCP SPT=52262 DPT=9100 SEQ=3988154638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B6B410000000001030307) 
Oct 14 09:19:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62491 DF PROTO=TCP SPT=40340 DPT=9882 SEQ=93167031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B6EC80000000001030307) 
Oct 14 09:19:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62492 DF PROTO=TCP SPT=40340 DPT=9882 SEQ=93167031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B72C20000000001030307) 
Oct 14 09:19:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20126 DF PROTO=TCP SPT=51226 DPT=9105 SEQ=2056084785 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B81810000000001030307) 
Oct 14 09:19:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62494 DF PROTO=TCP SPT=40340 DPT=9882 SEQ=93167031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B8A810000000001030307) 
Oct 14 09:19:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20486 DF PROTO=TCP SPT=40388 DPT=9102 SEQ=442373296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7B97010000000001030307) 
Oct 14 09:19:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20487 DF PROTO=TCP SPT=40388 DPT=9102 SEQ=442373296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BA6C10000000001030307) 
Oct 14 09:20:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7476 DF PROTO=TCP SPT=58808 DPT=9100 SEQ=2902407905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BC4A70000000001030307) 
Oct 14 09:20:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7477 DF PROTO=TCP SPT=58808 DPT=9100 SEQ=2902407905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BC8C10000000001030307) 
Oct 14 09:20:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7478 DF PROTO=TCP SPT=58808 DPT=9100 SEQ=2902407905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BD0C10000000001030307) 
Oct 14 09:20:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7479 DF PROTO=TCP SPT=58808 DPT=9100 SEQ=2902407905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BE0820000000001030307) 
Oct 14 09:20:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38313 DF PROTO=TCP SPT=45842 DPT=9882 SEQ=3178498508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BE3F80000000001030307) 
Oct 14 09:20:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38314 DF PROTO=TCP SPT=45842 DPT=9882 SEQ=3178498508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BE8010000000001030307) 
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 54396 (conmon) with signal SIGKILL.
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: libpod-conmon-46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.scope: Deactivated successfully.
Oct 14 09:20:15 np0005486759.ooo.test podman[136213]: error opening file `/run/crun/46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307/status`: No such file or directory
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: tmp-crun.8Cu5c1.mount: Deactivated successfully.
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.timer: Failed to open /run/systemd/transient/46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.timer: No such file or directory
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: Failed to open /run/systemd/transient/46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307.service: No such file or directory
Oct 14 09:20:15 np0005486759.ooo.test podman[136200]: 2025-10-14 09:20:15.710716938 +0000 UTC m=+0.087989749 container cleanup 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, io.buildah.version=1.33.12, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, architecture=x86_64)
Oct 14 09:20:15 np0005486759.ooo.test podman[136200]: ovn_metadata_agent
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Oct 14 09:20:15 np0005486759.ooo.test systemd[1]: Stopped ovn_metadata_agent container.
Oct 14 09:20:15 np0005486759.ooo.test sudo[136125]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:16 np0005486759.ooo.test sudo[136304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhubuyvvejelnhusaccrcbmsjamkxtwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433615.862321-51-72066287213811/AnsiballZ_systemd_service.py
Oct 14 09:20:16 np0005486759.ooo.test sudo[136304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:16 np0005486759.ooo.test python3.9[136306]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:20:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55253 DF PROTO=TCP SPT=46208 DPT=9105 SEQ=2986066476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BF6C10000000001030307) 
Oct 14 09:20:17 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:20:17 np0005486759.ooo.test systemd-rc-local-generator[136334]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:20:17 np0005486759.ooo.test systemd-sysv-generator[136337]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:20:17 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:20:17 np0005486759.ooo.test sudo[136304]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:18 np0005486759.ooo.test sudo[136435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijzvtnptwokitxfwgiskmkquqpesufme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433618.3131769-201-222740119841695/AnsiballZ_file.py
Oct 14 09:20:18 np0005486759.ooo.test sudo[136435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:18 np0005486759.ooo.test python3.9[136437]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:18 np0005486759.ooo.test sudo[136435]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:19 np0005486759.ooo.test sudo[136527]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgnlqzixtpjvrloubngkizybuifmznwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433619.0928366-201-265395157747958/AnsiballZ_file.py
Oct 14 09:20:19 np0005486759.ooo.test sudo[136527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38316 DF PROTO=TCP SPT=45842 DPT=9882 SEQ=3178498508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7BFFC10000000001030307) 
Oct 14 09:20:19 np0005486759.ooo.test python3.9[136529]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:19 np0005486759.ooo.test sudo[136527]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:20 np0005486759.ooo.test sudo[136619]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrfzfjbzywdsrkvbypmaieiolgackgwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433619.765699-201-12323449193149/AnsiballZ_file.py
Oct 14 09:20:20 np0005486759.ooo.test sudo[136619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:20 np0005486759.ooo.test python3.9[136621]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:20 np0005486759.ooo.test sudo[136619]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:20 np0005486759.ooo.test sudo[136711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuxybpilrueaqiukjbtzglklbklvuapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433620.3700786-201-119304156437158/AnsiballZ_file.py
Oct 14 09:20:20 np0005486759.ooo.test sudo[136711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:20 np0005486759.ooo.test python3.9[136713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:20 np0005486759.ooo.test sudo[136711]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:21 np0005486759.ooo.test sudo[136803]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlgsrkmmfhdrbcabqzwaggtnnyltdyay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433621.0248256-201-89326933567720/AnsiballZ_file.py
Oct 14 09:20:21 np0005486759.ooo.test sudo[136803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:21 np0005486759.ooo.test python3.9[136805]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:21 np0005486759.ooo.test sudo[136803]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:21 np0005486759.ooo.test sudo[136895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxviuyrnnfajtxpruacohsaofqkcuehw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433621.62354-201-24903420448597/AnsiballZ_file.py
Oct 14 09:20:21 np0005486759.ooo.test sudo[136895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:22 np0005486759.ooo.test python3.9[136897]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:22 np0005486759.ooo.test sudo[136895]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:22 np0005486759.ooo.test sudo[136987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkqqyfygzzljvvpjafaucukbgzncmwom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433622.205884-201-176550091840259/AnsiballZ_file.py
Oct 14 09:20:22 np0005486759.ooo.test sudo[136987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51898 DF PROTO=TCP SPT=43930 DPT=9102 SEQ=1741596681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C0C410000000001030307) 
Oct 14 09:20:22 np0005486759.ooo.test python3.9[136989]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:22 np0005486759.ooo.test sudo[136987]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:23 np0005486759.ooo.test sudo[137079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imbfixtgazbnqreejjbjklozsrsmyfxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433622.822847-201-86454559473422/AnsiballZ_file.py
Oct 14 09:20:23 np0005486759.ooo.test sudo[137079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:23 np0005486759.ooo.test python3.9[137081]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:23 np0005486759.ooo.test sudo[137079]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:23 np0005486759.ooo.test sudo[137171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmmgwcwqwwgiaxceobmoldcjrzzwoqwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433623.4353464-201-148556960359133/AnsiballZ_file.py
Oct 14 09:20:23 np0005486759.ooo.test sudo[137171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:23 np0005486759.ooo.test python3.9[137173]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:23 np0005486759.ooo.test sudo[137171]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:24 np0005486759.ooo.test sudo[137263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jigzukpwfzukcswtvbkiramnqpmjrbsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433624.066504-201-235155039017488/AnsiballZ_file.py
Oct 14 09:20:24 np0005486759.ooo.test sudo[137263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:24 np0005486759.ooo.test sshd[137266]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:20:24 np0005486759.ooo.test python3.9[137265]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:24 np0005486759.ooo.test sudo[137263]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:24 np0005486759.ooo.test sshd[137266]: Invalid user  from 129.212.189.30 port 44100
Oct 14 09:20:25 np0005486759.ooo.test sudo[137357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erookdridgnwuijcuigbatwevcsuyqmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433624.758197-201-54778768496280/AnsiballZ_file.py
Oct 14 09:20:25 np0005486759.ooo.test sudo[137357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:25 np0005486759.ooo.test python3.9[137359]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:25 np0005486759.ooo.test sudo[137357]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:25 np0005486759.ooo.test sudo[137449]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyowecacjwmwbiiumrapcpqenqbhkbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433625.3669698-201-143345723923838/AnsiballZ_file.py
Oct 14 09:20:25 np0005486759.ooo.test sudo[137449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:25 np0005486759.ooo.test python3.9[137451]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:25 np0005486759.ooo.test sudo[137449]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:26 np0005486759.ooo.test sudo[137541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hniaivmxwkptywqwkbjzaptthdxyhgnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433625.9470935-201-280252947851187/AnsiballZ_file.py
Oct 14 09:20:26 np0005486759.ooo.test sudo[137541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:26 np0005486759.ooo.test python3.9[137543]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:26 np0005486759.ooo.test sudo[137541]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51899 DF PROTO=TCP SPT=43930 DPT=9102 SEQ=1741596681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C1C010000000001030307) 
Oct 14 09:20:26 np0005486759.ooo.test sudo[137633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdpfxetqmikwpybcnticlmbelubklago ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433626.502394-201-161361624392807/AnsiballZ_file.py
Oct 14 09:20:26 np0005486759.ooo.test sudo[137633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:27 np0005486759.ooo.test python3.9[137635]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:27 np0005486759.ooo.test sudo[137633]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:27 np0005486759.ooo.test sudo[137725]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byymcqmpjjsqnrrcoszosijyadddines ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433627.1216903-201-98778818671849/AnsiballZ_file.py
Oct 14 09:20:27 np0005486759.ooo.test sudo[137725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:27 np0005486759.ooo.test python3.9[137727]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:27 np0005486759.ooo.test sudo[137725]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:27 np0005486759.ooo.test sudo[137817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbxwgjqhaemouxacvwbwpwromlrjwwrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433627.711641-201-204225127675949/AnsiballZ_file.py
Oct 14 09:20:27 np0005486759.ooo.test sudo[137817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:28 np0005486759.ooo.test python3.9[137819]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:28 np0005486759.ooo.test sudo[137817]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:28 np0005486759.ooo.test sudo[137909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkmqlkwlzcjvzxgqaoodppqcqnoyhqhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433628.39656-201-211312239270567/AnsiballZ_file.py
Oct 14 09:20:28 np0005486759.ooo.test sudo[137909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:28 np0005486759.ooo.test python3.9[137911]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:28 np0005486759.ooo.test sudo[137909]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:29 np0005486759.ooo.test sudo[138001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mllafabwuyvbiuxrwyqbnitbhlbsvaug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433628.9607186-201-191757400165475/AnsiballZ_file.py
Oct 14 09:20:29 np0005486759.ooo.test sudo[138001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:29 np0005486759.ooo.test python3.9[138003]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:29 np0005486759.ooo.test sudo[138001]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:29 np0005486759.ooo.test sudo[138093]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jooibdtvvopxibquyfmbtzhaerzjkgtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433629.566001-201-225059896785398/AnsiballZ_file.py
Oct 14 09:20:29 np0005486759.ooo.test sudo[138093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:30 np0005486759.ooo.test python3.9[138095]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:30 np0005486759.ooo.test sudo[138093]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:30 np0005486759.ooo.test sudo[138185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjlaapymcgnncwbalacgefxcmndoohhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433630.2080233-201-164251684356209/AnsiballZ_file.py
Oct 14 09:20:30 np0005486759.ooo.test sudo[138185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:30 np0005486759.ooo.test python3.9[138187]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:30 np0005486759.ooo.test sudo[138185]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:31 np0005486759.ooo.test sudo[138277]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnfknwsaddndqyzxsebjqgzvtgdadrkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433630.8552206-201-38163680256840/AnsiballZ_file.py
Oct 14 09:20:31 np0005486759.ooo.test sudo[138277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:31 np0005486759.ooo.test python3.9[138279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:31 np0005486759.ooo.test sudo[138277]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:31 np0005486759.ooo.test sudo[138369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgnwsmmrjotnyfhsoqkfkffxvbvdyxjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433631.553486-349-53374236738418/AnsiballZ_file.py
Oct 14 09:20:31 np0005486759.ooo.test sudo[138369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:32 np0005486759.ooo.test python3.9[138371]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:32 np0005486759.ooo.test sudo[138369]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:32 np0005486759.ooo.test sudo[138461]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omxotuzwykkhgmbhzccskaporanxncmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433632.173641-349-151420576583495/AnsiballZ_file.py
Oct 14 09:20:32 np0005486759.ooo.test sudo[138461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:32 np0005486759.ooo.test sshd[137266]: Connection closed by invalid user  129.212.189.30 port 44100 [preauth]
Oct 14 09:20:32 np0005486759.ooo.test python3.9[138463]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:32 np0005486759.ooo.test sudo[138461]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:33 np0005486759.ooo.test sudo[138553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlbjinatlqgdotkpckybmpyjrzkhganr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433632.7743676-349-231153266946228/AnsiballZ_file.py
Oct 14 09:20:33 np0005486759.ooo.test sudo[138553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:33 np0005486759.ooo.test python3.9[138555]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:33 np0005486759.ooo.test sudo[138553]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:33 np0005486759.ooo.test sudo[138645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbyjxvveozebvnkmzqbybqjljpbbcrog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433633.3847442-349-171479001830975/AnsiballZ_file.py
Oct 14 09:20:33 np0005486759.ooo.test sudo[138645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:33 np0005486759.ooo.test python3.9[138647]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:33 np0005486759.ooo.test sudo[138645]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:34 np0005486759.ooo.test sudo[138737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iattktuewezrqcadlymbxiirlcfptzlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433633.9703531-349-80695778984448/AnsiballZ_file.py
Oct 14 09:20:34 np0005486759.ooo.test sudo[138737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52990 DF PROTO=TCP SPT=45968 DPT=9100 SEQ=460751295 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C39D70000000001030307) 
Oct 14 09:20:34 np0005486759.ooo.test python3.9[138739]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:34 np0005486759.ooo.test sudo[138737]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:34 np0005486759.ooo.test sudo[138829]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apaagopoqaqqapkzmgqplubggbqsglld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433634.6066186-349-116181770931701/AnsiballZ_file.py
Oct 14 09:20:34 np0005486759.ooo.test sudo[138829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:35 np0005486759.ooo.test python3.9[138831]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:35 np0005486759.ooo.test sudo[138829]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52991 DF PROTO=TCP SPT=45968 DPT=9100 SEQ=460751295 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C3DC10000000001030307) 
Oct 14 09:20:35 np0005486759.ooo.test sudo[138921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lalceiamhoodsxccgesuipvinjvyjvnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433635.1877615-349-214118300799089/AnsiballZ_file.py
Oct 14 09:20:35 np0005486759.ooo.test sudo[138921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:35 np0005486759.ooo.test python3.9[138923]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:35 np0005486759.ooo.test sudo[138921]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:35 np0005486759.ooo.test sudo[139013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olykqwsmtzkhmqxorbdjatkghhetzgvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433635.7470775-349-98639448593032/AnsiballZ_file.py
Oct 14 09:20:35 np0005486759.ooo.test sudo[139013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:36 np0005486759.ooo.test python3.9[139015]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:36 np0005486759.ooo.test sudo[139013]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:36 np0005486759.ooo.test sudo[139105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joodzxmqzdzkshjbebrtixxnbfbfuixi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433636.2874343-349-275558647721984/AnsiballZ_file.py
Oct 14 09:20:36 np0005486759.ooo.test sudo[139105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:36 np0005486759.ooo.test python3.9[139107]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:36 np0005486759.ooo.test sudo[139105]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52992 DF PROTO=TCP SPT=45968 DPT=9100 SEQ=460751295 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C45C10000000001030307) 
Oct 14 09:20:37 np0005486759.ooo.test sudo[139197]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfftugwggpadjvfcwijwpuocargfzxqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433637.6098974-349-225842231879560/AnsiballZ_file.py
Oct 14 09:20:37 np0005486759.ooo.test sudo[139197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:38 np0005486759.ooo.test python3.9[139199]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:38 np0005486759.ooo.test sudo[139197]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:38 np0005486759.ooo.test sudo[139289]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrduvouuquwvvdxrnljnvqmljhfnlcef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433638.1329195-349-112279722472346/AnsiballZ_file.py
Oct 14 09:20:38 np0005486759.ooo.test sudo[139289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:38 np0005486759.ooo.test python3.9[139291]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:38 np0005486759.ooo.test sudo[139289]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:38 np0005486759.ooo.test sudo[139381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xovvcrdzxvrtxijaqbxxvnnvykwudmug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433638.6512113-349-76080654632854/AnsiballZ_file.py
Oct 14 09:20:38 np0005486759.ooo.test sudo[139381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:39 np0005486759.ooo.test python3.9[139383]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:39 np0005486759.ooo.test sudo[139381]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:40 np0005486759.ooo.test sudo[139473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtxyluhgwfxlclcvrcltqgfgqfybymfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433639.9289308-349-52340532193745/AnsiballZ_file.py
Oct 14 09:20:40 np0005486759.ooo.test sudo[139473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:40 np0005486759.ooo.test python3.9[139475]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:40 np0005486759.ooo.test sudo[139473]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:40 np0005486759.ooo.test sudo[139565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsbimpafpkgchiuohwkqufdphiyeaqsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433640.558909-349-15044783511919/AnsiballZ_file.py
Oct 14 09:20:40 np0005486759.ooo.test sudo[139565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:40 np0005486759.ooo.test python3.9[139567]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:40 np0005486759.ooo.test sudo[139565]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:41 np0005486759.ooo.test sudo[139657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzxzcrmjftevdsvjuopixmcxbwbdpyoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433641.0636532-349-157245821887348/AnsiballZ_file.py
Oct 14 09:20:41 np0005486759.ooo.test sudo[139657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52993 DF PROTO=TCP SPT=45968 DPT=9100 SEQ=460751295 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C55810000000001030307) 
Oct 14 09:20:41 np0005486759.ooo.test python3.9[139659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:41 np0005486759.ooo.test sudo[139657]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:41 np0005486759.ooo.test sudo[139749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aypjojajtqgjktrygzwnfimcvhtcyfwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433641.6824682-349-14787028298352/AnsiballZ_file.py
Oct 14 09:20:41 np0005486759.ooo.test sudo[139749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:42 np0005486759.ooo.test python3.9[139751]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:42 np0005486759.ooo.test sudo[139749]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30527 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=2583125511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C59280000000001030307) 
Oct 14 09:20:42 np0005486759.ooo.test sudo[139841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juvggvkfnmpgzzyhqqgkmswvycnzvavk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433642.2311084-349-60005158255327/AnsiballZ_file.py
Oct 14 09:20:42 np0005486759.ooo.test sudo[139841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:42 np0005486759.ooo.test python3.9[139843]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:42 np0005486759.ooo.test sudo[139841]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:43 np0005486759.ooo.test sudo[139933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giwddwstxtjcjfqvvsjvclfyxjllgvqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433642.875096-349-20475690287032/AnsiballZ_file.py
Oct 14 09:20:43 np0005486759.ooo.test sudo[139933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30528 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=2583125511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C5D410000000001030307) 
Oct 14 09:20:43 np0005486759.ooo.test python3.9[139935]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:43 np0005486759.ooo.test sudo[139933]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:43 np0005486759.ooo.test sudo[140025]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmgvvmcrmgsbsnlsvrkolscwbbszgmkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433643.508976-349-152889760553106/AnsiballZ_file.py
Oct 14 09:20:43 np0005486759.ooo.test sudo[140025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:43 np0005486759.ooo.test python3.9[140027]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:43 np0005486759.ooo.test sudo[140025]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:44 np0005486759.ooo.test sudo[140117]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wavktpmrekocmwaqkzlnjqwkrzawntmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433644.0779464-349-219502840940323/AnsiballZ_file.py
Oct 14 09:20:44 np0005486759.ooo.test sudo[140117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:44 np0005486759.ooo.test python3.9[140119]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:44 np0005486759.ooo.test sudo[140117]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:44 np0005486759.ooo.test sudo[140209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcirmrdogwgqzqqjlxbyrbndezouzugr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433644.7096417-349-212223818880303/AnsiballZ_file.py
Oct 14 09:20:44 np0005486759.ooo.test sudo[140209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:45 np0005486759.ooo.test python3.9[140211]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:20:45 np0005486759.ooo.test sudo[140209]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:45 np0005486759.ooo.test sudo[140301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usbcckgdtchmtwbjocjnuqahkhqtmypz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433645.45803-498-112146248623084/AnsiballZ_command.py
Oct 14 09:20:45 np0005486759.ooo.test sudo[140301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:46 np0005486759.ooo.test python3.9[140303]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                           systemctl disable --now certmonger.service
                                                           test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                         fi
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:46 np0005486759.ooo.test sudo[140301]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:46 np0005486759.ooo.test python3.9[140395]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:20:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47628 DF PROTO=TCP SPT=58414 DPT=9105 SEQ=2561061166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C6C010000000001030307) 
Oct 14 09:20:47 np0005486759.ooo.test sudo[140485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nljigaljbwyfsiacsiuicmkopqozwfry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433647.058722-516-174862788971139/AnsiballZ_systemd_service.py
Oct 14 09:20:47 np0005486759.ooo.test sudo[140485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:47 np0005486759.ooo.test python3.9[140487]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:20:47 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:20:47 np0005486759.ooo.test systemd-rc-local-generator[140511]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:20:47 np0005486759.ooo.test systemd-sysv-generator[140514]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:20:47 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:20:47 np0005486759.ooo.test systemd[1]: Starting dnf makecache...
Oct 14 09:20:47 np0005486759.ooo.test sudo[140485]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:48 np0005486759.ooo.test dnf[140523]: Updating Subscription Management repositories.
Oct 14 09:20:48 np0005486759.ooo.test sudo[140613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzbgovqughcdtplbfrzqmttckrrqayjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433648.0691335-524-189850978951866/AnsiballZ_command.py
Oct 14 09:20:48 np0005486759.ooo.test sudo[140613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:48 np0005486759.ooo.test python3.9[140615]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:48 np0005486759.ooo.test sudo[140613]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:49 np0005486759.ooo.test sudo[140706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzvdkezjjvkmgwvccngwawzpxzhxsozf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433648.7413883-524-118348452330536/AnsiballZ_command.py
Oct 14 09:20:49 np0005486759.ooo.test sudo[140706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:49 np0005486759.ooo.test python3.9[140708]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:49 np0005486759.ooo.test sudo[140706]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30530 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=2583125511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C75010000000001030307) 
Oct 14 09:20:49 np0005486759.ooo.test sudo[140799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjokgzqklbuwwfqeocjzcasuvmotmrzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433649.3904426-524-4409856664435/AnsiballZ_command.py
Oct 14 09:20:49 np0005486759.ooo.test sudo[140799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:49 np0005486759.ooo.test dnf[140523]: Metadata cache refreshed recently.
Oct 14 09:20:49 np0005486759.ooo.test python3.9[140801]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:49 np0005486759.ooo.test sudo[140799]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:49 np0005486759.ooo.test systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 14 09:20:49 np0005486759.ooo.test systemd[1]: Finished dnf makecache.
Oct 14 09:20:49 np0005486759.ooo.test systemd[1]: dnf-makecache.service: Consumed 1.969s CPU time.
Oct 14 09:20:50 np0005486759.ooo.test sudo[140892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqeniqfwciwfmhxfkagcrcwesdqujpye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433650.0021198-524-242306264667806/AnsiballZ_command.py
Oct 14 09:20:50 np0005486759.ooo.test sudo[140892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:50 np0005486759.ooo.test python3.9[140894]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:51 np0005486759.ooo.test sudo[140892]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:51 np0005486759.ooo.test sudo[140985]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqbdovuzmqfaglofsvfwmeetjonixplh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433651.594339-524-23279673430936/AnsiballZ_command.py
Oct 14 09:20:51 np0005486759.ooo.test sudo[140985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:52 np0005486759.ooo.test python3.9[140987]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:52 np0005486759.ooo.test sudo[140985]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:52 np0005486759.ooo.test sudo[141078]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wohpxrfqjaxylauqveqnogguvsycdjez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433652.229128-524-159944940798817/AnsiballZ_command.py
Oct 14 09:20:52 np0005486759.ooo.test sudo[141078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47707 DF PROTO=TCP SPT=45518 DPT=9102 SEQ=1202958198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C81410000000001030307) 
Oct 14 09:20:52 np0005486759.ooo.test python3.9[141080]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:52 np0005486759.ooo.test sudo[141078]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:53 np0005486759.ooo.test sudo[141171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gniisurswbtkrvlzujfsuzbrsbvikpav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433652.8539488-524-200884527645954/AnsiballZ_command.py
Oct 14 09:20:53 np0005486759.ooo.test sudo[141171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:53 np0005486759.ooo.test python3.9[141173]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:53 np0005486759.ooo.test sudo[141171]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:53 np0005486759.ooo.test sudo[141264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggayspnxzkquxhysjjdisyuycbrjnqip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433653.4652393-524-86020872894604/AnsiballZ_command.py
Oct 14 09:20:53 np0005486759.ooo.test sudo[141264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:53 np0005486759.ooo.test python3.9[141266]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:53 np0005486759.ooo.test sudo[141264]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:54 np0005486759.ooo.test sudo[141357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qreuspjecygjiahafjygayjpcjxflmcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433654.0699785-524-78333097206728/AnsiballZ_command.py
Oct 14 09:20:54 np0005486759.ooo.test sudo[141357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:54 np0005486759.ooo.test python3.9[141359]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:54 np0005486759.ooo.test sudo[141357]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:54 np0005486759.ooo.test sudo[141450]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylnuvmffiqpnfglwouefrtynafnaorsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433654.6346633-524-29446645550767/AnsiballZ_command.py
Oct 14 09:20:54 np0005486759.ooo.test sudo[141450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:55 np0005486759.ooo.test python3.9[141452]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:55 np0005486759.ooo.test sudo[141450]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:55 np0005486759.ooo.test sudo[141543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blaabdjgjpchhuyxucqoatcdhnnytjhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433655.2316928-524-274805557419668/AnsiballZ_command.py
Oct 14 09:20:55 np0005486759.ooo.test sudo[141543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:55 np0005486759.ooo.test python3.9[141545]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:55 np0005486759.ooo.test sudo[141543]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:56 np0005486759.ooo.test sudo[141636]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnmavhunsmtchxajenmtnnzquofntkwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433655.807974-524-280681645968241/AnsiballZ_command.py
Oct 14 09:20:56 np0005486759.ooo.test sudo[141636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:56 np0005486759.ooo.test python3.9[141638]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:56 np0005486759.ooo.test sudo[141636]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47708 DF PROTO=TCP SPT=45518 DPT=9102 SEQ=1202958198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7C91010000000001030307) 
Oct 14 09:20:56 np0005486759.ooo.test sudo[141729]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukheeinlhyttuxiojncqxpmrdmzymtsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433656.4265118-524-94116037098465/AnsiballZ_command.py
Oct 14 09:20:56 np0005486759.ooo.test sudo[141729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:56 np0005486759.ooo.test python3.9[141731]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:56 np0005486759.ooo.test sudo[141729]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:57 np0005486759.ooo.test sudo[141822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ligswfqzevxlskwqfqrapbhmtlexuwcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433657.0576503-524-261966060528979/AnsiballZ_command.py
Oct 14 09:20:57 np0005486759.ooo.test sudo[141822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:57 np0005486759.ooo.test python3.9[141824]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:57 np0005486759.ooo.test sudo[141822]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:57 np0005486759.ooo.test sudo[141915]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vljanhrjutubtliuenceanzynedmxdaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433657.7386749-524-101612526062237/AnsiballZ_command.py
Oct 14 09:20:57 np0005486759.ooo.test sudo[141915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:58 np0005486759.ooo.test python3.9[141917]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:58 np0005486759.ooo.test sudo[141915]: pam_unix(sudo:session): session closed for user root
Oct 14 09:20:59 np0005486759.ooo.test sudo[142008]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmuxfuunyklaggrqqfvndcevonxopscy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433658.302654-524-79269962470581/AnsiballZ_command.py
Oct 14 09:20:59 np0005486759.ooo.test sudo[142008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:20:59 np0005486759.ooo.test python3.9[142010]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:20:59 np0005486759.ooo.test sudo[142008]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:00 np0005486759.ooo.test sudo[142101]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dopfjraaozvyfjtxxlglacleiesrjsph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433659.7776403-524-206745306953752/AnsiballZ_command.py
Oct 14 09:21:00 np0005486759.ooo.test sudo[142101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:00 np0005486759.ooo.test python3.9[142103]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:21:00 np0005486759.ooo.test sudo[142101]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:00 np0005486759.ooo.test sudo[142194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvswjmnqhdhzffyliehjycguymzkoikn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433660.355425-524-164634417164927/AnsiballZ_command.py
Oct 14 09:21:00 np0005486759.ooo.test sudo[142194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:00 np0005486759.ooo.test python3.9[142196]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:21:00 np0005486759.ooo.test sudo[142194]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:01 np0005486759.ooo.test sudo[142287]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uohabxubzhfiqdulidgidyuiixyfbahz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433661.0034864-524-248301184959072/AnsiballZ_command.py
Oct 14 09:21:01 np0005486759.ooo.test sudo[142287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:01 np0005486759.ooo.test python3.9[142289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:21:01 np0005486759.ooo.test sudo[142287]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:02 np0005486759.ooo.test sudo[142380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgvqdgtinqpryjwmxkkixzsiywwnghhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433662.2135055-524-140262054285049/AnsiballZ_command.py
Oct 14 09:21:02 np0005486759.ooo.test sudo[142380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:02 np0005486759.ooo.test python3.9[142382]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:21:02 np0005486759.ooo.test sudo[142380]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:03 np0005486759.ooo.test sudo[142473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rddbgojawqzynvszlfpefonkftolfwfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433662.8301702-524-271412680478426/AnsiballZ_command.py
Oct 14 09:21:03 np0005486759.ooo.test sudo[142473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:03 np0005486759.ooo.test python3.9[142475]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:21:03 np0005486759.ooo.test sudo[142473]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:03 np0005486759.ooo.test sshd[131590]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:21:03 np0005486759.ooo.test systemd[1]: session-20.scope: Deactivated successfully.
Oct 14 09:21:03 np0005486759.ooo.test systemd[1]: session-20.scope: Consumed 46.769s CPU time.
Oct 14 09:21:03 np0005486759.ooo.test systemd-logind[759]: Session 20 logged out. Waiting for processes to exit.
Oct 14 09:21:03 np0005486759.ooo.test systemd-logind[759]: Removed session 20.
Oct 14 09:21:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15777 DF PROTO=TCP SPT=57372 DPT=9100 SEQ=3869475583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CAF070000000001030307) 
Oct 14 09:21:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15778 DF PROTO=TCP SPT=57372 DPT=9100 SEQ=3869475583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CB3010000000001030307) 
Oct 14 09:21:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15779 DF PROTO=TCP SPT=57372 DPT=9100 SEQ=3869475583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CBB010000000001030307) 
Oct 14 09:21:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15780 DF PROTO=TCP SPT=57372 DPT=9100 SEQ=3869475583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CCAC10000000001030307) 
Oct 14 09:21:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2525 DF PROTO=TCP SPT=50674 DPT=9882 SEQ=1664012996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CCE580000000001030307) 
Oct 14 09:21:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2526 DF PROTO=TCP SPT=50674 DPT=9882 SEQ=1664012996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CD2410000000001030307) 
Oct 14 09:21:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64197 DF PROTO=TCP SPT=58972 DPT=9105 SEQ=4187344177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CE1010000000001030307) 
Oct 14 09:21:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2528 DF PROTO=TCP SPT=50674 DPT=9882 SEQ=1664012996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CEA010000000001030307) 
Oct 14 09:21:20 np0005486759.ooo.test sshd[142491]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:21:20 np0005486759.ooo.test sshd[142491]: Accepted publickey for zuul from 192.168.122.31 port 53464 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:21:20 np0005486759.ooo.test systemd-logind[759]: New session 21 of user zuul.
Oct 14 09:21:20 np0005486759.ooo.test systemd[1]: Started Session 21 of User zuul.
Oct 14 09:21:20 np0005486759.ooo.test sshd[142491]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:21:21 np0005486759.ooo.test python3.9[142584]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 14 09:21:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18541 DF PROTO=TCP SPT=51340 DPT=9102 SEQ=1894501044 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7CF6810000000001030307) 
Oct 14 09:21:22 np0005486759.ooo.test python3.9[142688]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:21:23 np0005486759.ooo.test sudo[142778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfewffozzdrnbokzenokibgwyzxqehho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433683.1045954-45-37620570162453/AnsiballZ_command.py
Oct 14 09:21:23 np0005486759.ooo.test sudo[142778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:23 np0005486759.ooo.test python3.9[142780]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:21:23 np0005486759.ooo.test sudo[142778]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:24 np0005486759.ooo.test sudo[142871]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyccswemzjrfldrfaigttwrfzkmyvqpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433684.0750844-57-237470344445060/AnsiballZ_stat.py
Oct 14 09:21:24 np0005486759.ooo.test sudo[142871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:24 np0005486759.ooo.test python3.9[142873]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:21:24 np0005486759.ooo.test sudo[142871]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:25 np0005486759.ooo.test sudo[142963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqwwgkrixuqxkczpqvykmbfwatazcrej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433684.916814-65-146559968597949/AnsiballZ_file.py
Oct 14 09:21:25 np0005486759.ooo.test sudo[142963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:25 np0005486759.ooo.test python3.9[142965]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:21:25 np0005486759.ooo.test sudo[142963]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:26 np0005486759.ooo.test sudo[143055]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kadunmpgybvqqlnerlbfaddcqhnnuyzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433685.7737632-73-53014966899081/AnsiballZ_stat.py
Oct 14 09:21:26 np0005486759.ooo.test sudo[143055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:26 np0005486759.ooo.test python3.9[143057]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:21:26 np0005486759.ooo.test sudo[143055]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18542 DF PROTO=TCP SPT=51340 DPT=9102 SEQ=1894501044 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D06420000000001030307) 
Oct 14 09:21:26 np0005486759.ooo.test sudo[143128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laalpzpiufaqlqfssmnsnlhhcguagess ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433685.7737632-73-53014966899081/AnsiballZ_copy.py
Oct 14 09:21:26 np0005486759.ooo.test sudo[143128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:26 np0005486759.ooo.test python3.9[143130]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433685.7737632-73-53014966899081/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:21:26 np0005486759.ooo.test sudo[143128]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:27 np0005486759.ooo.test sudo[143220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwfumgdbdntnrubenbtceizuyznrnrju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433687.056875-88-115949567480060/AnsiballZ_setup.py
Oct 14 09:21:27 np0005486759.ooo.test sudo[143220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:27 np0005486759.ooo.test python3.9[143222]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:21:27 np0005486759.ooo.test sudo[143220]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:28 np0005486759.ooo.test sudo[143316]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiyevkgzdsbamhbxeppwntqpqhlhthlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433688.0387163-96-155747302698663/AnsiballZ_file.py
Oct 14 09:21:28 np0005486759.ooo.test sudo[143316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:28 np0005486759.ooo.test python3.9[143318]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:21:28 np0005486759.ooo.test sudo[143316]: pam_unix(sudo:session): session closed for user root
Oct 14 09:21:29 np0005486759.ooo.test python3.9[143408]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:21:29 np0005486759.ooo.test network[143425]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:21:29 np0005486759.ooo.test network[143426]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:21:29 np0005486759.ooo.test network[143427]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:21:31 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:21:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38251 DF PROTO=TCP SPT=60710 DPT=9100 SEQ=4222718112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D24370000000001030307) 
Oct 14 09:21:34 np0005486759.ooo.test python3.9[143623]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:21:35 np0005486759.ooo.test python3.9[143713]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:21:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38252 DF PROTO=TCP SPT=60710 DPT=9100 SEQ=4222718112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D28410000000001030307) 
Oct 14 09:21:35 np0005486759.ooo.test sudo[143807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezlaopjxohoblpqefhxbylmjkotwhscm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433695.5740492-130-206481313295003/AnsiballZ_command.py
Oct 14 09:21:35 np0005486759.ooo.test sudo[143807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:21:36 np0005486759.ooo.test python3.9[143809]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                         set -euxo pipefail
                                                         curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                         python3 -m venv ./venv
                                                         PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                         # This is required for FIPS enabled until trunk.rdoproject.org
                                                         # is not being served from a centos7 host, tracked by
                                                         # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                         dnf -y install crypto-policies
                                                         update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                         ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                         
                                                         # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                         # with rhel 9.2 openssh
                                                         dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                         # FIXME: perform dnf upgrade for other packages in EDPM ansible
                                                         # here we only ensuring that decontainerized libvirt can start
                                                         dnf -y upgrade openstack-selinux
                                                         rm -f /run/virtlogd.pid
                                                         
                                                         rm -rf repo-setup-main
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:21:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38253 DF PROTO=TCP SPT=60710 DPT=9100 SEQ=4222718112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D30410000000001030307) 
Oct 14 09:21:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38254 DF PROTO=TCP SPT=60710 DPT=9100 SEQ=4222718112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D40010000000001030307) 
Oct 14 09:21:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15568 DF PROTO=TCP SPT=60888 DPT=9882 SEQ=3819999447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D43880000000001030307) 
Oct 14 09:21:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15569 DF PROTO=TCP SPT=60888 DPT=9882 SEQ=3819999447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D47810000000001030307) 
Oct 14 09:21:45 np0005486759.ooo.test sshd[33589]: Received signal 15; terminating.
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: Stopping OpenSSH server daemon...
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: sshd.service: Deactivated successfully.
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: Stopped OpenSSH server daemon.
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: Stopped target sshd-keygen.target.
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: Stopping sshd-keygen.target...
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: Reached target sshd-keygen.target.
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: Starting OpenSSH server daemon...
Oct 14 09:21:45 np0005486759.ooo.test sshd[143852]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:21:45 np0005486759.ooo.test sshd[143852]: Server listening on 0.0.0.0 port 22.
Oct 14 09:21:45 np0005486759.ooo.test sshd[143852]: Server listening on :: port 22.
Oct 14 09:21:45 np0005486759.ooo.test systemd[1]: Started OpenSSH server daemon.
Oct 14 09:21:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 09:21:46 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 09:21:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 09:21:46 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 09:21:46 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 09:21:46 np0005486759.ooo.test systemd[1]: run-rff70adcb5ecf491a8faf6d048aad0bbb.service: Deactivated successfully.
Oct 14 09:21:46 np0005486759.ooo.test systemd[1]: run-r5e4e071ecde4426799938422fe95b218.service: Deactivated successfully.
Oct 14 09:21:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14162 DF PROTO=TCP SPT=48686 DPT=9105 SEQ=1415206494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D56410000000001030307) 
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: Stopping OpenSSH server daemon...
Oct 14 09:21:47 np0005486759.ooo.test sshd[143852]: Received signal 15; terminating.
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: sshd.service: Deactivated successfully.
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: Stopped OpenSSH server daemon.
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: Stopped target sshd-keygen.target.
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: Stopping sshd-keygen.target...
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: Reached target sshd-keygen.target.
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: Starting OpenSSH server daemon...
Oct 14 09:21:47 np0005486759.ooo.test sshd[144249]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:21:47 np0005486759.ooo.test sshd[144249]: Server listening on 0.0.0.0 port 22.
Oct 14 09:21:47 np0005486759.ooo.test sshd[144249]: Server listening on :: port 22.
Oct 14 09:21:47 np0005486759.ooo.test systemd[1]: Started OpenSSH server daemon.
Oct 14 09:21:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15571 DF PROTO=TCP SPT=60888 DPT=9882 SEQ=3819999447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D5F410000000001030307) 
Oct 14 09:21:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62788 DF PROTO=TCP SPT=59588 DPT=9102 SEQ=1223574153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D6BC10000000001030307) 
Oct 14 09:21:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62789 DF PROTO=TCP SPT=59588 DPT=9102 SEQ=1223574153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D7B810000000001030307) 
Oct 14 09:22:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40222 DF PROTO=TCP SPT=38844 DPT=9100 SEQ=4111294275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D99680000000001030307) 
Oct 14 09:22:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40223 DF PROTO=TCP SPT=38844 DPT=9100 SEQ=4111294275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7D9D810000000001030307) 
Oct 14 09:22:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40224 DF PROTO=TCP SPT=38844 DPT=9100 SEQ=4111294275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DA5810000000001030307) 
Oct 14 09:22:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40225 DF PROTO=TCP SPT=38844 DPT=9100 SEQ=4111294275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DB5410000000001030307) 
Oct 14 09:22:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45427 DF PROTO=TCP SPT=43178 DPT=9882 SEQ=211720273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DB8B70000000001030307) 
Oct 14 09:22:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45428 DF PROTO=TCP SPT=43178 DPT=9882 SEQ=211720273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DBCC10000000001030307) 
Oct 14 09:22:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13174 DF PROTO=TCP SPT=49992 DPT=9105 SEQ=930666118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DCB810000000001030307) 
Oct 14 09:22:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45430 DF PROTO=TCP SPT=43178 DPT=9882 SEQ=211720273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DD4810000000001030307) 
Oct 14 09:22:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37634 DF PROTO=TCP SPT=49106 DPT=9102 SEQ=2995034711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DE1010000000001030307) 
Oct 14 09:22:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37635 DF PROTO=TCP SPT=49106 DPT=9102 SEQ=2995034711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7DF0C10000000001030307) 
Oct 14 09:22:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11598 DF PROTO=TCP SPT=48540 DPT=9100 SEQ=3252321699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E0E980000000001030307) 
Oct 14 09:22:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11599 DF PROTO=TCP SPT=48540 DPT=9100 SEQ=3252321699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E12810000000001030307) 
Oct 14 09:22:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11600 DF PROTO=TCP SPT=48540 DPT=9100 SEQ=3252321699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E1A810000000001030307) 
Oct 14 09:22:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11601 DF PROTO=TCP SPT=48540 DPT=9100 SEQ=3252321699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E2A410000000001030307) 
Oct 14 09:22:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20656 DF PROTO=TCP SPT=47416 DPT=9882 SEQ=427545448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E2DE70000000001030307) 
Oct 14 09:22:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20657 DF PROTO=TCP SPT=47416 DPT=9882 SEQ=427545448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E32010000000001030307) 
Oct 14 09:22:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44887 DF PROTO=TCP SPT=37822 DPT=9105 SEQ=3376778944 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E40C10000000001030307) 
Oct 14 09:22:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20659 DF PROTO=TCP SPT=47416 DPT=9882 SEQ=427545448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E49C10000000001030307) 
Oct 14 09:22:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61438 DF PROTO=TCP SPT=50470 DPT=9102 SEQ=1772786277 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E56010000000001030307) 
Oct 14 09:22:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61439 DF PROTO=TCP SPT=50470 DPT=9102 SEQ=1772786277 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E65C10000000001030307) 
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  Converting 2744 SID table entries...
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:22:59 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:23:00 np0005486759.ooo.test sudo[143807]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:01 np0005486759.ooo.test sudo[144818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trpjsirpdptkjxlebtlihwdzguqtaoen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433780.9236398-139-187091389968840/AnsiballZ_file.py
Oct 14 09:23:01 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Oct 14 09:23:01 np0005486759.ooo.test sudo[144818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:01 np0005486759.ooo.test python3.9[144820]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:23:01 np0005486759.ooo.test sudo[144818]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:01 np0005486759.ooo.test sudo[144910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pshhtirhxalzdabgpmxoqyqisndtcpvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433781.634357-147-80044105026727/AnsiballZ_stat.py
Oct 14 09:23:01 np0005486759.ooo.test sudo[144910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:02 np0005486759.ooo.test python3.9[144912]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:23:02 np0005486759.ooo.test sudo[144910]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:03 np0005486759.ooo.test sudo[144983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzxitltqnizjtsiyhtkturdruszwcpvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433781.634357-147-80044105026727/AnsiballZ_copy.py
Oct 14 09:23:03 np0005486759.ooo.test sudo[144983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:03 np0005486759.ooo.test python3.9[144985]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433781.634357-147-80044105026727/.source.fact _original_basename=.ap0txc_y follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:23:03 np0005486759.ooo.test sudo[144983]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:04 np0005486759.ooo.test python3.9[145075]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:23:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46704 DF PROTO=TCP SPT=36130 DPT=9100 SEQ=2219566626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E83C70000000001030307) 
Oct 14 09:23:05 np0005486759.ooo.test sudo[145171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofxssosslwlkoumatswqihqdglwecwvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433784.7413654-172-94406785259514/AnsiballZ_setup.py
Oct 14 09:23:05 np0005486759.ooo.test sudo[145171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46705 DF PROTO=TCP SPT=36130 DPT=9100 SEQ=2219566626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E87C10000000001030307) 
Oct 14 09:23:05 np0005486759.ooo.test python3.9[145173]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:23:05 np0005486759.ooo.test sudo[145171]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:06 np0005486759.ooo.test sudo[145225]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buhizqatupfvzvnsvoqjdarnlgvpydjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433784.7413654-172-94406785259514/AnsiballZ_dnf.py
Oct 14 09:23:06 np0005486759.ooo.test sudo[145225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:06 np0005486759.ooo.test python3.9[145227]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:23:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46706 DF PROTO=TCP SPT=36130 DPT=9100 SEQ=2219566626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E8FC10000000001030307) 
Oct 14 09:23:10 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:23:10 np0005486759.ooo.test systemd-sysv-generator[145263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:23:10 np0005486759.ooo.test systemd-rc-local-generator[145260]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:23:10 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:23:10 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 09:23:11 np0005486759.ooo.test sudo[145225]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46707 DF PROTO=TCP SPT=36130 DPT=9100 SEQ=2219566626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7E9F810000000001030307) 
Oct 14 09:23:11 np0005486759.ooo.test sudo[145364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smojcbhegguoamgufxojblovdhuvpyvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433791.265394-184-226799216316608/AnsiballZ_command.py
Oct 14 09:23:11 np0005486759.ooo.test sudo[145364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:11 np0005486759.ooo.test python3.9[145366]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:23:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43700 DF PROTO=TCP SPT=50604 DPT=9882 SEQ=1001554757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7EA3180000000001030307) 
Oct 14 09:23:12 np0005486759.ooo.test sudo[145364]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43701 DF PROTO=TCP SPT=50604 DPT=9882 SEQ=1001554757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7EA7010000000001030307) 
Oct 14 09:23:13 np0005486759.ooo.test sudo[145603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sllnnnqxpliwlgypsriroyhpkarwcbkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433792.5492613-192-186452028241832/AnsiballZ_selinux.py
Oct 14 09:23:13 np0005486759.ooo.test sudo[145603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:13 np0005486759.ooo.test python3.9[145605]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 14 09:23:13 np0005486759.ooo.test sudo[145603]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:14 np0005486759.ooo.test sudo[145695]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdzucantksqqbonihemfrkxiajqdzrsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433793.9661071-203-121819787918444/AnsiballZ_command.py
Oct 14 09:23:14 np0005486759.ooo.test sudo[145695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:14 np0005486759.ooo.test python3.9[145697]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 14 09:23:14 np0005486759.ooo.test sudo[145695]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:15 np0005486759.ooo.test sudo[145788]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iridbjohupbcvevrdvwpvvldecolyorc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433795.1387722-211-274280341865423/AnsiballZ_file.py
Oct 14 09:23:15 np0005486759.ooo.test sudo[145788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:15 np0005486759.ooo.test python3.9[145790]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:23:15 np0005486759.ooo.test sudo[145788]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:16 np0005486759.ooo.test sudo[145880]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfbwcvguuduwhvhdnfknebhrldwlysde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433795.7944493-219-4987504480751/AnsiballZ_mount.py
Oct 14 09:23:16 np0005486759.ooo.test sudo[145880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:16 np0005486759.ooo.test python3.9[145882]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 14 09:23:16 np0005486759.ooo.test sudo[145880]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16872 DF PROTO=TCP SPT=55008 DPT=9105 SEQ=4003394980 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7EB5C10000000001030307) 
Oct 14 09:23:17 np0005486759.ooo.test sudo[145972]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybhrjhozzhlxwkhtlctvzjubsjdhbnof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433797.2189622-247-205066415922067/AnsiballZ_file.py
Oct 14 09:23:17 np0005486759.ooo.test sudo[145972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:17 np0005486759.ooo.test python3.9[145974]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:23:17 np0005486759.ooo.test sudo[145972]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:18 np0005486759.ooo.test sudo[146064]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nctdfxoumwybzsacshvhocimutahrmao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433797.8554585-255-65354402134423/AnsiballZ_stat.py
Oct 14 09:23:18 np0005486759.ooo.test sudo[146064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:18 np0005486759.ooo.test python3.9[146066]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:23:18 np0005486759.ooo.test sudo[146064]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:18 np0005486759.ooo.test sudo[146137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awkbpfljbgrdnpbpokabyatswjoivyoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433797.8554585-255-65354402134423/AnsiballZ_copy.py
Oct 14 09:23:18 np0005486759.ooo.test sudo[146137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:18 np0005486759.ooo.test python3.9[146139]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760433797.8554585-255-65354402134423/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:23:18 np0005486759.ooo.test sudo[146137]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43703 DF PROTO=TCP SPT=50604 DPT=9882 SEQ=1001554757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7EBEC10000000001030307) 
Oct 14 09:23:20 np0005486759.ooo.test sudo[146229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okwfguuupqdeoomxkoivwvibighdonxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433799.645836-282-225728140262191/AnsiballZ_getent.py
Oct 14 09:23:20 np0005486759.ooo.test sudo[146229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:20 np0005486759.ooo.test python3.9[146231]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 14 09:23:20 np0005486759.ooo.test sudo[146229]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:20 np0005486759.ooo.test sudo[146322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cystwgvzgpgasgwmrlsxnzboqjpnftly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433800.6139638-292-99240288336788/AnsiballZ_getent.py
Oct 14 09:23:20 np0005486759.ooo.test sudo[146322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:21 np0005486759.ooo.test python3.9[146324]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 14 09:23:21 np0005486759.ooo.test sudo[146322]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:21 np0005486759.ooo.test sudo[146415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gosvginublgvozbtbdybzkfgyyzdnkks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433801.2859507-300-216094360950684/AnsiballZ_group.py
Oct 14 09:23:21 np0005486759.ooo.test sudo[146415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:21 np0005486759.ooo.test python3.9[146417]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 14 09:23:21 np0005486759.ooo.test groupmod[146418]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Oct 14 09:23:21 np0005486759.ooo.test groupmod[146418]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Oct 14 09:23:22 np0005486759.ooo.test sudo[146415]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:22 np0005486759.ooo.test sudo[146513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mafxxbmrabohvmaomjqsqisknucjacan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433802.2455497-309-120899788746774/AnsiballZ_file.py
Oct 14 09:23:22 np0005486759.ooo.test sudo[146513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42136 DF PROTO=TCP SPT=40638 DPT=9102 SEQ=2895963183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7ECB420000000001030307) 
Oct 14 09:23:22 np0005486759.ooo.test python3.9[146515]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 14 09:23:22 np0005486759.ooo.test sudo[146513]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:23 np0005486759.ooo.test sudo[146605]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhxeuicciqyxiqsqlendkvvbsbzeyhib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433803.1501894-320-24389718271607/AnsiballZ_dnf.py
Oct 14 09:23:23 np0005486759.ooo.test sudo[146605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:23 np0005486759.ooo.test python3.9[146607]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:23:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42137 DF PROTO=TCP SPT=40638 DPT=9102 SEQ=2895963183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7EDB010000000001030307) 
Oct 14 09:23:26 np0005486759.ooo.test sudo[146605]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:27 np0005486759.ooo.test sudo[146699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnnsnfndsupnjascrptwpgkacfeihjtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433806.9206192-328-187653972672056/AnsiballZ_file.py
Oct 14 09:23:27 np0005486759.ooo.test sudo[146699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:27 np0005486759.ooo.test python3.9[146701]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:23:27 np0005486759.ooo.test sudo[146699]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:27 np0005486759.ooo.test sudo[146791]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txwsfzgkwebhhoxzpvzscgnynydetvyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433807.624845-336-154330454483618/AnsiballZ_stat.py
Oct 14 09:23:27 np0005486759.ooo.test sudo[146791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:28 np0005486759.ooo.test python3.9[146793]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:23:28 np0005486759.ooo.test sudo[146791]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:28 np0005486759.ooo.test sudo[146864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poypbocmeauuhojkcxfyhxygiskhticq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433807.624845-336-154330454483618/AnsiballZ_copy.py
Oct 14 09:23:28 np0005486759.ooo.test sudo[146864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:28 np0005486759.ooo.test python3.9[146866]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433807.624845-336-154330454483618/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:23:28 np0005486759.ooo.test sudo[146864]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:29 np0005486759.ooo.test sudo[146956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohrswrcblmmfajlauznmjjnvuhefgcdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433808.937575-351-229427727241229/AnsiballZ_systemd.py
Oct 14 09:23:29 np0005486759.ooo.test sudo[146956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:29 np0005486759.ooo.test python3.9[146958]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:23:29 np0005486759.ooo.test systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 14 09:23:29 np0005486759.ooo.test systemd[1]: Stopped Load Kernel Modules.
Oct 14 09:23:29 np0005486759.ooo.test systemd[1]: Stopping Load Kernel Modules...
Oct 14 09:23:29 np0005486759.ooo.test systemd[1]: Starting Load Kernel Modules...
Oct 14 09:23:29 np0005486759.ooo.test systemd-modules-load[146962]: Module 'msr' is built in
Oct 14 09:23:29 np0005486759.ooo.test systemd[1]: Finished Load Kernel Modules.
Oct 14 09:23:30 np0005486759.ooo.test sudo[146956]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:30 np0005486759.ooo.test sudo[147052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etztpdafitwoajrsjcsqlayiqoqnfwhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433810.1611698-359-178273484330686/AnsiballZ_stat.py
Oct 14 09:23:30 np0005486759.ooo.test sudo[147052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:30 np0005486759.ooo.test python3.9[147054]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:23:30 np0005486759.ooo.test sudo[147052]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:30 np0005486759.ooo.test sudo[147125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aupnvgxlnuqunxfqmbbeozwxbyzbszah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433810.1611698-359-178273484330686/AnsiballZ_copy.py
Oct 14 09:23:30 np0005486759.ooo.test sudo[147125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22091 DF PROTO=TCP SPT=59818 DPT=9100 SEQ=3223587433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7EF8F70000000001030307) 
Oct 14 09:23:35 np0005486759.ooo.test python3.9[147127]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433810.1611698-359-178273484330686/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:23:35 np0005486759.ooo.test sudo[147125]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22092 DF PROTO=TCP SPT=59818 DPT=9100 SEQ=3223587433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7EFD010000000001030307) 
Oct 14 09:23:35 np0005486759.ooo.test sudo[147217]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltixmqwnkdtaotxkwkyiyouutbdukenr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433815.296287-377-19151394822923/AnsiballZ_dnf.py
Oct 14 09:23:35 np0005486759.ooo.test sudo[147217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:35 np0005486759.ooo.test python3.9[147219]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:23:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22093 DF PROTO=TCP SPT=59818 DPT=9100 SEQ=3223587433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F05010000000001030307) 
Oct 14 09:23:39 np0005486759.ooo.test sudo[147217]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:40 np0005486759.ooo.test python3.9[147311]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:23:40 np0005486759.ooo.test python3.9[147403]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 14 09:23:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22094 DF PROTO=TCP SPT=59818 DPT=9100 SEQ=3223587433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F14C20000000001030307) 
Oct 14 09:23:41 np0005486759.ooo.test python3.9[147493]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:23:42 np0005486759.ooo.test sudo[147583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wllpnktchixrejexguqfwubdayejcgit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433821.9626002-418-77979469820221/AnsiballZ_systemd.py
Oct 14 09:23:42 np0005486759.ooo.test sudo[147583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25 DF PROTO=TCP SPT=45448 DPT=9882 SEQ=3558274028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F18480000000001030307) 
Oct 14 09:23:42 np0005486759.ooo.test python3.9[147585]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:23:42 np0005486759.ooo.test systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 14 09:23:42 np0005486759.ooo.test systemd[1]: tuned.service: Deactivated successfully.
Oct 14 09:23:42 np0005486759.ooo.test systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 14 09:23:42 np0005486759.ooo.test systemd[1]: tuned.service: Consumed 1.588s CPU time, no IO.
Oct 14 09:23:42 np0005486759.ooo.test systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 14 09:23:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26 DF PROTO=TCP SPT=45448 DPT=9882 SEQ=3558274028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F1C420000000001030307) 
Oct 14 09:23:43 np0005486759.ooo.test systemd[1]: Started Dynamic System Tuning Daemon.
Oct 14 09:23:43 np0005486759.ooo.test sudo[147583]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:44 np0005486759.ooo.test python3.9[147688]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 14 09:23:46 np0005486759.ooo.test sudo[147778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzvwrtniuaxawxggromdurcdpxsmsmkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433826.0431542-475-101182305401075/AnsiballZ_systemd.py
Oct 14 09:23:46 np0005486759.ooo.test sudo[147778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:46 np0005486759.ooo.test python3.9[147780]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:23:46 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:23:46 np0005486759.ooo.test systemd-rc-local-generator[147810]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:23:46 np0005486759.ooo.test systemd-sysv-generator[147813]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:23:46 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:23:46 np0005486759.ooo.test sudo[147778]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36514 DF PROTO=TCP SPT=46488 DPT=9105 SEQ=712274688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F2B020000000001030307) 
Oct 14 09:23:47 np0005486759.ooo.test sudo[147908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovadghmrvrdoydotidtqvqryvvuhcrme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433827.0371377-475-174346034850053/AnsiballZ_systemd.py
Oct 14 09:23:47 np0005486759.ooo.test sudo[147908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:47 np0005486759.ooo.test python3.9[147910]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:23:47 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:23:47 np0005486759.ooo.test systemd-rc-local-generator[147934]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:23:47 np0005486759.ooo.test systemd-sysv-generator[147938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:23:47 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:23:47 np0005486759.ooo.test sudo[147908]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:48 np0005486759.ooo.test sudo[148037]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckjmgiuxfpysxbvvmreqacqvggdvipcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433828.0749972-491-269193240300586/AnsiballZ_command.py
Oct 14 09:23:48 np0005486759.ooo.test sudo[148037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:48 np0005486759.ooo.test python3.9[148039]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:23:48 np0005486759.ooo.test sudo[148037]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:49 np0005486759.ooo.test sudo[148130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwdklgwkzjtwtquqphrfqoaapzurcmjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433828.7554696-499-38167668133711/AnsiballZ_command.py
Oct 14 09:23:49 np0005486759.ooo.test sudo[148130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:49 np0005486759.ooo.test python3.9[148132]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:23:49 np0005486759.ooo.test kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Oct 14 09:23:49 np0005486759.ooo.test sudo[148130]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28 DF PROTO=TCP SPT=45448 DPT=9882 SEQ=3558274028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F34010000000001030307) 
Oct 14 09:23:49 np0005486759.ooo.test sudo[148223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwymjqvmxdydvvjwiufepmzmyvucxmho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433829.5271916-507-218795870411729/AnsiballZ_command.py
Oct 14 09:23:49 np0005486759.ooo.test sudo[148223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:50 np0005486759.ooo.test python3.9[148225]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:23:51 np0005486759.ooo.test sudo[148223]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:51 np0005486759.ooo.test sudo[148322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iivwimeamqrehjyyyojbamwubhmphfqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433831.4309912-515-144956173093913/AnsiballZ_command.py
Oct 14 09:23:51 np0005486759.ooo.test sudo[148322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:51 np0005486759.ooo.test python3.9[148324]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:23:51 np0005486759.ooo.test sudo[148322]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:52 np0005486759.ooo.test sudo[148415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghuvglyjlqnoewenkofqlckqkidrylee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433832.1144023-523-153304938055688/AnsiballZ_systemd.py
Oct 14 09:23:52 np0005486759.ooo.test sudo[148415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:23:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21760 DF PROTO=TCP SPT=37858 DPT=9102 SEQ=852068754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F40810000000001030307) 
Oct 14 09:23:52 np0005486759.ooo.test python3.9[148417]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:23:52 np0005486759.ooo.test systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 14 09:23:52 np0005486759.ooo.test systemd[1]: Stopped Apply Kernel Variables.
Oct 14 09:23:52 np0005486759.ooo.test systemd[1]: Stopping Apply Kernel Variables...
Oct 14 09:23:52 np0005486759.ooo.test systemd[1]: Starting Apply Kernel Variables...
Oct 14 09:23:52 np0005486759.ooo.test systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 14 09:23:52 np0005486759.ooo.test systemd[1]: Finished Apply Kernel Variables.
Oct 14 09:23:52 np0005486759.ooo.test sudo[148415]: pam_unix(sudo:session): session closed for user root
Oct 14 09:23:53 np0005486759.ooo.test sshd[142491]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:23:53 np0005486759.ooo.test systemd[1]: session-21.scope: Deactivated successfully.
Oct 14 09:23:53 np0005486759.ooo.test systemd[1]: session-21.scope: Consumed 1min 57.075s CPU time.
Oct 14 09:23:53 np0005486759.ooo.test systemd-logind[759]: Session 21 logged out. Waiting for processes to exit.
Oct 14 09:23:53 np0005486759.ooo.test systemd-logind[759]: Removed session 21.
Oct 14 09:23:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21761 DF PROTO=TCP SPT=37858 DPT=9102 SEQ=852068754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F50410000000001030307) 
Oct 14 09:23:59 np0005486759.ooo.test sshd[148438]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:23:59 np0005486759.ooo.test sshd[148438]: Accepted publickey for zuul from 192.168.122.31 port 60940 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:23:59 np0005486759.ooo.test systemd-logind[759]: New session 22 of user zuul.
Oct 14 09:23:59 np0005486759.ooo.test systemd[1]: Started Session 22 of User zuul.
Oct 14 09:23:59 np0005486759.ooo.test sshd[148438]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:24:00 np0005486759.ooo.test python3.9[148531]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:24:01 np0005486759.ooo.test python3.9[148625]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:24:02 np0005486759.ooo.test sudo[148719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhochfwibjnfrxjllmohcqvrqdsqkuxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433842.0541644-50-148417423619379/AnsiballZ_command.py
Oct 14 09:24:02 np0005486759.ooo.test sudo[148719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:02 np0005486759.ooo.test python3.9[148721]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:24:02 np0005486759.ooo.test sudo[148719]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:03 np0005486759.ooo.test python3.9[148812]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:24:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28188 DF PROTO=TCP SPT=37906 DPT=9100 SEQ=1080740496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F6E280000000001030307) 
Oct 14 09:24:04 np0005486759.ooo.test sudo[148906]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swoxhgydsmxfwbjepxhziwhlukrewuac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433844.1180277-70-15145228740575/AnsiballZ_setup.py
Oct 14 09:24:04 np0005486759.ooo.test sudo[148906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:04 np0005486759.ooo.test python3.9[148908]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:24:05 np0005486759.ooo.test sudo[148906]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28189 DF PROTO=TCP SPT=37906 DPT=9100 SEQ=1080740496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F72410000000001030307) 
Oct 14 09:24:05 np0005486759.ooo.test sudo[148960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmiravakiszfdtwvtlpnzodinsduaqlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433844.1180277-70-15145228740575/AnsiballZ_dnf.py
Oct 14 09:24:05 np0005486759.ooo.test sudo[148960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:05 np0005486759.ooo.test python3.9[148962]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:24:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28190 DF PROTO=TCP SPT=37906 DPT=9100 SEQ=1080740496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F7A410000000001030307) 
Oct 14 09:24:08 np0005486759.ooo.test sudo[148960]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:09 np0005486759.ooo.test sudo[149054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ostofguoouriuodjegjguarlfoapinuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433849.0342758-82-81875836896686/AnsiballZ_setup.py
Oct 14 09:24:09 np0005486759.ooo.test sudo[149054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:09 np0005486759.ooo.test python3.9[149056]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:24:09 np0005486759.ooo.test sudo[149054]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:10 np0005486759.ooo.test sudo[149209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzxtfukaeyqsyptorucvsycgughlkpvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433850.1272855-93-200027053026666/AnsiballZ_file.py
Oct 14 09:24:10 np0005486759.ooo.test sudo[149209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:10 np0005486759.ooo.test python3.9[149211]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:24:10 np0005486759.ooo.test sudo[149209]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28191 DF PROTO=TCP SPT=37906 DPT=9100 SEQ=1080740496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F8A010000000001030307) 
Oct 14 09:24:12 np0005486759.ooo.test sudo[149301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqfyidrnoobbrvhfyzbuyfxsucdersto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433851.7401092-101-207667926099597/AnsiballZ_command.py
Oct 14 09:24:12 np0005486759.ooo.test sudo[149301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:12 np0005486759.ooo.test python3.9[149303]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:24:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17318 DF PROTO=TCP SPT=33958 DPT=9882 SEQ=487593802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F8D780000000001030307) 
Oct 14 09:24:12 np0005486759.ooo.test sudo[149301]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:13 np0005486759.ooo.test sudo[149406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acrvgwjynivnmqgqjyhfayshujkpzllu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433852.5456789-109-244356580342537/AnsiballZ_stat.py
Oct 14 09:24:13 np0005486759.ooo.test sudo[149406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:13 np0005486759.ooo.test python3.9[149408]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:24:13 np0005486759.ooo.test sudo[149406]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17319 DF PROTO=TCP SPT=33958 DPT=9882 SEQ=487593802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7F91810000000001030307) 
Oct 14 09:24:13 np0005486759.ooo.test sudo[149454]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsvnmvxxmryebfhvxwvvaoadrztejkfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433852.5456789-109-244356580342537/AnsiballZ_file.py
Oct 14 09:24:13 np0005486759.ooo.test sudo[149454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:13 np0005486759.ooo.test python3.9[149456]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:24:13 np0005486759.ooo.test sudo[149454]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:14 np0005486759.ooo.test sudo[149546]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcvpatzkcshzinrscauhjdyedxpdmcuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433853.8219316-121-216532110848107/AnsiballZ_stat.py
Oct 14 09:24:14 np0005486759.ooo.test sudo[149546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:14 np0005486759.ooo.test python3.9[149548]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:24:14 np0005486759.ooo.test sudo[149546]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:14 np0005486759.ooo.test sudo[149619]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypbjdhzhoiahlhjvzlenqmtjyozczpuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433853.8219316-121-216532110848107/AnsiballZ_copy.py
Oct 14 09:24:14 np0005486759.ooo.test sudo[149619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:15 np0005486759.ooo.test python3.9[149621]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760433853.8219316-121-216532110848107/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:24:15 np0005486759.ooo.test sudo[149619]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:15 np0005486759.ooo.test sudo[149711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnpbhwarypbjbdbtyocpeehuwldqcvwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433855.3595166-137-20317038835626/AnsiballZ_ini_file.py
Oct 14 09:24:15 np0005486759.ooo.test sudo[149711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:16 np0005486759.ooo.test python3.9[149713]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:24:16 np0005486759.ooo.test sudo[149711]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:16 np0005486759.ooo.test sudo[149803]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzgshzyjixpsdlygasxgnlfsclaeyrrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433856.186365-137-8748454780071/AnsiballZ_ini_file.py
Oct 14 09:24:16 np0005486759.ooo.test sudo[149803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:16 np0005486759.ooo.test python3.9[149805]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:24:16 np0005486759.ooo.test sudo[149803]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:17 np0005486759.ooo.test sudo[149895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nspgchzcfocquylknyynuqgwuqvwaikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433856.812223-137-267489556828180/AnsiballZ_ini_file.py
Oct 14 09:24:17 np0005486759.ooo.test sudo[149895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20083 DF PROTO=TCP SPT=51062 DPT=9105 SEQ=2207316978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FA0410000000001030307) 
Oct 14 09:24:17 np0005486759.ooo.test python3.9[149897]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:24:17 np0005486759.ooo.test sudo[149895]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:17 np0005486759.ooo.test sudo[149987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-devpejpzqacrjvwntnxbzwibjmlkbdiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433857.4389398-137-2530357463331/AnsiballZ_ini_file.py
Oct 14 09:24:17 np0005486759.ooo.test sudo[149987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:17 np0005486759.ooo.test python3.9[149989]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:24:17 np0005486759.ooo.test sudo[149987]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:18 np0005486759.ooo.test python3.9[150079]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:24:19 np0005486759.ooo.test sudo[150171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haxctfstjxeyuluqrjefeijjvoqvvhbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433858.9572947-177-129744586235462/AnsiballZ_dnf.py
Oct 14 09:24:19 np0005486759.ooo.test sudo[150171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17321 DF PROTO=TCP SPT=33958 DPT=9882 SEQ=487593802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FA9410000000001030307) 
Oct 14 09:24:19 np0005486759.ooo.test python3.9[150173]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:24:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42385 DF PROTO=TCP SPT=43132 DPT=9102 SEQ=2387798353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FB5C10000000001030307) 
Oct 14 09:24:22 np0005486759.ooo.test sudo[150171]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:23 np0005486759.ooo.test sudo[150265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-salnmwiarvylbewlsohhppyqmqutqhtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433862.9780774-185-150474910975842/AnsiballZ_dnf.py
Oct 14 09:24:23 np0005486759.ooo.test sudo[150265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:23 np0005486759.ooo.test python3.9[150267]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:24:26 np0005486759.ooo.test sudo[150265]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42386 DF PROTO=TCP SPT=43132 DPT=9102 SEQ=2387798353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FC5810000000001030307) 
Oct 14 09:24:27 np0005486759.ooo.test sudo[150359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaccvhvldglomwlvardzippfiwavryvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433866.7980876-195-96554074687573/AnsiballZ_dnf.py
Oct 14 09:24:27 np0005486759.ooo.test sudo[150359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:27 np0005486759.ooo.test python3.9[150361]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:24:30 np0005486759.ooo.test sudo[150359]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:31 np0005486759.ooo.test sudo[150459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnsasnrftsyimbdeneryxaznxtqwznik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433870.7269833-204-200071406035027/AnsiballZ_dnf.py
Oct 14 09:24:31 np0005486759.ooo.test sudo[150459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:31 np0005486759.ooo.test python3.9[150461]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:24:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1707 DF PROTO=TCP SPT=38536 DPT=9100 SEQ=880672091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FE3580000000001030307) 
Oct 14 09:24:34 np0005486759.ooo.test sudo[150459]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:34 np0005486759.ooo.test sudo[150553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riqyndmhosoncbqnjiinpuvjwjjlqpki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433874.7048624-216-279262267638602/AnsiballZ_dnf.py
Oct 14 09:24:34 np0005486759.ooo.test sudo[150553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:35 np0005486759.ooo.test python3.9[150555]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:24:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1708 DF PROTO=TCP SPT=38536 DPT=9100 SEQ=880672091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FE7410000000001030307) 
Oct 14 09:24:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1709 DF PROTO=TCP SPT=38536 DPT=9100 SEQ=880672091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FEF410000000001030307) 
Oct 14 09:24:38 np0005486759.ooo.test sudo[150553]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:38 np0005486759.ooo.test sudo[150647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vveiefgsfvnzkwslraljwtnvjpuqfqml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433878.4980874-225-253221650644136/AnsiballZ_dnf.py
Oct 14 09:24:38 np0005486759.ooo.test sudo[150647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:39 np0005486759.ooo.test python3.9[150649]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:24:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1710 DF PROTO=TCP SPT=38536 DPT=9100 SEQ=880672091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F7FFF010000000001030307) 
Oct 14 09:24:42 np0005486759.ooo.test sudo[150647]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45313 DF PROTO=TCP SPT=38122 DPT=9882 SEQ=2961167587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8002A80000000001030307) 
Oct 14 09:24:42 np0005486759.ooo.test sudo[150741]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppqftzwssytzpepuagsbafeyeaqfgpfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433882.469277-234-27216182439075/AnsiballZ_dnf.py
Oct 14 09:24:42 np0005486759.ooo.test sudo[150741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:43 np0005486759.ooo.test python3.9[150743]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:24:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45314 DF PROTO=TCP SPT=38122 DPT=9882 SEQ=2961167587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8006C20000000001030307) 
Oct 14 09:24:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10746 DF PROTO=TCP SPT=54050 DPT=9105 SEQ=2660855688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8015810000000001030307) 
Oct 14 09:24:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45316 DF PROTO=TCP SPT=38122 DPT=9882 SEQ=2961167587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F801E810000000001030307) 
Oct 14 09:24:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55673 DF PROTO=TCP SPT=44472 DPT=9102 SEQ=3049617557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F802B010000000001030307) 
Oct 14 09:24:53 np0005486759.ooo.test sudo[150741]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:54 np0005486759.ooo.test sudo[150910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqiwhwgcpiyezyjslcpuqkxhjbwcsdgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433894.0093331-245-85475780970114/AnsiballZ_file.py
Oct 14 09:24:54 np0005486759.ooo.test sudo[150910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:54 np0005486759.ooo.test python3.9[150912]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:24:54 np0005486759.ooo.test sudo[150910]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:54 np0005486759.ooo.test sudo[151015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbmersloaqoehsfqateuyfqlzgggtnxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433894.6912155-253-111205794311872/AnsiballZ_stat.py
Oct 14 09:24:55 np0005486759.ooo.test sudo[151015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:55 np0005486759.ooo.test python3.9[151017]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:24:55 np0005486759.ooo.test sudo[151015]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:55 np0005486759.ooo.test sudo[151088]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moandcjbevrvgzcgsknppxdgspsrtrhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433894.6912155-253-111205794311872/AnsiballZ_copy.py
Oct 14 09:24:55 np0005486759.ooo.test sudo[151088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:55 np0005486759.ooo.test python3.9[151090]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1760433894.6912155-253-111205794311872/.source.json _original_basename=.sjqs1vv1 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:24:55 np0005486759.ooo.test sudo[151088]: pam_unix(sudo:session): session closed for user root
Oct 14 09:24:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55674 DF PROTO=TCP SPT=44472 DPT=9102 SEQ=3049617557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F803AC10000000001030307) 
Oct 14 09:24:57 np0005486759.ooo.test sudo[151180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsyawsakjystntsmineynrqmbenxffse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433896.0741236-272-97015099363494/AnsiballZ_podman_image.py
Oct 14 09:24:57 np0005486759.ooo.test sudo[151180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:24:57 np0005486759.ooo.test python3.9[151182]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 14 09:25:03 np0005486759.ooo.test podman[151194]: 2025-10-14 09:24:57.972629379 +0000 UTC m=+0.049703965 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct 14 09:25:03 np0005486759.ooo.test sudo[151180]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58511 DF PROTO=TCP SPT=37404 DPT=9100 SEQ=3502490507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8058880000000001030307) 
Oct 14 09:25:04 np0005486759.ooo.test sudo[151394]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcudufpeqexjgtcvohxrbyfgsadgkbxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433904.26021-283-63061196626755/AnsiballZ_podman_image.py
Oct 14 09:25:04 np0005486759.ooo.test sudo[151394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:04 np0005486759.ooo.test python3.9[151396]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 14 09:25:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58512 DF PROTO=TCP SPT=37404 DPT=9100 SEQ=3502490507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F805C810000000001030307) 
Oct 14 09:25:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58513 DF PROTO=TCP SPT=37404 DPT=9100 SEQ=3502490507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8064820000000001030307) 
Oct 14 09:25:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58514 DF PROTO=TCP SPT=37404 DPT=9100 SEQ=3502490507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8074410000000001030307) 
Oct 14 09:25:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40688 DF PROTO=TCP SPT=56402 DPT=9882 SEQ=1811943161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8077D80000000001030307) 
Oct 14 09:25:12 np0005486759.ooo.test podman[151409]: 2025-10-14 09:25:04.939852554 +0000 UTC m=+0.044271988 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 09:25:13 np0005486759.ooo.test sudo[151394]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40689 DF PROTO=TCP SPT=56402 DPT=9882 SEQ=1811943161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F807BC20000000001030307) 
Oct 14 09:25:13 np0005486759.ooo.test sudo[151608]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sltojavszrezndmtewkzlxkzveddnkol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433913.6492074-295-46556122437869/AnsiballZ_podman_image.py
Oct 14 09:25:13 np0005486759.ooo.test sudo[151608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:14 np0005486759.ooo.test python3.9[151610]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 14 09:25:16 np0005486759.ooo.test podman[151622]: 2025-10-14 09:25:14.222428012 +0000 UTC m=+0.039293155 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Oct 14 09:25:16 np0005486759.ooo.test sudo[151608]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2871 DF PROTO=TCP SPT=47300 DPT=9105 SEQ=4173356007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F808A810000000001030307) 
Oct 14 09:25:17 np0005486759.ooo.test sudo[151786]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgzwnqiteqdprcsdccxwixbcwxvmvwkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433917.0304227-304-123831837776633/AnsiballZ_podman_image.py
Oct 14 09:25:17 np0005486759.ooo.test sudo[151786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:17 np0005486759.ooo.test python3.9[151788]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 14 09:25:18 np0005486759.ooo.test podman[151800]: 2025-10-14 09:25:17.6435877 +0000 UTC m=+0.049137488 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 09:25:19 np0005486759.ooo.test sudo[151786]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40691 DF PROTO=TCP SPT=56402 DPT=9882 SEQ=1811943161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8093810000000001030307) 
Oct 14 09:25:19 np0005486759.ooo.test sudo[151960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smtkqwuskfcrxaeiwljedocrhdvhltdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433919.2704713-313-40536082787548/AnsiballZ_podman_image.py
Oct 14 09:25:19 np0005486759.ooo.test sudo[151960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:19 np0005486759.ooo.test python3.9[151962]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 14 09:25:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19720 DF PROTO=TCP SPT=55866 DPT=9102 SEQ=3880567263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80A0010000000001030307) 
Oct 14 09:25:25 np0005486759.ooo.test podman[151974]: 2025-10-14 09:25:19.892110234 +0000 UTC m=+0.034184749 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct 14 09:25:25 np0005486759.ooo.test sudo[151960]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:26 np0005486759.ooo.test sudo[152192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyclwkjzthlhrzzxmkugqigsfkgfaesr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433925.8092713-313-161320027264951/AnsiballZ_podman_image.py
Oct 14 09:25:26 np0005486759.ooo.test sudo[152192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:26 np0005486759.ooo.test python3.9[152194]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 14 09:25:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19721 DF PROTO=TCP SPT=55866 DPT=9102 SEQ=3880567263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80AFC10000000001030307) 
Oct 14 09:25:28 np0005486759.ooo.test podman[152206]: 2025-10-14 09:25:26.433568969 +0000 UTC m=+0.024836403 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Oct 14 09:25:28 np0005486759.ooo.test sudo[152192]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:29 np0005486759.ooo.test sshd[148438]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:25:29 np0005486759.ooo.test systemd[1]: session-22.scope: Deactivated successfully.
Oct 14 09:25:29 np0005486759.ooo.test systemd[1]: session-22.scope: Consumed 1min 32.375s CPU time.
Oct 14 09:25:29 np0005486759.ooo.test systemd-logind[759]: Session 22 logged out. Waiting for processes to exit.
Oct 14 09:25:29 np0005486759.ooo.test systemd-logind[759]: Removed session 22.
Oct 14 09:25:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2349 DF PROTO=TCP SPT=50102 DPT=9100 SEQ=3623247893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80CDB70000000001030307) 
Oct 14 09:25:34 np0005486759.ooo.test sshd[152551]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:25:34 np0005486759.ooo.test sshd[152551]: Accepted publickey for zuul from 192.168.122.31 port 32780 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:25:34 np0005486759.ooo.test systemd-logind[759]: New session 23 of user zuul.
Oct 14 09:25:34 np0005486759.ooo.test systemd[1]: Started Session 23 of User zuul.
Oct 14 09:25:34 np0005486759.ooo.test sshd[152551]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:25:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2350 DF PROTO=TCP SPT=50102 DPT=9100 SEQ=3623247893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80D1C20000000001030307) 
Oct 14 09:25:36 np0005486759.ooo.test python3.9[152644]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:25:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2351 DF PROTO=TCP SPT=50102 DPT=9100 SEQ=3623247893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80D9C20000000001030307) 
Oct 14 09:25:38 np0005486759.ooo.test sudo[152738]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abrcqomjymdnnleuuxlputyyjvogifgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433937.3793323-36-147135783800678/AnsiballZ_getent.py
Oct 14 09:25:38 np0005486759.ooo.test sudo[152738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:38 np0005486759.ooo.test python3.9[152740]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 14 09:25:38 np0005486759.ooo.test sudo[152738]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:39 np0005486759.ooo.test sudo[152831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hacmberfrzzbboselrgnfounqwdatwun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433938.906044-48-26948838642310/AnsiballZ_setup.py
Oct 14 09:25:39 np0005486759.ooo.test sudo[152831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:39 np0005486759.ooo.test python3.9[152833]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:25:39 np0005486759.ooo.test sudo[152831]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:40 np0005486759.ooo.test sudo[152885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkugwnzbvstrfabajtirsojutcwrtxia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433938.906044-48-26948838642310/AnsiballZ_dnf.py
Oct 14 09:25:40 np0005486759.ooo.test sudo[152885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:41 np0005486759.ooo.test python3.9[152887]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:25:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2352 DF PROTO=TCP SPT=50102 DPT=9100 SEQ=3623247893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80E9820000000001030307) 
Oct 14 09:25:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65126 DF PROTO=TCP SPT=41790 DPT=9882 SEQ=2300732031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80ED080000000001030307) 
Oct 14 09:25:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65127 DF PROTO=TCP SPT=41790 DPT=9882 SEQ=2300732031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80F1010000000001030307) 
Oct 14 09:25:44 np0005486759.ooo.test sudo[152885]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:44 np0005486759.ooo.test sudo[152979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hajajnbdbnbhmpvmpkddpmjiaelzjgqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433944.644098-62-248784868391950/AnsiballZ_dnf.py
Oct 14 09:25:44 np0005486759.ooo.test sudo[152979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:45 np0005486759.ooo.test python3.9[152981]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:25:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64192 DF PROTO=TCP SPT=55932 DPT=9105 SEQ=1903917584 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F80FFC10000000001030307) 
Oct 14 09:25:48 np0005486759.ooo.test sudo[152979]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:49 np0005486759.ooo.test sudo[153073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnzbpunmuxsbnjtuimuvsrezwimuzslf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433948.537411-70-169134671898785/AnsiballZ_systemd.py
Oct 14 09:25:49 np0005486759.ooo.test sudo[153073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65129 DF PROTO=TCP SPT=41790 DPT=9882 SEQ=2300732031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8108C10000000001030307) 
Oct 14 09:25:49 np0005486759.ooo.test python3.9[153075]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:25:49 np0005486759.ooo.test sudo[153073]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:50 np0005486759.ooo.test python3.9[153168]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:25:51 np0005486759.ooo.test sudo[153258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjplemufvhrypfsluwqjxdljxlrtlxmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433950.6636956-88-201222086700742/AnsiballZ_sefcontext.py
Oct 14 09:25:51 np0005486759.ooo.test sudo[153258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:51 np0005486759.ooo.test python3.9[153260]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  Converting 2746 SID table entries...
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:25:52 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:25:52 np0005486759.ooo.test sudo[153258]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47087 DF PROTO=TCP SPT=33092 DPT=9102 SEQ=2918161117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8115410000000001030307) 
Oct 14 09:25:53 np0005486759.ooo.test python3.9[153483]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:25:55 np0005486759.ooo.test sudo[153579]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwibnrtjdeibrswmwosoojpvsjlrlgtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433953.9603708-106-24993478529911/AnsiballZ_dnf.py
Oct 14 09:25:55 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Oct 14 09:25:55 np0005486759.ooo.test sudo[153579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:55 np0005486759.ooo.test python3.9[153581]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:25:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47088 DF PROTO=TCP SPT=33092 DPT=9102 SEQ=2918161117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8125010000000001030307) 
Oct 14 09:25:58 np0005486759.ooo.test sudo[153579]: pam_unix(sudo:session): session closed for user root
Oct 14 09:25:59 np0005486759.ooo.test sudo[153673]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdmydvugrkuykgmtiulkekcjjjsjxdvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433958.6449351-114-136124212205779/AnsiballZ_command.py
Oct 14 09:25:59 np0005486759.ooo.test sudo[153673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:25:59 np0005486759.ooo.test python3.9[153675]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:26:00 np0005486759.ooo.test sudo[153673]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:00 np0005486759.ooo.test sudo[153918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-coqphgeymmbunapjzxqkurktwdbutanw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433960.203497-122-45423868914103/AnsiballZ_file.py
Oct 14 09:26:00 np0005486759.ooo.test sudo[153918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:00 np0005486759.ooo.test python3.9[153920]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 09:26:00 np0005486759.ooo.test sudo[153918]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:01 np0005486759.ooo.test python3.9[154010]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:26:02 np0005486759.ooo.test sudo[154102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgdsqfebvhratnluhahybywswuhbwkoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433962.0048757-140-80431804767015/AnsiballZ_dnf.py
Oct 14 09:26:02 np0005486759.ooo.test sudo[154102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:02 np0005486759.ooo.test python3.9[154104]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:26:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36148 DF PROTO=TCP SPT=58520 DPT=9100 SEQ=383057127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8142E70000000001030307) 
Oct 14 09:26:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36149 DF PROTO=TCP SPT=58520 DPT=9100 SEQ=383057127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8147010000000001030307) 
Oct 14 09:26:05 np0005486759.ooo.test sudo[154102]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:06 np0005486759.ooo.test sudo[154196]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffypcbgjlgjejdmccepeltrfyovrlugh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433965.921351-148-91880939001929/AnsiballZ_dnf.py
Oct 14 09:26:06 np0005486759.ooo.test sudo[154196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:06 np0005486759.ooo.test python3.9[154198]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:26:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36150 DF PROTO=TCP SPT=58520 DPT=9100 SEQ=383057127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F814F010000000001030307) 
Oct 14 09:26:09 np0005486759.ooo.test sudo[154196]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:10 np0005486759.ooo.test sudo[154290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nknhpxcxyyxfwkknmayqgtsvrahxyuel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433969.7159626-156-34141302788845/AnsiballZ_systemd.py
Oct 14 09:26:10 np0005486759.ooo.test sudo[154290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:10 np0005486759.ooo.test python3.9[154292]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 14 09:26:10 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:26:10 np0005486759.ooo.test systemd-rc-local-generator[154323]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:26:10 np0005486759.ooo.test systemd-sysv-generator[154326]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:26:10 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:26:10 np0005486759.ooo.test sudo[154290]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:11 np0005486759.ooo.test sudo[154421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exhbxbaqhufhngzrsofptfhdinaoafuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433970.969621-166-258937750533213/AnsiballZ_stat.py
Oct 14 09:26:11 np0005486759.ooo.test sudo[154421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36151 DF PROTO=TCP SPT=58520 DPT=9100 SEQ=383057127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F815EC20000000001030307) 
Oct 14 09:26:11 np0005486759.ooo.test python3.9[154423]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:26:11 np0005486759.ooo.test sudo[154421]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:12 np0005486759.ooo.test sudo[154513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opntinwcohvvemzkroanfkmeotpsegxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433971.6831112-175-250048939802791/AnsiballZ_ini_file.py
Oct 14 09:26:12 np0005486759.ooo.test sudo[154513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28652 DF PROTO=TCP SPT=39922 DPT=9882 SEQ=3165921843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8162380000000001030307) 
Oct 14 09:26:12 np0005486759.ooo.test python3.9[154515]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:12 np0005486759.ooo.test sudo[154513]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:12 np0005486759.ooo.test sudo[154607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utlbmfcuxgnobqartvoqzppgtvdhgncc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433972.5411355-183-56844666126729/AnsiballZ_ini_file.py
Oct 14 09:26:12 np0005486759.ooo.test sudo[154607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:13 np0005486759.ooo.test python3.9[154609]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:13 np0005486759.ooo.test sudo[154607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28653 DF PROTO=TCP SPT=39922 DPT=9882 SEQ=3165921843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8166420000000001030307) 
Oct 14 09:26:13 np0005486759.ooo.test sudo[154699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rriaycixnawbyyqfdfqjdlhhleooerdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433973.2218432-191-196949927473079/AnsiballZ_ini_file.py
Oct 14 09:26:13 np0005486759.ooo.test sudo[154699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:13 np0005486759.ooo.test python3.9[154701]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:13 np0005486759.ooo.test sudo[154699]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:14 np0005486759.ooo.test sudo[154791]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfkuadooxkbecuquigzorduhhowppcgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433973.9532762-201-98756605414043/AnsiballZ_stat.py
Oct 14 09:26:14 np0005486759.ooo.test sudo[154791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:14 np0005486759.ooo.test python3.9[154793]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:26:14 np0005486759.ooo.test sudo[154791]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:14 np0005486759.ooo.test sudo[154864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umgbmvfmrvvkmtlztooghawrekrxsjfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433973.9532762-201-98756605414043/AnsiballZ_copy.py
Oct 14 09:26:14 np0005486759.ooo.test sudo[154864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:15 np0005486759.ooo.test python3.9[154866]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433973.9532762-201-98756605414043/.source _original_basename=.akbcr2z4 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:15 np0005486759.ooo.test sudo[154864]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:15 np0005486759.ooo.test sudo[154956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyweuhxxxyaslosakamuqemrgdwrdrsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433975.2520137-216-241045804287957/AnsiballZ_file.py
Oct 14 09:26:15 np0005486759.ooo.test sudo[154956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:15 np0005486759.ooo.test python3.9[154958]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:15 np0005486759.ooo.test sudo[154956]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:16 np0005486759.ooo.test sudo[155048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plfvjsjehgnpjiadfydkfuhknnaqbkil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433975.8366268-224-71967466767336/AnsiballZ_edpm_os_net_config_mappings.py
Oct 14 09:26:16 np0005486759.ooo.test sudo[155048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:16 np0005486759.ooo.test python3.9[155050]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 14 09:26:16 np0005486759.ooo.test sudo[155048]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:16 np0005486759.ooo.test sudo[155140]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymhsvkwnsvvqofukocohohxayveagkfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433976.7216208-233-140451892965218/AnsiballZ_file.py
Oct 14 09:26:17 np0005486759.ooo.test sudo[155140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51958 DF PROTO=TCP SPT=49786 DPT=9105 SEQ=4108457963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8175010000000001030307) 
Oct 14 09:26:17 np0005486759.ooo.test python3.9[155142]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:17 np0005486759.ooo.test sudo[155140]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:18 np0005486759.ooo.test sudo[155232]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyizyhclhcbfxpobmaydvygyvxmlenpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433978.456171-243-3179399828723/AnsiballZ_stat.py
Oct 14 09:26:18 np0005486759.ooo.test sudo[155232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:18 np0005486759.ooo.test python3.9[155234]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:26:18 np0005486759.ooo.test sudo[155232]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:19 np0005486759.ooo.test sudo[155305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juzjoxpddzcufqnvynihdtjiquxvzckn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433978.456171-243-3179399828723/AnsiballZ_copy.py
Oct 14 09:26:19 np0005486759.ooo.test sudo[155305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28655 DF PROTO=TCP SPT=39922 DPT=9882 SEQ=3165921843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F817E010000000001030307) 
Oct 14 09:26:19 np0005486759.ooo.test python3.9[155307]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433978.456171-243-3179399828723/.source.yaml _original_basename=.vvl58usb follow=False checksum=dbb14a2cc09088cbb44ee7b7b2a5e1145bb56ed5 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:19 np0005486759.ooo.test sudo[155305]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:20 np0005486759.ooo.test sudo[155397]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olxvgxzcfxetohjuvwffaggsunggmmhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433979.7854917-258-5786728520067/AnsiballZ_slurp.py
Oct 14 09:26:20 np0005486759.ooo.test sudo[155397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:20 np0005486759.ooo.test python3.9[155399]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 14 09:26:20 np0005486759.ooo.test sudo[155397]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:22 np0005486759.ooo.test sudo[155502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icnvabcbksobbkrbhmlyxmdjsoymduqi ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433981.567637-267-169490636033722/async_wrapper.py j346749023295 300 /home/zuul/.ansible/tmp/ansible-tmp-1760433981.567637-267-169490636033722/AnsiballZ_edpm_os_net_config.py _
Oct 14 09:26:22 np0005486759.ooo.test sudo[155502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:22 np0005486759.ooo.test ansible-async_wrapper.py[155504]: Invoked with j346749023295 300 /home/zuul/.ansible/tmp/ansible-tmp-1760433981.567637-267-169490636033722/AnsiballZ_edpm_os_net_config.py _
Oct 14 09:26:22 np0005486759.ooo.test ansible-async_wrapper.py[155507]: Starting module and watcher
Oct 14 09:26:22 np0005486759.ooo.test ansible-async_wrapper.py[155507]: Start watching 155508 (300)
Oct 14 09:26:22 np0005486759.ooo.test ansible-async_wrapper.py[155508]: Start module (155508)
Oct 14 09:26:22 np0005486759.ooo.test ansible-async_wrapper.py[155504]: Return async_wrapper task started.
Oct 14 09:26:22 np0005486759.ooo.test sudo[155502]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13232 DF PROTO=TCP SPT=35140 DPT=9102 SEQ=1120124049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F818A810000000001030307) 
Oct 14 09:26:22 np0005486759.ooo.test python3.9[155509]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Oct 14 09:26:23 np0005486759.ooo.test ansible-async_wrapper.py[155508]: Module complete (155508)
Oct 14 09:26:26 np0005486759.ooo.test sudo[155599]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwkcmquezcludrvlqwfbtiomhxtenzyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433985.621278-267-206050166043648/AnsiballZ_async_status.py
Oct 14 09:26:26 np0005486759.ooo.test sudo[155599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:26 np0005486759.ooo.test python3.9[155601]: ansible-ansible.legacy.async_status Invoked with jid=j346749023295.155504 mode=status _async_dir=/root/.ansible_async
Oct 14 09:26:26 np0005486759.ooo.test sudo[155599]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:26 np0005486759.ooo.test sudo[155658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frdfejhpmpejkpjihxpftqahzuxgthqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433985.621278-267-206050166043648/AnsiballZ_async_status.py
Oct 14 09:26:26 np0005486759.ooo.test sudo[155658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13233 DF PROTO=TCP SPT=35140 DPT=9102 SEQ=1120124049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F819A410000000001030307) 
Oct 14 09:26:26 np0005486759.ooo.test python3.9[155660]: ansible-ansible.legacy.async_status Invoked with jid=j346749023295.155504 mode=cleanup _async_dir=/root/.ansible_async
Oct 14 09:26:26 np0005486759.ooo.test sudo[155658]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:27 np0005486759.ooo.test sudo[155750]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtmioyfmranbfulignvsbdzxbqxiwxdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433986.999891-289-171263689834052/AnsiballZ_stat.py
Oct 14 09:26:27 np0005486759.ooo.test sudo[155750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:27 np0005486759.ooo.test python3.9[155752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:26:27 np0005486759.ooo.test sudo[155750]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:27 np0005486759.ooo.test ansible-async_wrapper.py[155507]: Done in kid B.
Oct 14 09:26:27 np0005486759.ooo.test sudo[155823]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkpyrhsgclewazcctglizpkiksgfjruw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433986.999891-289-171263689834052/AnsiballZ_copy.py
Oct 14 09:26:27 np0005486759.ooo.test sudo[155823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:27 np0005486759.ooo.test python3.9[155825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433986.999891-289-171263689834052/.source.returncode _original_basename=.8c_3v0du follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:27 np0005486759.ooo.test sudo[155823]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:28 np0005486759.ooo.test sudo[155915]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyykaipjcmsvycotqwelnxkjqgbrqryq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433988.1263783-305-1089894377631/AnsiballZ_stat.py
Oct 14 09:26:28 np0005486759.ooo.test sudo[155915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:28 np0005486759.ooo.test python3.9[155917]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:26:28 np0005486759.ooo.test sudo[155915]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:29 np0005486759.ooo.test sudo[155988]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yijqgzhxbammuckikwaxipqbotvicgun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433988.1263783-305-1089894377631/AnsiballZ_copy.py
Oct 14 09:26:29 np0005486759.ooo.test sudo[155988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:29 np0005486759.ooo.test python3.9[155990]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760433988.1263783-305-1089894377631/.source.cfg _original_basename=.8pktnqek follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:29 np0005486759.ooo.test sudo[155988]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:29 np0005486759.ooo.test sudo[156080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgvysebpbmgeffkjilakbtcfxluseuwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760433989.3221598-320-170465080466576/AnsiballZ_systemd.py
Oct 14 09:26:29 np0005486759.ooo.test sudo[156080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:29 np0005486759.ooo.test python3.9[156082]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:26:30 np0005486759.ooo.test systemd[1]: Reloading Network Manager...
Oct 14 09:26:30 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760433990.9374] audit: op="reload" arg="0" pid=156086 uid=0 result="success"
Oct 14 09:26:30 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760433990.9383] config: signal: SIGHUP (no changes from disk)
Oct 14 09:26:30 np0005486759.ooo.test systemd[1]: Reloaded Network Manager.
Oct 14 09:26:30 np0005486759.ooo.test sudo[156080]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:31 np0005486759.ooo.test sshd[152551]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:26:31 np0005486759.ooo.test systemd[1]: session-23.scope: Deactivated successfully.
Oct 14 09:26:31 np0005486759.ooo.test systemd[1]: session-23.scope: Consumed 35.426s CPU time.
Oct 14 09:26:31 np0005486759.ooo.test systemd-logind[759]: Session 23 logged out. Waiting for processes to exit.
Oct 14 09:26:31 np0005486759.ooo.test systemd-logind[759]: Removed session 23.
Oct 14 09:26:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42169 DF PROTO=TCP SPT=43184 DPT=9100 SEQ=2384750614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81B8170000000001030307) 
Oct 14 09:26:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42170 DF PROTO=TCP SPT=43184 DPT=9100 SEQ=2384750614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81BC010000000001030307) 
Oct 14 09:26:37 np0005486759.ooo.test sshd[156101]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:26:37 np0005486759.ooo.test sshd[156101]: Accepted publickey for zuul from 192.168.122.31 port 51050 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:26:37 np0005486759.ooo.test systemd-logind[759]: New session 24 of user zuul.
Oct 14 09:26:37 np0005486759.ooo.test systemd[1]: Started Session 24 of User zuul.
Oct 14 09:26:37 np0005486759.ooo.test sshd[156101]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:26:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42171 DF PROTO=TCP SPT=43184 DPT=9100 SEQ=2384750614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81C4010000000001030307) 
Oct 14 09:26:38 np0005486759.ooo.test python3.9[156194]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:26:39 np0005486759.ooo.test python3.9[156288]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:26:41 np0005486759.ooo.test python3.9[156441]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:26:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42172 DF PROTO=TCP SPT=43184 DPT=9100 SEQ=2384750614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81D3C10000000001030307) 
Oct 14 09:26:41 np0005486759.ooo.test sshd[156101]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:26:41 np0005486759.ooo.test systemd[1]: session-24.scope: Deactivated successfully.
Oct 14 09:26:41 np0005486759.ooo.test systemd[1]: session-24.scope: Consumed 2.092s CPU time.
Oct 14 09:26:41 np0005486759.ooo.test systemd-logind[759]: Session 24 logged out. Waiting for processes to exit.
Oct 14 09:26:41 np0005486759.ooo.test systemd-logind[759]: Removed session 24.
Oct 14 09:26:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25728 DF PROTO=TCP SPT=38740 DPT=9882 SEQ=1470002981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81D7680000000001030307) 
Oct 14 09:26:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25729 DF PROTO=TCP SPT=38740 DPT=9882 SEQ=1470002981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81DB810000000001030307) 
Oct 14 09:26:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54877 DF PROTO=TCP SPT=44428 DPT=9105 SEQ=2106809877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81EA420000000001030307) 
Oct 14 09:26:49 np0005486759.ooo.test sshd[156457]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:26:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25731 DF PROTO=TCP SPT=38740 DPT=9882 SEQ=1470002981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81F3410000000001030307) 
Oct 14 09:26:49 np0005486759.ooo.test sshd[156457]: Accepted publickey for zuul from 192.168.122.31 port 52316 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:26:49 np0005486759.ooo.test systemd-logind[759]: New session 25 of user zuul.
Oct 14 09:26:49 np0005486759.ooo.test systemd[1]: Started Session 25 of User zuul.
Oct 14 09:26:49 np0005486759.ooo.test sshd[156457]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:26:50 np0005486759.ooo.test python3.9[156550]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:26:51 np0005486759.ooo.test python3.9[156644]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:26:52 np0005486759.ooo.test sudo[156738]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndsbsuyxwciwqiqqmahtpyyvotlvvvei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434012.0682073-40-233619476754312/AnsiballZ_setup.py
Oct 14 09:26:52 np0005486759.ooo.test sudo[156738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11378 DF PROTO=TCP SPT=39658 DPT=9102 SEQ=1097146846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F81FF810000000001030307) 
Oct 14 09:26:52 np0005486759.ooo.test python3.9[156740]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:26:52 np0005486759.ooo.test sudo[156738]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:53 np0005486759.ooo.test sudo[156792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdgxhionisjghltqsqrfhhibutyzpcly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434012.0682073-40-233619476754312/AnsiballZ_dnf.py
Oct 14 09:26:53 np0005486759.ooo.test sudo[156792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:53 np0005486759.ooo.test python3.9[156794]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:26:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11379 DF PROTO=TCP SPT=39658 DPT=9102 SEQ=1097146846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F820F410000000001030307) 
Oct 14 09:26:56 np0005486759.ooo.test sudo[156792]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:57 np0005486759.ooo.test sudo[156886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnorbrytpunkenqxgrbtxupsoflrdfot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434016.9183931-52-122185076174738/AnsiballZ_setup.py
Oct 14 09:26:58 np0005486759.ooo.test sudo[156886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:58 np0005486759.ooo.test python3.9[156888]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:26:58 np0005486759.ooo.test sudo[156886]: pam_unix(sudo:session): session closed for user root
Oct 14 09:26:59 np0005486759.ooo.test sudo[157041]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egwezgzxfzkkcprjqhosivvycorznekr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434018.8246205-63-67310497511852/AnsiballZ_file.py
Oct 14 09:26:59 np0005486759.ooo.test sudo[157041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:26:59 np0005486759.ooo.test python3.9[157043]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:26:59 np0005486759.ooo.test sudo[157041]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:00 np0005486759.ooo.test sudo[157133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swgjhiliatnazcurkehzgftcqsnfcqkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434020.2462883-71-161160336866601/AnsiballZ_command.py
Oct 14 09:27:00 np0005486759.ooo.test sudo[157133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:00 np0005486759.ooo.test python3.9[157135]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:27:01 np0005486759.ooo.test sudo[157133]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:01 np0005486759.ooo.test sudo[157237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mznjcuiifcgsrjiswkljiynazlbxjuhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434021.218759-79-24664986138131/AnsiballZ_stat.py
Oct 14 09:27:01 np0005486759.ooo.test sudo[157237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:01 np0005486759.ooo.test python3.9[157239]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:01 np0005486759.ooo.test sudo[157237]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:02 np0005486759.ooo.test sudo[157285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otuojombtkdutrejzmyjmaqofnowpmvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434021.218759-79-24664986138131/AnsiballZ_file.py
Oct 14 09:27:02 np0005486759.ooo.test sudo[157285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:02 np0005486759.ooo.test python3.9[157287]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:02 np0005486759.ooo.test sudo[157285]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:02 np0005486759.ooo.test sudo[157377]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjmxaswpgtxoqgckuwdbrizipuaawqty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434022.5623608-91-226770637399858/AnsiballZ_stat.py
Oct 14 09:27:02 np0005486759.ooo.test sudo[157377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:03 np0005486759.ooo.test python3.9[157379]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:03 np0005486759.ooo.test sudo[157377]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:03 np0005486759.ooo.test sudo[157425]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aodfvyepzffywvqmayvkdvtbpwkzvclv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434022.5623608-91-226770637399858/AnsiballZ_file.py
Oct 14 09:27:03 np0005486759.ooo.test sudo[157425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:03 np0005486759.ooo.test python3.9[157427]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:03 np0005486759.ooo.test sudo[157425]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25035 DF PROTO=TCP SPT=36778 DPT=9100 SEQ=3295955974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F822D470000000001030307) 
Oct 14 09:27:04 np0005486759.ooo.test sudo[157517]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmbekwduofckxvvrcxldujslacqaivom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434023.84841-104-233415327903569/AnsiballZ_ini_file.py
Oct 14 09:27:04 np0005486759.ooo.test sudo[157517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:04 np0005486759.ooo.test python3.9[157519]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:04 np0005486759.ooo.test sudo[157517]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:04 np0005486759.ooo.test sudo[157609]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxzkumzquaqqrijkghmluflytqykeltg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434024.6883178-104-63703681099211/AnsiballZ_ini_file.py
Oct 14 09:27:04 np0005486759.ooo.test sudo[157609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:05 np0005486759.ooo.test python3.9[157611]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:05 np0005486759.ooo.test sudo[157609]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25036 DF PROTO=TCP SPT=36778 DPT=9100 SEQ=3295955974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8231410000000001030307) 
Oct 14 09:27:05 np0005486759.ooo.test sudo[157701]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbwugzigxnvsyjhbhhseecrxgusyeohq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434025.348223-104-73850246752979/AnsiballZ_ini_file.py
Oct 14 09:27:05 np0005486759.ooo.test sudo[157701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:05 np0005486759.ooo.test python3.9[157703]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:05 np0005486759.ooo.test sudo[157701]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:06 np0005486759.ooo.test sudo[157793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhtsnerqimspgyeevurhopiawwgfxdsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434025.9963686-104-56259084262855/AnsiballZ_ini_file.py
Oct 14 09:27:06 np0005486759.ooo.test sudo[157793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:06 np0005486759.ooo.test python3.9[157795]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:06 np0005486759.ooo.test sudo[157793]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:06 np0005486759.ooo.test sudo[157885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkhcdbbscnnssddsbnysyflxleuxmzoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434026.731931-135-250381715920970/AnsiballZ_dnf.py
Oct 14 09:27:06 np0005486759.ooo.test sudo[157885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:07 np0005486759.ooo.test python3.9[157887]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:27:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25037 DF PROTO=TCP SPT=36778 DPT=9100 SEQ=3295955974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8239420000000001030307) 
Oct 14 09:27:10 np0005486759.ooo.test sudo[157885]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25038 DF PROTO=TCP SPT=36778 DPT=9100 SEQ=3295955974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8249020000000001030307) 
Oct 14 09:27:11 np0005486759.ooo.test sudo[157979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlzijqhncveyzgbqzhdtrvuchrtxdpcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434031.2271676-146-199265963390445/AnsiballZ_setup.py
Oct 14 09:27:11 np0005486759.ooo.test sudo[157979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:11 np0005486759.ooo.test python3.9[157981]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:27:11 np0005486759.ooo.test sudo[157979]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35109 DF PROTO=TCP SPT=36898 DPT=9882 SEQ=1320743266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F824C980000000001030307) 
Oct 14 09:27:12 np0005486759.ooo.test sudo[158073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtojfpanlzeikajaqlkgycixehcfrdgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434032.030594-154-193417210569616/AnsiballZ_stat.py
Oct 14 09:27:12 np0005486759.ooo.test sudo[158073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:12 np0005486759.ooo.test python3.9[158075]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:27:12 np0005486759.ooo.test sudo[158073]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35110 DF PROTO=TCP SPT=36898 DPT=9882 SEQ=1320743266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8250810000000001030307) 
Oct 14 09:27:14 np0005486759.ooo.test sudo[158165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcmkvnwcfoevzekskcuqahkkholkkyaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434033.691238-163-48608104423869/AnsiballZ_stat.py
Oct 14 09:27:14 np0005486759.ooo.test sudo[158165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:14 np0005486759.ooo.test python3.9[158167]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:27:14 np0005486759.ooo.test sudo[158165]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:14 np0005486759.ooo.test sudo[158257]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcaiwfuivuusaeiqlykqjssdywqnnbyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434034.509416-173-60657234758222/AnsiballZ_service_facts.py
Oct 14 09:27:15 np0005486759.ooo.test sudo[158257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:15 np0005486759.ooo.test python3.9[158259]: ansible-service_facts Invoked
Oct 14 09:27:15 np0005486759.ooo.test network[158276]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:27:15 np0005486759.ooo.test network[158277]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:27:15 np0005486759.ooo.test network[158278]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:27:16 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:27:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3136 DF PROTO=TCP SPT=52866 DPT=9105 SEQ=566367447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F825F410000000001030307) 
Oct 14 09:27:18 np0005486759.ooo.test sudo[158257]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35112 DF PROTO=TCP SPT=36898 DPT=9882 SEQ=1320743266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8268420000000001030307) 
Oct 14 09:27:19 np0005486759.ooo.test sudo[158489]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rihyxobltblznrelysxtolbumdhwynts ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1760434039.3386931-186-260229594306844/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1760434039.3386931-186-260229594306844/args
Oct 14 09:27:19 np0005486759.ooo.test sudo[158489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:19 np0005486759.ooo.test sudo[158489]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:20 np0005486759.ooo.test sudo[158596]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgqjdjstcxcbpntvvisbqxzvduzwxpwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434040.035376-197-110011051833842/AnsiballZ_dnf.py
Oct 14 09:27:20 np0005486759.ooo.test sudo[158596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:20 np0005486759.ooo.test python3.9[158598]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:27:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17835 DF PROTO=TCP SPT=47810 DPT=9102 SEQ=2959699051 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8274C10000000001030307) 
Oct 14 09:27:23 np0005486759.ooo.test sudo[158596]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:24 np0005486759.ooo.test sudo[158690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llhlcvebtkgothxqkuiniqanepprkrvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434044.2544796-210-262608732941108/AnsiballZ_package_facts.py
Oct 14 09:27:24 np0005486759.ooo.test sudo[158690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:25 np0005486759.ooo.test python3.9[158692]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 14 09:27:25 np0005486759.ooo.test sudo[158690]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:26 np0005486759.ooo.test sudo[158782]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmkxcvanwhqqqwyluoyaordtfnvizejl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434046.105096-220-260229975258632/AnsiballZ_stat.py
Oct 14 09:27:26 np0005486759.ooo.test sudo[158782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17836 DF PROTO=TCP SPT=47810 DPT=9102 SEQ=2959699051 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8284820000000001030307) 
Oct 14 09:27:26 np0005486759.ooo.test python3.9[158784]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:26 np0005486759.ooo.test sudo[158782]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:27 np0005486759.ooo.test sudo[158857]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wogfatttavdjokkinpexwhemgpttcmxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434046.105096-220-260229975258632/AnsiballZ_copy.py
Oct 14 09:27:27 np0005486759.ooo.test sudo[158857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:27 np0005486759.ooo.test python3.9[158859]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434046.105096-220-260229975258632/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:27 np0005486759.ooo.test sudo[158857]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:28 np0005486759.ooo.test sudo[158951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnzamkmzqoktbqthagjyusppudctppeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434048.3608305-235-75824223578816/AnsiballZ_stat.py
Oct 14 09:27:28 np0005486759.ooo.test sudo[158951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:28 np0005486759.ooo.test python3.9[158953]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:28 np0005486759.ooo.test sudo[158951]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:29 np0005486759.ooo.test sudo[159026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzvmulobbhptkxzxjlvgkydezfyeklde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434048.3608305-235-75824223578816/AnsiballZ_copy.py
Oct 14 09:27:29 np0005486759.ooo.test sudo[159026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:29 np0005486759.ooo.test python3.9[159028]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434048.3608305-235-75824223578816/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:29 np0005486759.ooo.test sudo[159026]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:30 np0005486759.ooo.test sudo[159120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afyptoefngmopwxatibafydmlcbunypr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434050.0950341-256-146997916479491/AnsiballZ_lineinfile.py
Oct 14 09:27:30 np0005486759.ooo.test sudo[159120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:30 np0005486759.ooo.test python3.9[159122]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:30 np0005486759.ooo.test sudo[159120]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:31 np0005486759.ooo.test sudo[159214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqmdsotggxqlmoywlvnmharzkddwiqsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434051.3323169-271-111920641239184/AnsiballZ_setup.py
Oct 14 09:27:31 np0005486759.ooo.test sudo[159214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:31 np0005486759.ooo.test python3.9[159216]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:27:32 np0005486759.ooo.test sudo[159214]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:32 np0005486759.ooo.test sudo[159268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erszkcvcnkpvycjdyuumshybkqcksjpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434051.3323169-271-111920641239184/AnsiballZ_systemd.py
Oct 14 09:27:32 np0005486759.ooo.test sudo[159268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:33 np0005486759.ooo.test python3.9[159270]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:27:33 np0005486759.ooo.test sudo[159268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:33 np0005486759.ooo.test sudo[159362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbrqjxyhtgovdftmusexfqgnmfsohxkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434053.6684434-287-276642149942654/AnsiballZ_setup.py
Oct 14 09:27:33 np0005486759.ooo.test sudo[159362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=970 DF PROTO=TCP SPT=59548 DPT=9100 SEQ=1689883406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82A2780000000001030307) 
Oct 14 09:27:34 np0005486759.ooo.test python3.9[159364]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:27:34 np0005486759.ooo.test sudo[159362]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:34 np0005486759.ooo.test sudo[159416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vauuxinygybrfipigstlugrfssnafrur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434053.6684434-287-276642149942654/AnsiballZ_systemd.py
Oct 14 09:27:34 np0005486759.ooo.test sudo[159416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:35 np0005486759.ooo.test python3.9[159418]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:27:35 np0005486759.ooo.test chronyd[33790]: chronyd exiting
Oct 14 09:27:35 np0005486759.ooo.test systemd[1]: Stopping NTP client/server...
Oct 14 09:27:35 np0005486759.ooo.test systemd[1]: chronyd.service: Deactivated successfully.
Oct 14 09:27:35 np0005486759.ooo.test systemd[1]: Stopped NTP client/server.
Oct 14 09:27:35 np0005486759.ooo.test systemd[1]: Starting NTP client/server...
Oct 14 09:27:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=971 DF PROTO=TCP SPT=59548 DPT=9100 SEQ=1689883406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82A6820000000001030307) 
Oct 14 09:27:35 np0005486759.ooo.test chronyd[159426]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 14 09:27:35 np0005486759.ooo.test chronyd[159426]: Frequency -30.672 +/- 0.288 ppm read from /var/lib/chrony/drift
Oct 14 09:27:35 np0005486759.ooo.test chronyd[159426]: Loaded seccomp filter (level 2)
Oct 14 09:27:35 np0005486759.ooo.test systemd[1]: Started NTP client/server.
Oct 14 09:27:35 np0005486759.ooo.test sudo[159416]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:35 np0005486759.ooo.test sshd[156457]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:27:35 np0005486759.ooo.test systemd[1]: session-25.scope: Deactivated successfully.
Oct 14 09:27:35 np0005486759.ooo.test systemd[1]: session-25.scope: Consumed 27.808s CPU time.
Oct 14 09:27:35 np0005486759.ooo.test systemd-logind[759]: Session 25 logged out. Waiting for processes to exit.
Oct 14 09:27:35 np0005486759.ooo.test systemd-logind[759]: Removed session 25.
Oct 14 09:27:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=972 DF PROTO=TCP SPT=59548 DPT=9100 SEQ=1689883406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82AE810000000001030307) 
Oct 14 09:27:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=973 DF PROTO=TCP SPT=59548 DPT=9100 SEQ=1689883406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82BE420000000001030307) 
Oct 14 09:27:41 np0005486759.ooo.test sshd[159442]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:27:41 np0005486759.ooo.test sshd[159442]: Accepted publickey for zuul from 192.168.122.31 port 54672 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:27:41 np0005486759.ooo.test systemd-logind[759]: New session 26 of user zuul.
Oct 14 09:27:41 np0005486759.ooo.test systemd[1]: Started Session 26 of User zuul.
Oct 14 09:27:41 np0005486759.ooo.test sshd[159442]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:27:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33410 DF PROTO=TCP SPT=44448 DPT=9882 SEQ=1108843912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82C1C80000000001030307) 
Oct 14 09:27:43 np0005486759.ooo.test python3.9[159535]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:27:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33411 DF PROTO=TCP SPT=44448 DPT=9882 SEQ=1108843912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82C5C20000000001030307) 
Oct 14 09:27:44 np0005486759.ooo.test sudo[159629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfcgjemjmravdfcbjmuqkiuplwdeujzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434063.7048647-33-259905654645113/AnsiballZ_file.py
Oct 14 09:27:44 np0005486759.ooo.test sudo[159629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:44 np0005486759.ooo.test python3.9[159631]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:44 np0005486759.ooo.test sudo[159629]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:44 np0005486759.ooo.test sudo[159734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjyvpsbcxydjifdacjwwxozmchgbxnbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434064.4718957-41-233434991294034/AnsiballZ_stat.py
Oct 14 09:27:44 np0005486759.ooo.test sudo[159734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:45 np0005486759.ooo.test python3.9[159736]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:45 np0005486759.ooo.test sudo[159734]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:45 np0005486759.ooo.test sudo[159782]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mligpfxnhtucfkzfhzvofoinddjxiidc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434064.4718957-41-233434991294034/AnsiballZ_file.py
Oct 14 09:27:45 np0005486759.ooo.test sudo[159782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:45 np0005486759.ooo.test python3.9[159784]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.c5ppmxr9 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:45 np0005486759.ooo.test sudo[159782]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:46 np0005486759.ooo.test sudo[159874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yupqyozdoxgrfaftvfepvejwyxhayabd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434065.9034586-61-53953139482054/AnsiballZ_stat.py
Oct 14 09:27:46 np0005486759.ooo.test sudo[159874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:46 np0005486759.ooo.test python3.9[159876]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:46 np0005486759.ooo.test sudo[159874]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:46 np0005486759.ooo.test sudo[159949]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmzflstnmfsqnnmmlijorujvqwdgnpid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434065.9034586-61-53953139482054/AnsiballZ_copy.py
Oct 14 09:27:46 np0005486759.ooo.test sudo[159949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:47 np0005486759.ooo.test python3.9[159951]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434065.9034586-61-53953139482054/.source _original_basename=.b13lwbrj follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11633 DF PROTO=TCP SPT=60220 DPT=9105 SEQ=3125987240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82D4820000000001030307) 
Oct 14 09:27:47 np0005486759.ooo.test sudo[159949]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:47 np0005486759.ooo.test sudo[160041]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onecmihenesamvtyujgqlnrwdcnkukof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434067.392601-77-141946625030082/AnsiballZ_file.py
Oct 14 09:27:47 np0005486759.ooo.test sudo[160041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:47 np0005486759.ooo.test python3.9[160043]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:47 np0005486759.ooo.test sudo[160041]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:48 np0005486759.ooo.test sudo[160133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzwoorudonhmzqjmmvxsijdttrobufwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434068.054341-85-106735120533002/AnsiballZ_stat.py
Oct 14 09:27:48 np0005486759.ooo.test sudo[160133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:48 np0005486759.ooo.test python3.9[160135]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:48 np0005486759.ooo.test sudo[160133]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:48 np0005486759.ooo.test sudo[160206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnzbngewqwdwogptnlyxwpaakuamnqst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434068.054341-85-106735120533002/AnsiballZ_copy.py
Oct 14 09:27:48 np0005486759.ooo.test sudo[160206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:49 np0005486759.ooo.test python3.9[160208]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434068.054341-85-106735120533002/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:49 np0005486759.ooo.test sudo[160206]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33413 DF PROTO=TCP SPT=44448 DPT=9882 SEQ=1108843912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82DD810000000001030307) 
Oct 14 09:27:49 np0005486759.ooo.test sudo[160298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrdfouajnhpfqqhpdhdegytkfxoauimz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434069.184954-85-252543408014504/AnsiballZ_stat.py
Oct 14 09:27:49 np0005486759.ooo.test sudo[160298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:49 np0005486759.ooo.test python3.9[160300]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:49 np0005486759.ooo.test sudo[160298]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:50 np0005486759.ooo.test sudo[160371]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-affpkeasmidxgvqvlfgojluxmjxamorr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434069.184954-85-252543408014504/AnsiballZ_copy.py
Oct 14 09:27:50 np0005486759.ooo.test sudo[160371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:50 np0005486759.ooo.test python3.9[160373]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434069.184954-85-252543408014504/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:27:50 np0005486759.ooo.test sudo[160371]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:50 np0005486759.ooo.test sudo[160463]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zglxgubjybnwuyfgmefztsqvzvirnwqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434070.4526544-114-177928443456124/AnsiballZ_file.py
Oct 14 09:27:50 np0005486759.ooo.test sudo[160463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:50 np0005486759.ooo.test python3.9[160465]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:50 np0005486759.ooo.test sudo[160463]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:51 np0005486759.ooo.test sudo[160555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suxjtvmaaqxiexaxwrogxutidticztqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434071.0921276-122-249404490462608/AnsiballZ_stat.py
Oct 14 09:27:51 np0005486759.ooo.test sudo[160555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:51 np0005486759.ooo.test python3.9[160557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:51 np0005486759.ooo.test sudo[160555]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:51 np0005486759.ooo.test sudo[160628]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slqsmruxfeqaecktcvdmmatuuvgeqfhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434071.0921276-122-249404490462608/AnsiballZ_copy.py
Oct 14 09:27:51 np0005486759.ooo.test sudo[160628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:52 np0005486759.ooo.test python3.9[160630]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434071.0921276-122-249404490462608/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:52 np0005486759.ooo.test sudo[160628]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38025 DF PROTO=TCP SPT=36704 DPT=9102 SEQ=1521860109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82EA010000000001030307) 
Oct 14 09:27:53 np0005486759.ooo.test sudo[160720]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrmcyudqsddspcdcwawinyirmeqhdkbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434073.244167-137-192906922630970/AnsiballZ_stat.py
Oct 14 09:27:53 np0005486759.ooo.test sudo[160720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:53 np0005486759.ooo.test python3.9[160722]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:53 np0005486759.ooo.test sudo[160720]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:54 np0005486759.ooo.test sudo[160793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsljdftpkhmqnzddcvlwjyejsigpxmve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434073.244167-137-192906922630970/AnsiballZ_copy.py
Oct 14 09:27:54 np0005486759.ooo.test sudo[160793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:54 np0005486759.ooo.test python3.9[160795]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434073.244167-137-192906922630970/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:54 np0005486759.ooo.test sudo[160793]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:55 np0005486759.ooo.test sudo[160885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxyotslzzassvuuknsrpxjkbeyuhztds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434074.4065669-152-23662312804296/AnsiballZ_systemd.py
Oct 14 09:27:55 np0005486759.ooo.test sudo[160885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:55 np0005486759.ooo.test python3.9[160887]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:27:55 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:27:55 np0005486759.ooo.test systemd-sysv-generator[160911]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:27:55 np0005486759.ooo.test systemd-rc-local-generator[160904]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:27:56 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:27:56 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:27:56 np0005486759.ooo.test systemd-rc-local-generator[160953]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:27:56 np0005486759.ooo.test systemd-sysv-generator[160956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:27:56 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:27:56 np0005486759.ooo.test systemd[1]: Starting EDPM Container Shutdown...
Oct 14 09:27:56 np0005486759.ooo.test systemd[1]: Finished EDPM Container Shutdown.
Oct 14 09:27:56 np0005486759.ooo.test sudo[160885]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38026 DF PROTO=TCP SPT=36704 DPT=9102 SEQ=1521860109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F82F9C10000000001030307) 
Oct 14 09:27:56 np0005486759.ooo.test sudo[161055]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olhsifseteaxczidxyjugdyoaswbwzxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434076.6522863-160-22652307807444/AnsiballZ_stat.py
Oct 14 09:27:56 np0005486759.ooo.test sudo[161055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:57 np0005486759.ooo.test python3.9[161057]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:57 np0005486759.ooo.test sudo[161055]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:57 np0005486759.ooo.test sudo[161128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qridtrenadrqvxwhwighdqyjfztpnlcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434076.6522863-160-22652307807444/AnsiballZ_copy.py
Oct 14 09:27:57 np0005486759.ooo.test sudo[161128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:57 np0005486759.ooo.test python3.9[161130]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434076.6522863-160-22652307807444/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:57 np0005486759.ooo.test sudo[161128]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:58 np0005486759.ooo.test sudo[161220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciynzbozbfvtglsosydmafuegdjggzfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434077.935141-175-275966237379733/AnsiballZ_stat.py
Oct 14 09:27:58 np0005486759.ooo.test sudo[161220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:58 np0005486759.ooo.test python3.9[161222]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:27:58 np0005486759.ooo.test sudo[161220]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:58 np0005486759.ooo.test sudo[161293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pckfzlvyyswnjfeqljfspdpftmynomwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434077.935141-175-275966237379733/AnsiballZ_copy.py
Oct 14 09:27:58 np0005486759.ooo.test sudo[161293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:58 np0005486759.ooo.test python3.9[161295]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434077.935141-175-275966237379733/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:27:58 np0005486759.ooo.test sudo[161293]: pam_unix(sudo:session): session closed for user root
Oct 14 09:27:59 np0005486759.ooo.test sudo[161385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eaulkrymdwwovbvdfahknjlgyzdozpes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434079.143585-190-71562299032224/AnsiballZ_systemd.py
Oct 14 09:27:59 np0005486759.ooo.test sudo[161385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:27:59 np0005486759.ooo.test python3.9[161387]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:27:59 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:27:59 np0005486759.ooo.test systemd-rc-local-generator[161406]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:27:59 np0005486759.ooo.test systemd-sysv-generator[161414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:27:59 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:28:00 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:28:00 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:28:00 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:28:00 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:28:00 np0005486759.ooo.test sudo[161385]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:00 np0005486759.ooo.test python3.9[161519]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:28:01 np0005486759.ooo.test network[161536]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:28:01 np0005486759.ooo.test network[161537]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:28:01 np0005486759.ooo.test network[161538]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:28:03 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:28:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28322 DF PROTO=TCP SPT=36698 DPT=9100 SEQ=4072146643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8317A80000000001030307) 
Oct 14 09:28:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28323 DF PROTO=TCP SPT=36698 DPT=9100 SEQ=4072146643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F831BC10000000001030307) 
Oct 14 09:28:06 np0005486759.ooo.test sudo[161736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aibrrkquufmbebuskzxxgewpedpcnlva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434086.6385517-216-13920570463963/AnsiballZ_stat.py
Oct 14 09:28:06 np0005486759.ooo.test sudo[161736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:07 np0005486759.ooo.test python3.9[161738]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:28:07 np0005486759.ooo.test sudo[161736]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28324 DF PROTO=TCP SPT=36698 DPT=9100 SEQ=4072146643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8323C10000000001030307) 
Oct 14 09:28:07 np0005486759.ooo.test sudo[161811]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvnkstvrezwhlmebbsfizpquejjwpogy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434086.6385517-216-13920570463963/AnsiballZ_copy.py
Oct 14 09:28:07 np0005486759.ooo.test sudo[161811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:07 np0005486759.ooo.test python3.9[161813]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434086.6385517-216-13920570463963/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:28:07 np0005486759.ooo.test sudo[161811]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:08 np0005486759.ooo.test python3.9[161904]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:28:08 np0005486759.ooo.test polkitd[1035]: Registered Authentication Agent for unix-process:161906:1010914 (system bus name :1.1789 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Oct 14 09:28:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28325 DF PROTO=TCP SPT=36698 DPT=9100 SEQ=4072146643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8333810000000001030307) 
Oct 14 09:28:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42771 DF PROTO=TCP SPT=39752 DPT=9882 SEQ=3322238937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8336F70000000001030307) 
Oct 14 09:28:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42772 DF PROTO=TCP SPT=39752 DPT=9882 SEQ=3322238937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F833B010000000001030307) 
Oct 14 09:28:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57750 DF PROTO=TCP SPT=58322 DPT=9105 SEQ=2463467476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8349C10000000001030307) 
Oct 14 09:28:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42774 DF PROTO=TCP SPT=39752 DPT=9882 SEQ=3322238937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8352C10000000001030307) 
Oct 14 09:28:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31108 DF PROTO=TCP SPT=58170 DPT=9102 SEQ=1134910090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F835F410000000001030307) 
Oct 14 09:28:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31109 DF PROTO=TCP SPT=58170 DPT=9102 SEQ=1134910090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F836F010000000001030307) 
Oct 14 09:28:34 np0005486759.ooo.test polkitd[1035]: Unregistered Authentication Agent for unix-process:161906:1010914 (system bus name :1.1789, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Oct 14 09:28:34 np0005486759.ooo.test polkit-agent-helper-1[161918]: pam_unix(polkit-1:auth): conversation failed
Oct 14 09:28:34 np0005486759.ooo.test polkit-agent-helper-1[161918]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Oct 14 09:28:34 np0005486759.ooo.test polkitd[1035]: Operator of unix-process:161906:1010914 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.1788 [<unknown>] (owned by unix-user:zuul)
Oct 14 09:28:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46361 DF PROTO=TCP SPT=55552 DPT=9100 SEQ=2093651792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F838CD80000000001030307) 
Oct 14 09:28:34 np0005486759.ooo.test sshd[159442]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:28:34 np0005486759.ooo.test systemd[1]: session-26.scope: Deactivated successfully.
Oct 14 09:28:34 np0005486759.ooo.test systemd[1]: session-26.scope: Consumed 13.796s CPU time.
Oct 14 09:28:34 np0005486759.ooo.test systemd-logind[759]: Session 26 logged out. Waiting for processes to exit.
Oct 14 09:28:34 np0005486759.ooo.test systemd-logind[759]: Removed session 26.
Oct 14 09:28:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46362 DF PROTO=TCP SPT=55552 DPT=9100 SEQ=2093651792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8390C10000000001030307) 
Oct 14 09:28:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46363 DF PROTO=TCP SPT=55552 DPT=9100 SEQ=2093651792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8398C10000000001030307) 
Oct 14 09:28:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46364 DF PROTO=TCP SPT=55552 DPT=9100 SEQ=2093651792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F83A8810000000001030307) 
Oct 14 09:28:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21287 DF PROTO=TCP SPT=59476 DPT=9882 SEQ=2736415660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F83AC280000000001030307) 
Oct 14 09:28:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21288 DF PROTO=TCP SPT=59476 DPT=9882 SEQ=2736415660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F83B0410000000001030307) 
Oct 14 09:28:46 np0005486759.ooo.test sshd[161934]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:28:46 np0005486759.ooo.test sshd[161934]: Accepted publickey for zuul from 192.168.122.30 port 53344 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:28:46 np0005486759.ooo.test systemd-logind[759]: New session 27 of user zuul.
Oct 14 09:28:46 np0005486759.ooo.test systemd[1]: Started Session 27 of User zuul.
Oct 14 09:28:46 np0005486759.ooo.test sshd[161934]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:28:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61107 DF PROTO=TCP SPT=49726 DPT=9105 SEQ=472837576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F83BF010000000001030307) 
Oct 14 09:28:47 np0005486759.ooo.test python3.9[162027]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:28:48 np0005486759.ooo.test sudo[162121]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knayobybugsjfghzgqegxkxeymoparhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434128.2365017-33-84653894796350/AnsiballZ_file.py
Oct 14 09:28:48 np0005486759.ooo.test sudo[162121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:48 np0005486759.ooo.test python3.9[162123]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:28:48 np0005486759.ooo.test sudo[162121]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21290 DF PROTO=TCP SPT=59476 DPT=9882 SEQ=2736415660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F83C8010000000001030307) 
Oct 14 09:28:50 np0005486759.ooo.test sudo[162226]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pshpclffzvnizezrpdlervhltlneeofu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434129.7371416-41-37657781070010/AnsiballZ_stat.py
Oct 14 09:28:50 np0005486759.ooo.test sudo[162226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:50 np0005486759.ooo.test python3.9[162228]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:28:50 np0005486759.ooo.test sudo[162226]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:50 np0005486759.ooo.test sudo[162274]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kudrqxjowwmsvlgaaqgegxyzbwufgjld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434129.7371416-41-37657781070010/AnsiballZ_file.py
Oct 14 09:28:50 np0005486759.ooo.test sudo[162274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:50 np0005486759.ooo.test python3.9[162276]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.vu2gczf9 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:28:50 np0005486759.ooo.test sudo[162274]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:52 np0005486759.ooo.test sudo[162366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwhwuedyxdadcqvsliipfhthrgquhuyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434132.1586072-61-180874108476745/AnsiballZ_stat.py
Oct 14 09:28:52 np0005486759.ooo.test sudo[162366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12413 DF PROTO=TCP SPT=35532 DPT=9102 SEQ=3457535710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F83D4810000000001030307) 
Oct 14 09:28:52 np0005486759.ooo.test python3.9[162368]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:28:52 np0005486759.ooo.test sudo[162366]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:52 np0005486759.ooo.test sudo[162414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvwlknzbshbauhhcmnxpxaicwqruultz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434132.1586072-61-180874108476745/AnsiballZ_file.py
Oct 14 09:28:52 np0005486759.ooo.test sudo[162414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:53 np0005486759.ooo.test python3.9[162416]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.223qdh63 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:28:53 np0005486759.ooo.test sudo[162414]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:53 np0005486759.ooo.test sudo[162506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksxshhxqrtnenwaligzktehwywcilynj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434133.351761-74-111413666206820/AnsiballZ_file.py
Oct 14 09:28:53 np0005486759.ooo.test sudo[162506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:53 np0005486759.ooo.test python3.9[162508]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:28:53 np0005486759.ooo.test sudo[162506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:54 np0005486759.ooo.test sudo[162599]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejrqvagcxrxdxyshwjenvvzsksfjnyfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434134.038524-82-23991595953454/AnsiballZ_stat.py
Oct 14 09:28:54 np0005486759.ooo.test sudo[162599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:54 np0005486759.ooo.test python3.9[162601]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:28:54 np0005486759.ooo.test sudo[162599]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:54 np0005486759.ooo.test sudo[162647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdawbhaenwozqoybwfudbjtpvsccdkeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434134.038524-82-23991595953454/AnsiballZ_file.py
Oct 14 09:28:54 np0005486759.ooo.test sudo[162647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:55 np0005486759.ooo.test python3.9[162649]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:28:55 np0005486759.ooo.test sudo[162647]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:55 np0005486759.ooo.test sudo[162739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kintclkxqyaqdxjkcjyvnwuaxmjzsaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434135.2116637-82-55920849748072/AnsiballZ_stat.py
Oct 14 09:28:55 np0005486759.ooo.test sudo[162739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:55 np0005486759.ooo.test python3.9[162741]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:28:55 np0005486759.ooo.test sudo[162739]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:55 np0005486759.ooo.test sudo[162787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nectpbhlzxvmdhtwkqsesdmojcrnaslk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434135.2116637-82-55920849748072/AnsiballZ_file.py
Oct 14 09:28:55 np0005486759.ooo.test sudo[162787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:56 np0005486759.ooo.test python3.9[162789]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:28:56 np0005486759.ooo.test sudo[162787]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12414 DF PROTO=TCP SPT=35532 DPT=9102 SEQ=3457535710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F83E4410000000001030307) 
Oct 14 09:28:56 np0005486759.ooo.test sudo[162879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqsymjaherrrrujtgedwckxkrehyoixt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434136.39649-105-29587072443101/AnsiballZ_file.py
Oct 14 09:28:56 np0005486759.ooo.test sudo[162879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:56 np0005486759.ooo.test python3.9[162881]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:28:56 np0005486759.ooo.test sudo[162879]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:57 np0005486759.ooo.test sudo[162971]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwankjtwsbebilvxephngriufkhsxwxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434137.0333328-113-53581542021171/AnsiballZ_stat.py
Oct 14 09:28:57 np0005486759.ooo.test sudo[162971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:57 np0005486759.ooo.test python3.9[162973]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:28:57 np0005486759.ooo.test sudo[162971]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:57 np0005486759.ooo.test sudo[163019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uapzuvwsiyeuwleyavcdhnbgksbykhbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434137.0333328-113-53581542021171/AnsiballZ_file.py
Oct 14 09:28:57 np0005486759.ooo.test sudo[163019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:58 np0005486759.ooo.test python3.9[163021]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:28:58 np0005486759.ooo.test sudo[163019]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:58 np0005486759.ooo.test sudo[163111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opsoxsvovbapxbacvruhojomivdykbew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434138.1757288-125-231532773381044/AnsiballZ_stat.py
Oct 14 09:28:58 np0005486759.ooo.test sudo[163111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:58 np0005486759.ooo.test python3.9[163113]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:28:58 np0005486759.ooo.test sudo[163111]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:58 np0005486759.ooo.test sudo[163159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enrdbmonwvitzzoxgoqwtunpydnnwntv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434138.1757288-125-231532773381044/AnsiballZ_file.py
Oct 14 09:28:58 np0005486759.ooo.test sudo[163159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:28:59 np0005486759.ooo.test python3.9[163161]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:28:59 np0005486759.ooo.test sudo[163159]: pam_unix(sudo:session): session closed for user root
Oct 14 09:28:59 np0005486759.ooo.test sudo[163251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pospahwqrbwhrmzewluyqnadkvlaiusw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434139.2124956-137-85829861984976/AnsiballZ_systemd.py
Oct 14 09:28:59 np0005486759.ooo.test sudo[163251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:00 np0005486759.ooo.test python3.9[163253]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:29:00 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:29:00 np0005486759.ooo.test systemd-rc-local-generator[163275]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:29:00 np0005486759.ooo.test systemd-sysv-generator[163281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:29:00 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:29:00 np0005486759.ooo.test sudo[163251]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:00 np0005486759.ooo.test sudo[163381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhrgvijdzbgqwijwxhmqyqimqykvnypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434140.6540637-145-82347105365034/AnsiballZ_stat.py
Oct 14 09:29:00 np0005486759.ooo.test sudo[163381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:01 np0005486759.ooo.test python3.9[163383]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:01 np0005486759.ooo.test sudo[163381]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:01 np0005486759.ooo.test sudo[163429]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjmlhwyzclcizmqbgldjcuhysetomydd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434140.6540637-145-82347105365034/AnsiballZ_file.py
Oct 14 09:29:01 np0005486759.ooo.test sudo[163429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:01 np0005486759.ooo.test python3.9[163431]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:01 np0005486759.ooo.test sudo[163429]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:02 np0005486759.ooo.test sudo[163521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caguunapudyroqvrvnofelvenbowhymf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434141.7257693-157-242431679386562/AnsiballZ_stat.py
Oct 14 09:29:02 np0005486759.ooo.test sudo[163521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:02 np0005486759.ooo.test python3.9[163523]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:02 np0005486759.ooo.test sudo[163521]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:02 np0005486759.ooo.test sudo[163569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-doptsumchdrfuunziwqnykojmhmvcyec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434141.7257693-157-242431679386562/AnsiballZ_file.py
Oct 14 09:29:02 np0005486759.ooo.test sudo[163569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:02 np0005486759.ooo.test python3.9[163571]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:02 np0005486759.ooo.test sudo[163569]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:03 np0005486759.ooo.test sudo[163661]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwjxvwjfjpvfowugqrtiedpxhzzmzdrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434142.8357093-169-246228605388217/AnsiballZ_systemd.py
Oct 14 09:29:03 np0005486759.ooo.test sudo[163661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:03 np0005486759.ooo.test python3.9[163663]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:29:03 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:29:03 np0005486759.ooo.test systemd-rc-local-generator[163691]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:29:03 np0005486759.ooo.test systemd-sysv-generator[163695]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:29:03 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:29:03 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:29:03 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:29:03 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:29:03 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:29:03 np0005486759.ooo.test sudo[163661]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58891 DF PROTO=TCP SPT=57890 DPT=9100 SEQ=1964120906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8402080000000001030307) 
Oct 14 09:29:04 np0005486759.ooo.test python3.9[163796]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:29:04 np0005486759.ooo.test network[163813]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:29:04 np0005486759.ooo.test network[163814]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:29:04 np0005486759.ooo.test network[163815]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:29:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58892 DF PROTO=TCP SPT=57890 DPT=9100 SEQ=1964120906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8406010000000001030307) 
Oct 14 09:29:05 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:29:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58893 DF PROTO=TCP SPT=57890 DPT=9100 SEQ=1964120906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F840E010000000001030307) 
Oct 14 09:29:08 np0005486759.ooo.test sudo[164013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdowygkibvbpdxtkkvmzbjdbnxvbtxhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434147.70763-195-239261721832139/AnsiballZ_stat.py
Oct 14 09:29:08 np0005486759.ooo.test sudo[164013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:08 np0005486759.ooo.test python3.9[164015]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:08 np0005486759.ooo.test sudo[164013]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:08 np0005486759.ooo.test sudo[164061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ticzpohoqwqbbnkllfaawhdtmnxnqkbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434147.70763-195-239261721832139/AnsiballZ_file.py
Oct 14 09:29:08 np0005486759.ooo.test sudo[164061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:08 np0005486759.ooo.test python3.9[164063]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:08 np0005486759.ooo.test sudo[164061]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:09 np0005486759.ooo.test sudo[164153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmsdrxcsnwbifpehwyiqiwgwcrivcwbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434148.9001167-208-180868235612755/AnsiballZ_file.py
Oct 14 09:29:09 np0005486759.ooo.test sudo[164153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:09 np0005486759.ooo.test python3.9[164155]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:09 np0005486759.ooo.test sudo[164153]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:09 np0005486759.ooo.test sudo[164245]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehnfuhswtmtswgttzsvdlrlddetypqqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434149.5995088-216-152549831790399/AnsiballZ_stat.py
Oct 14 09:29:09 np0005486759.ooo.test sudo[164245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:10 np0005486759.ooo.test python3.9[164247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:10 np0005486759.ooo.test sudo[164245]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:10 np0005486759.ooo.test sudo[164318]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hryxqmnojjrihxukeprvitlngvhidpab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434149.5995088-216-152549831790399/AnsiballZ_copy.py
Oct 14 09:29:10 np0005486759.ooo.test sudo[164318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:10 np0005486759.ooo.test python3.9[164320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434149.5995088-216-152549831790399/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:10 np0005486759.ooo.test sudo[164318]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58894 DF PROTO=TCP SPT=57890 DPT=9100 SEQ=1964120906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F841DC10000000001030307) 
Oct 14 09:29:11 np0005486759.ooo.test sudo[164410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzzmajdckaywbitzrlkhujrpytvdrpim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434151.2214973-234-1881587121247/AnsiballZ_timezone.py
Oct 14 09:29:11 np0005486759.ooo.test sudo[164410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:11 np0005486759.ooo.test python3.9[164412]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 14 09:29:11 np0005486759.ooo.test systemd[1]: Starting Time & Date Service...
Oct 14 09:29:12 np0005486759.ooo.test systemd[1]: Started Time & Date Service.
Oct 14 09:29:12 np0005486759.ooo.test sudo[164410]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25426 DF PROTO=TCP SPT=39304 DPT=9882 SEQ=234387281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8421580000000001030307) 
Oct 14 09:29:12 np0005486759.ooo.test sudo[164506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gixdricqgjffurfczgsavcfurxvqorgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434152.373945-243-737288850249/AnsiballZ_file.py
Oct 14 09:29:12 np0005486759.ooo.test sudo[164506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:12 np0005486759.ooo.test python3.9[164508]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:12 np0005486759.ooo.test sudo[164506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25427 DF PROTO=TCP SPT=39304 DPT=9882 SEQ=234387281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8425410000000001030307) 
Oct 14 09:29:13 np0005486759.ooo.test sudo[164598]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebnoxtmvpoxtryclsormptynqtggzwhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434153.0870392-251-188898123349588/AnsiballZ_stat.py
Oct 14 09:29:13 np0005486759.ooo.test sudo[164598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:13 np0005486759.ooo.test python3.9[164600]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:13 np0005486759.ooo.test sudo[164598]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:13 np0005486759.ooo.test sudo[164671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahrbzpqqbodegfcthhaqrwubyypofhcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434153.0870392-251-188898123349588/AnsiballZ_copy.py
Oct 14 09:29:13 np0005486759.ooo.test sudo[164671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:14 np0005486759.ooo.test python3.9[164673]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434153.0870392-251-188898123349588/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:14 np0005486759.ooo.test sudo[164671]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:14 np0005486759.ooo.test sudo[164763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiydzmyggkvrmcmhirhbdmtpzavuyalk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434154.348527-266-250797650424935/AnsiballZ_stat.py
Oct 14 09:29:14 np0005486759.ooo.test sudo[164763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:14 np0005486759.ooo.test python3.9[164765]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:14 np0005486759.ooo.test sudo[164763]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:15 np0005486759.ooo.test sudo[164836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnukzwrysckvkxeubpaaxibwzgzdvokj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434154.348527-266-250797650424935/AnsiballZ_copy.py
Oct 14 09:29:15 np0005486759.ooo.test sudo[164836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:15 np0005486759.ooo.test python3.9[164838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434154.348527-266-250797650424935/.source.yaml _original_basename=.xvrbh79w follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:15 np0005486759.ooo.test sudo[164836]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:15 np0005486759.ooo.test sudo[164928]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-finebovyrfcwczuktsdoqgygtcmylmww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434155.5302505-281-128547926702166/AnsiballZ_stat.py
Oct 14 09:29:15 np0005486759.ooo.test sudo[164928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:16 np0005486759.ooo.test python3.9[164930]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:16 np0005486759.ooo.test sudo[164928]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:16 np0005486759.ooo.test sudo[165003]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqqjexhlonuakxhawbukjrrfclcgmiub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434155.5302505-281-128547926702166/AnsiballZ_copy.py
Oct 14 09:29:16 np0005486759.ooo.test sudo[165003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:16 np0005486759.ooo.test python3.9[165005]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434155.5302505-281-128547926702166/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:16 np0005486759.ooo.test sudo[165003]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65044 DF PROTO=TCP SPT=55282 DPT=9105 SEQ=2233220215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8434010000000001030307) 
Oct 14 09:29:17 np0005486759.ooo.test sudo[165095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glqrrbqaapygmyqhfaaysiyhnqtlwyhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434156.77705-296-247877545475934/AnsiballZ_command.py
Oct 14 09:29:17 np0005486759.ooo.test sudo[165095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:17 np0005486759.ooo.test python3.9[165097]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:29:17 np0005486759.ooo.test sudo[165095]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:18 np0005486759.ooo.test sudo[165188]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvufkmrjgzmjdicgtpaxiluqxdqloqqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434158.4353752-304-167634857387221/AnsiballZ_command.py
Oct 14 09:29:18 np0005486759.ooo.test sudo[165188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:18 np0005486759.ooo.test python3.9[165190]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:29:18 np0005486759.ooo.test sudo[165188]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25429 DF PROTO=TCP SPT=39304 DPT=9882 SEQ=234387281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F843D010000000001030307) 
Oct 14 09:29:19 np0005486759.ooo.test sudo[165281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxqtmvkkjwjiynpiulimzozptshduzmo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434159.1397314-312-103057984872187/AnsiballZ_edpm_nftables_from_files.py
Oct 14 09:29:19 np0005486759.ooo.test sudo[165281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:19 np0005486759.ooo.test python3[165283]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 14 09:29:19 np0005486759.ooo.test sudo[165281]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:21 np0005486759.ooo.test sudo[165373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnboggunswxzimreixjhmqmxvyppljtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434160.0747564-320-32600057307925/AnsiballZ_stat.py
Oct 14 09:29:21 np0005486759.ooo.test sudo[165373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:21 np0005486759.ooo.test python3.9[165375]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:21 np0005486759.ooo.test sudo[165373]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:21 np0005486759.ooo.test sudo[165446]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfgbavisgnadepzsietebdperdfxystz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434160.0747564-320-32600057307925/AnsiballZ_copy.py
Oct 14 09:29:21 np0005486759.ooo.test sudo[165446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:21 np0005486759.ooo.test python3.9[165448]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434160.0747564-320-32600057307925/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:21 np0005486759.ooo.test sudo[165446]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:22 np0005486759.ooo.test sudo[165538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smsmqjveluyrukyfjzherdmhlnckeldh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434161.997013-335-72917342178548/AnsiballZ_stat.py
Oct 14 09:29:22 np0005486759.ooo.test sudo[165538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:22 np0005486759.ooo.test python3.9[165540]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:22 np0005486759.ooo.test sudo[165538]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:22 np0005486759.ooo.test sudo[165611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzkvaraohvihyomvtqwfmljdzgzqdcjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434161.997013-335-72917342178548/AnsiballZ_copy.py
Oct 14 09:29:22 np0005486759.ooo.test sudo[165611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:23 np0005486759.ooo.test python3.9[165613]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434161.997013-335-72917342178548/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:23 np0005486759.ooo.test sudo[165611]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:23 np0005486759.ooo.test sudo[165703]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npbnfpiafjkxdtyolzaynsrtjjnuassn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434163.2499564-350-97512528206480/AnsiballZ_stat.py
Oct 14 09:29:23 np0005486759.ooo.test sudo[165703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:23 np0005486759.ooo.test python3.9[165705]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:23 np0005486759.ooo.test sudo[165703]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:24 np0005486759.ooo.test sudo[165776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mivnanumqkmmfewebceidlzjqoahekmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434163.2499564-350-97512528206480/AnsiballZ_copy.py
Oct 14 09:29:24 np0005486759.ooo.test sudo[165776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:24 np0005486759.ooo.test python3.9[165778]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434163.2499564-350-97512528206480/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:24 np0005486759.ooo.test sudo[165776]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:24 np0005486759.ooo.test sudo[165868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqlzvaxgaecubzauudxbialweqwlpipo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434164.4920075-365-179626092700886/AnsiballZ_stat.py
Oct 14 09:29:24 np0005486759.ooo.test sudo[165868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:24 np0005486759.ooo.test python3.9[165870]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:25 np0005486759.ooo.test sudo[165868]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:25 np0005486759.ooo.test sudo[165941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iullznmqdcnkqdtmknvqflfridgmpewi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434164.4920075-365-179626092700886/AnsiballZ_copy.py
Oct 14 09:29:25 np0005486759.ooo.test sudo[165941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:25 np0005486759.ooo.test python3.9[165943]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434164.4920075-365-179626092700886/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:25 np0005486759.ooo.test sudo[165941]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:26 np0005486759.ooo.test sudo[166033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcmnxghnyjseitrintekapnyoyjymwrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434165.7636366-380-91329066252517/AnsiballZ_stat.py
Oct 14 09:29:26 np0005486759.ooo.test sudo[166033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:26 np0005486759.ooo.test python3.9[166035]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:26 np0005486759.ooo.test sudo[166033]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:26 np0005486759.ooo.test sudo[166106]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biwltcppokcvrkcdogjriagspphkwfee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434165.7636366-380-91329066252517/AnsiballZ_copy.py
Oct 14 09:29:26 np0005486759.ooo.test sudo[166106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:26 np0005486759.ooo.test python3.9[166108]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434165.7636366-380-91329066252517/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:26 np0005486759.ooo.test sudo[166106]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:27 np0005486759.ooo.test sudo[166198]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isnyijaxrmpamakgfyvvlqoxbyajvhtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434167.0453527-395-230917994298500/AnsiballZ_file.py
Oct 14 09:29:27 np0005486759.ooo.test sudo[166198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:27 np0005486759.ooo.test python3.9[166200]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:27 np0005486759.ooo.test sudo[166198]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:28 np0005486759.ooo.test sudo[166290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sktjllxrfbphgvtnmtyrtlfbqovvuvwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434167.741441-403-272524916173695/AnsiballZ_command.py
Oct 14 09:29:28 np0005486759.ooo.test sudo[166290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:28 np0005486759.ooo.test python3.9[166292]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:29:28 np0005486759.ooo.test sudo[166290]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:28 np0005486759.ooo.test sudo[166385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztlsepffgvvkeajkivmzneppfcptofig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434168.4316099-411-249182800363401/AnsiballZ_blockinfile.py
Oct 14 09:29:28 np0005486759.ooo.test sudo[166385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:29 np0005486759.ooo.test python3.9[166387]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/edpm-chains.nft"
                                                         include "/etc/nftables/edpm-rules.nft"
                                                         include "/etc/nftables/edpm-jumps.nft"
                                                          path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:29 np0005486759.ooo.test sudo[166385]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:29 np0005486759.ooo.test sudo[166478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhthelvnizoxbqjzlrmfxnympfsscujt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434169.3823287-420-66418578382951/AnsiballZ_file.py
Oct 14 09:29:29 np0005486759.ooo.test sudo[166478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:29 np0005486759.ooo.test python3.9[166480]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:29 np0005486759.ooo.test sudo[166478]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:30 np0005486759.ooo.test sudo[166570]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jitrdnprgbnyuzevvoypfqkkdlwdnzpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434170.051851-420-57965766832951/AnsiballZ_file.py
Oct 14 09:29:30 np0005486759.ooo.test sudo[166570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:30 np0005486759.ooo.test python3.9[166572]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:30 np0005486759.ooo.test sudo[166570]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:31 np0005486759.ooo.test sudo[166662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzibgcrnolzwoqpdcjafptfshxbzodgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434170.7620006-435-195170798074936/AnsiballZ_mount.py
Oct 14 09:29:31 np0005486759.ooo.test sudo[166662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:31 np0005486759.ooo.test python3.9[166664]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 14 09:29:31 np0005486759.ooo.test sudo[166662]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:31 np0005486759.ooo.test sudo[166755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtnrbsapeqvrdjtszzdxhtrhgpcvcovs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434171.591189-435-148790809700529/AnsiballZ_mount.py
Oct 14 09:29:31 np0005486759.ooo.test sudo[166755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:32 np0005486759.ooo.test python3.9[166757]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 14 09:29:32 np0005486759.ooo.test sudo[166755]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:32 np0005486759.ooo.test sshd[161934]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:29:32 np0005486759.ooo.test systemd[1]: session-27.scope: Deactivated successfully.
Oct 14 09:29:32 np0005486759.ooo.test systemd[1]: session-27.scope: Consumed 27.132s CPU time.
Oct 14 09:29:32 np0005486759.ooo.test systemd-logind[759]: Session 27 logged out. Waiting for processes to exit.
Oct 14 09:29:32 np0005486759.ooo.test systemd-logind[759]: Removed session 27.
Oct 14 09:29:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40877 DF PROTO=TCP SPT=55296 DPT=9100 SEQ=4183133542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8477380000000001030307) 
Oct 14 09:29:38 np0005486759.ooo.test sshd[166773]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:29:38 np0005486759.ooo.test sshd[166773]: Accepted publickey for zuul from 192.168.122.30 port 44552 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:29:38 np0005486759.ooo.test systemd-logind[759]: New session 28 of user zuul.
Oct 14 09:29:38 np0005486759.ooo.test systemd[1]: Started Session 28 of User zuul.
Oct 14 09:29:38 np0005486759.ooo.test sshd[166773]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:29:39 np0005486759.ooo.test python3.9[166866]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:29:40 np0005486759.ooo.test sudo[166960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtllensvkljfdglkzftgmtspwdnipbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434179.6596272-32-213084207565806/AnsiballZ_systemd.py
Oct 14 09:29:40 np0005486759.ooo.test sudo[166960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:40 np0005486759.ooo.test python3.9[166962]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 14 09:29:40 np0005486759.ooo.test sudo[166960]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:41 np0005486759.ooo.test sudo[167054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmwquwaqkapqsnqskgrvznjuqjvdinvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434180.8542576-40-181024638022185/AnsiballZ_systemd.py
Oct 14 09:29:41 np0005486759.ooo.test sudo[167054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:41 np0005486759.ooo.test python3.9[167056]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:29:41 np0005486759.ooo.test sudo[167054]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:42 np0005486759.ooo.test systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 14 09:29:42 np0005486759.ooo.test sudo[167147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvpopqgpcokfxxcpnfqiqdoegfvzwafy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434181.6940866-49-24670921241035/AnsiballZ_command.py
Oct 14 09:29:42 np0005486759.ooo.test sudo[167147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49153 DF PROTO=TCP SPT=46108 DPT=9882 SEQ=2111390620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8496880000000001030307) 
Oct 14 09:29:42 np0005486759.ooo.test python3.9[167151]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:29:42 np0005486759.ooo.test sudo[167147]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:42 np0005486759.ooo.test sudo[167242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uczeqdestezogmyccsajdbwoadcqtjeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434182.539613-57-19212814325842/AnsiballZ_stat.py
Oct 14 09:29:42 np0005486759.ooo.test sudo[167242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:43 np0005486759.ooo.test python3.9[167244]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:29:43 np0005486759.ooo.test sudo[167242]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:43 np0005486759.ooo.test sudo[167336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wovtgqxzwbralnpsykljbspqoqfieouj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434183.36923-65-123345924925747/AnsiballZ_command.py
Oct 14 09:29:43 np0005486759.ooo.test sudo[167336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:43 np0005486759.ooo.test python3.9[167338]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:29:43 np0005486759.ooo.test sudo[167336]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:44 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9082 DF PROTO=TCP SPT=56204 DPT=9105 SEQ=1723035632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F849D3C0000000001030307) 
Oct 14 09:29:44 np0005486759.ooo.test sudo[167431]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iodvfxcszwvnjukiavzbwyoftfbkrydb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434183.9799488-73-104869683226791/AnsiballZ_file.py
Oct 14 09:29:44 np0005486759.ooo.test sudo[167431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:44 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7083 DF PROTO=TCP SPT=41882 DPT=9101 SEQ=2604926378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F849EF50000000001030307) 
Oct 14 09:29:44 np0005486759.ooo.test python3.9[167433]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:44 np0005486759.ooo.test sudo[167431]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:45 np0005486759.ooo.test sshd[166773]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:29:45 np0005486759.ooo.test systemd[1]: session-28.scope: Deactivated successfully.
Oct 14 09:29:45 np0005486759.ooo.test systemd[1]: session-28.scope: Consumed 3.663s CPU time.
Oct 14 09:29:45 np0005486759.ooo.test systemd-logind[759]: Session 28 logged out. Waiting for processes to exit.
Oct 14 09:29:45 np0005486759.ooo.test systemd-logind[759]: Removed session 28.
Oct 14 09:29:45 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9083 DF PROTO=TCP SPT=56204 DPT=9105 SEQ=1723035632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84A1410000000001030307) 
Oct 14 09:29:45 np0005486759.ooo.test chronyd[159426]: Selected source 216.232.132.102 (pool.ntp.org)
Oct 14 09:29:45 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7084 DF PROTO=TCP SPT=41882 DPT=9101 SEQ=2604926378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84A3010000000001030307) 
Oct 14 09:29:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9084 DF PROTO=TCP SPT=56204 DPT=9105 SEQ=1723035632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84A9410000000001030307) 
Oct 14 09:29:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7085 DF PROTO=TCP SPT=41882 DPT=9101 SEQ=2604926378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84AB010000000001030307) 
Oct 14 09:29:47 np0005486759.ooo.test sshd[167448]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:29:47 np0005486759.ooo.test sshd[167448]: Accepted publickey for zuul from 192.168.122.31 port 55202 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:29:47 np0005486759.ooo.test systemd-logind[759]: New session 29 of user zuul.
Oct 14 09:29:47 np0005486759.ooo.test systemd[1]: Started Session 29 of User zuul.
Oct 14 09:29:47 np0005486759.ooo.test sshd[167448]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:29:49 np0005486759.ooo.test sudo[167541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwkfsmhiuniwyafrltoyjkcunvsuvctf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434187.9539633-17-145688307830694/AnsiballZ_tempfile.py
Oct 14 09:29:49 np0005486759.ooo.test sudo[167541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:49 np0005486759.ooo.test python3.9[167543]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 14 09:29:49 np0005486759.ooo.test sudo[167541]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6030 DF PROTO=TCP SPT=42494 DPT=9102 SEQ=1969631894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84B2B70000000001030307) 
Oct 14 09:29:50 np0005486759.ooo.test sudo[167633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcseblzfonsycekkwnibmbwfxaeolrhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434189.6608932-41-52964220640592/AnsiballZ_stat.py
Oct 14 09:29:50 np0005486759.ooo.test sudo[167633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:50 np0005486759.ooo.test python3.9[167635]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:29:50 np0005486759.ooo.test sudo[167633]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6031 DF PROTO=TCP SPT=42494 DPT=9102 SEQ=1969631894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84B6C10000000001030307) 
Oct 14 09:29:51 np0005486759.ooo.test sudo[167727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvkiahrukadhgnmjsksnrsvdzsycgcym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434190.5832095-57-30347260789390/AnsiballZ_slurp.py
Oct 14 09:29:51 np0005486759.ooo.test sudo[167727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:51 np0005486759.ooo.test sshd[167730]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:29:51 np0005486759.ooo.test python3.9[167729]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Oct 14 09:29:51 np0005486759.ooo.test sudo[167727]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:51 np0005486759.ooo.test sshd[167730]: Accepted publickey for zuul from 192.168.122.30 port 40028 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:29:51 np0005486759.ooo.test systemd-logind[759]: New session 30 of user zuul.
Oct 14 09:29:51 np0005486759.ooo.test systemd[1]: Started Session 30 of User zuul.
Oct 14 09:29:51 np0005486759.ooo.test sshd[167730]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:29:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6032 DF PROTO=TCP SPT=42494 DPT=9102 SEQ=1969631894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84BEC20000000001030307) 
Oct 14 09:29:52 np0005486759.ooo.test sudo[167895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufjfjhawyzoinkjuonefcywfrkctnuok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434191.4212632-73-72729604133738/AnsiballZ_stat.py
Oct 14 09:29:52 np0005486759.ooo.test sudo[167895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:52 np0005486759.ooo.test python3.9[167901]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.7ccl7tpk follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:29:52 np0005486759.ooo.test sudo[167895]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:53 np0005486759.ooo.test python3.9[167915]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:29:53 np0005486759.ooo.test sudo[168006]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xawcbsdebfimqvbnpxfgcmvyjbiuftvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434191.4212632-73-72729604133738/AnsiballZ_copy.py
Oct 14 09:29:53 np0005486759.ooo.test sudo[168006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:53 np0005486759.ooo.test python3.9[168008]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.7ccl7tpk mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434191.4212632-73-72729604133738/.source.7ccl7tpk _original_basename=.if_jz_iq follow=False checksum=617c1a25454733d2263b06b139316ca3d32cda2e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:53 np0005486759.ooo.test sudo[168006]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:53 np0005486759.ooo.test sudo[168129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uckeczhvwhzffusmroguzrihpnxwcxxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434193.604269-34-132376839877112/AnsiballZ_setup.py
Oct 14 09:29:53 np0005486759.ooo.test sudo[168129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:54 np0005486759.ooo.test python3.9[168132]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:29:54 np0005486759.ooo.test sudo[168180]:     zuul : TTY=pts/2 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jefmtgvhiyisixceupxmvaenkfmwfqdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434193.8135042-103-94064544779332/AnsiballZ_setup.py
Oct 14 09:29:54 np0005486759.ooo.test sudo[168180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:54 np0005486759.ooo.test sudo[168129]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:54 np0005486759.ooo.test python3.9[168182]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:29:54 np0005486759.ooo.test sudo[168180]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:54 np0005486759.ooo.test sudo[168244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjewpkzpayhnbbbuirmsflqmccyqqbfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434193.604269-34-132376839877112/AnsiballZ_dnf.py
Oct 14 09:29:54 np0005486759.ooo.test sudo[168244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:55 np0005486759.ooo.test python3.9[168246]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 14 09:29:55 np0005486759.ooo.test sudo[168323]:     zuul : TTY=pts/2 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzdmlvgjvfhgugtsyxagwuflhhxdswuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434194.9635463-120-255334462830561/AnsiballZ_blockinfile.py
Oct 14 09:29:55 np0005486759.ooo.test sudo[168323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:55 np0005486759.ooo.test python3.9[168325]: ansible-ansible.builtin.blockinfile Invoked with block=np0005486759.ooo.test,192.168.122.107,np0005486759* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDa7N/yf5kblytISC8YE3UN49GJulR+55hCVck3l5AqdE+beyJ+w8p1C78UnecqfQMxqRm33gN6DojHe1jClFu5yXaX4PkXntTMh9OVTmHf4h+I1VDHv24Pk9IVHv/+p005cD/6p0aUXc3UlKcftzByKVCQz0hQ8VWKbVAutMFA0CybLnUKZD6ev92/TcYBkjFVAGBFdYisqLFLXLZAhKw/Vi30rEZYweRPLcWAs1HsEM3B0H8fejbp0qbBeYxafRFhfnNgGhtfYu/qAj4DjOmpAwiVKEiaaCH39yKCMuGFhU+FNoKpxDsgv+pvy4XMhOrkv8r+dAydiNrrunuHMXW9w+x5ifTxJCnpbjXSsksH6btN3AnB0QnRZ3e+Go0fnivXr/F0oOZDUcziGRnyJAik4Ycd2T/Wy0SegD/VQJ6RTln2lEYQU5N6lDsWh/fs/Fo3/Xg4/g8TKAsjYuZZzPQ4IbpfE+oyhRaz6qpi2a98pmsVsbJhuFsddikOYK9BTV8=
                                                         np0005486759.ooo.test,192.168.122.107,np0005486759* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHaZSDrxcTA28ABEaLk4iZkZ8bws44u+wGoBxkvKu9RD
                                                         np0005486759.ooo.test,192.168.122.107,np0005486759* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPzKzMXNg2N5TRwf8sUh+nVdyWNwhEr4wQtRpTjiIRummZ6NlE5mwb3U6IaRqWxQMzoRwrObcDy4+03oP6QDjbI=
                                                         np0005486761.ooo.test,192.168.122.109,np0005486761* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCcu1MZlfouL0B/zhrpzzQ8XMvU8sl1bmt1ZFTQANdMXL1E3y4+2OoW3JZOmZ2XEWUs79YZwLyguN78QNlI7UeuqWBSNKpb3053jLWi3Y7M+VEXFzjOBHGbNPLYVVnE9F2IoYrQvVK0/6adHvtHmYwP+a9WFJWnuUI7KrSc89JrEZMsw2jTcfPejsxM3Gib0F8otmSkZ2q7w2uT+btlqv5pyM2RBZwEOUe0wJHc3/jO4mBg7A9wzx1Tb8FuYMBHQ4UOz54pAtf/CI4p3Aiorwguxke/XjR2hw8t/C2KjP4+p4TYTI3d7Mr+BOhj6I0St24moGniV6g5JnW1MBUNlenMa4HC2QzQEv0AlPTNzu8usLI81X+cf44HM8/3GQSmgO3pbTIL4hHyB/sfWN4zmTDDxVYtT7P/KkE9VjzZk6doH+z9IjRFhtEMWfzTP5DrDlJqvxpmay4q3+WMP7bob1UC1nJehFaqLQ06Qe3wZiTtg4YJffVgOPZTcCz8MMJ9BFs=
                                                         np0005486761.ooo.test,192.168.122.109,np0005486761* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPkKgroHxoLvKgzu9QNbvm2G1lP3X7nOQ41MbzAyBITO
                                                         np0005486761.ooo.test,192.168.122.109,np0005486761* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA1wowxneWMUBUv7Iwe3GF4aejqtCXeN0ZtiY6foEAidabjIESadHyO2eLGmrmkbaVbO9Ah0/N4BLN4Ar/rUOCM=
                                                          create=True mode=0644 path=/tmp/ansible.7ccl7tpk state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:55 np0005486759.ooo.test sudo[168323]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:56 np0005486759.ooo.test sudo[168416]:     zuul : TTY=pts/2 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtfdrufiqrzjygbxaenjloknoahnzdoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434195.788669-136-90434874700612/AnsiballZ_command.py
Oct 14 09:29:56 np0005486759.ooo.test sudo[168416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:56 np0005486759.ooo.test python3.9[168418]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.7ccl7tpk' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:29:56 np0005486759.ooo.test sudo[168416]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6033 DF PROTO=TCP SPT=42494 DPT=9102 SEQ=1969631894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84CE820000000001030307) 
Oct 14 09:29:57 np0005486759.ooo.test sudo[168510]:     zuul : TTY=pts/2 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fubkrjrpudbvjqsriplvgpdyckoathlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434196.705256-152-176134521108801/AnsiballZ_file.py
Oct 14 09:29:57 np0005486759.ooo.test sudo[168510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:29:57 np0005486759.ooo.test python3.9[168512]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.7ccl7tpk state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:29:57 np0005486759.ooo.test sudo[168510]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:57 np0005486759.ooo.test sshd[167448]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:29:57 np0005486759.ooo.test systemd[1]: session-29.scope: Deactivated successfully.
Oct 14 09:29:57 np0005486759.ooo.test systemd[1]: session-29.scope: Consumed 4.208s CPU time.
Oct 14 09:29:57 np0005486759.ooo.test systemd-logind[759]: Session 29 logged out. Waiting for processes to exit.
Oct 14 09:29:57 np0005486759.ooo.test systemd-logind[759]: Removed session 29.
Oct 14 09:29:58 np0005486759.ooo.test sudo[168244]: pam_unix(sudo:session): session closed for user root
Oct 14 09:29:59 np0005486759.ooo.test python3.9[168616]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:30:00 np0005486759.ooo.test sudo[168707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akxeucbsjwukdkunlypzezfhknxorzyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434200.0125325-55-128372542292179/AnsiballZ_file.py
Oct 14 09:30:00 np0005486759.ooo.test sudo[168707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:00 np0005486759.ooo.test python3.9[168709]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:00 np0005486759.ooo.test sudo[168707]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:01 np0005486759.ooo.test sudo[168799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzcqlpjqrjuhumuuhcicgqlsujyswuvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434200.908707-63-179927492275695/AnsiballZ_file.py
Oct 14 09:30:01 np0005486759.ooo.test sudo[168799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:01 np0005486759.ooo.test python3.9[168801]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:01 np0005486759.ooo.test sudo[168799]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:02 np0005486759.ooo.test sudo[168891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jieoqpboeftypxgpsrymmahfsdrejcos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434201.6090534-71-56585705659701/AnsiballZ_lineinfile.py
Oct 14 09:30:02 np0005486759.ooo.test sudo[168891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:02 np0005486759.ooo.test python3.9[168893]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                         Core libraries or services have been updated since boot-up:
                                                           * systemd
                                                         
                                                         Reboot is required to fully utilize these updates.
                                                         More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:02 np0005486759.ooo.test sudo[168891]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:03 np0005486759.ooo.test python3.9[168983]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:30:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27038 DF PROTO=TCP SPT=37402 DPT=9100 SEQ=2288620315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84EC670000000001030307) 
Oct 14 09:30:04 np0005486759.ooo.test python3.9[169073]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:30:05 np0005486759.ooo.test python3.9[169165]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:30:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27039 DF PROTO=TCP SPT=37402 DPT=9100 SEQ=2288620315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84F0810000000001030307) 
Oct 14 09:30:06 np0005486759.ooo.test sshd[167730]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:30:06 np0005486759.ooo.test systemd[1]: session-30.scope: Deactivated successfully.
Oct 14 09:30:06 np0005486759.ooo.test systemd[1]: session-30.scope: Consumed 8.972s CPU time.
Oct 14 09:30:06 np0005486759.ooo.test systemd-logind[759]: Session 30 logged out. Waiting for processes to exit.
Oct 14 09:30:06 np0005486759.ooo.test systemd-logind[759]: Removed session 30.
Oct 14 09:30:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27040 DF PROTO=TCP SPT=37402 DPT=9100 SEQ=2288620315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F84F8810000000001030307) 
Oct 14 09:30:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27041 DF PROTO=TCP SPT=37402 DPT=9100 SEQ=2288620315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8508410000000001030307) 
Oct 14 09:30:12 np0005486759.ooo.test sshd[169181]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:30:12 np0005486759.ooo.test sshd[169181]: Accepted publickey for zuul from 192.168.122.30 port 56918 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:30:12 np0005486759.ooo.test systemd-logind[759]: New session 31 of user zuul.
Oct 14 09:30:12 np0005486759.ooo.test systemd[1]: Started Session 31 of User zuul.
Oct 14 09:30:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50092 DF PROTO=TCP SPT=52072 DPT=9882 SEQ=1387411331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F850BB80000000001030307) 
Oct 14 09:30:12 np0005486759.ooo.test sshd[169181]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:30:13 np0005486759.ooo.test python3.9[169274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:30:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50093 DF PROTO=TCP SPT=52072 DPT=9882 SEQ=1387411331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F850FC10000000001030307) 
Oct 14 09:30:15 np0005486759.ooo.test sudo[169368]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlewlxdnhvnmzmrjjpwtkxhfdpaxvbgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434215.0001185-66-197173161372461/AnsiballZ_file.py
Oct 14 09:30:15 np0005486759.ooo.test sudo[169368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:15 np0005486759.ooo.test python3.9[169370]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:15 np0005486759.ooo.test sudo[169368]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:16 np0005486759.ooo.test sudo[169460]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odbtyvigpgolsuufmtszkatkpcgqiolz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434215.847539-74-37818788499299/AnsiballZ_stat.py
Oct 14 09:30:16 np0005486759.ooo.test sudo[169460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25072 DF PROTO=TCP SPT=50162 DPT=9105 SEQ=3681887882 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F851E820000000001030307) 
Oct 14 09:30:17 np0005486759.ooo.test python3.9[169462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:17 np0005486759.ooo.test sudo[169460]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:17 np0005486759.ooo.test sudo[169533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqzejxjknylggmpwycugmcllwxegcxxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434215.847539-74-37818788499299/AnsiballZ_copy.py
Oct 14 09:30:17 np0005486759.ooo.test sudo[169533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:17 np0005486759.ooo.test python3.9[169535]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434215.847539-74-37818788499299/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:17 np0005486759.ooo.test sudo[169533]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:18 np0005486759.ooo.test sudo[169625]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcytltynsaertkmmtioxkgcryflfbkpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434218.140371-90-81700154076624/AnsiballZ_file.py
Oct 14 09:30:18 np0005486759.ooo.test sudo[169625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:18 np0005486759.ooo.test python3.9[169627]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:18 np0005486759.ooo.test sudo[169625]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50095 DF PROTO=TCP SPT=52072 DPT=9882 SEQ=1387411331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8527810000000001030307) 
Oct 14 09:30:19 np0005486759.ooo.test sudo[169717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxizyqlvxvygwbrnbfqaniuhxbzjpinw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434218.7659607-98-251826021878075/AnsiballZ_stat.py
Oct 14 09:30:19 np0005486759.ooo.test sudo[169717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:20 np0005486759.ooo.test python3.9[169719]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:20 np0005486759.ooo.test sudo[169717]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:20 np0005486759.ooo.test sudo[169790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxdogdotnyxmofivktlhvfzspatzbqnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434218.7659607-98-251826021878075/AnsiballZ_copy.py
Oct 14 09:30:20 np0005486759.ooo.test sudo[169790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:20 np0005486759.ooo.test python3.9[169792]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434218.7659607-98-251826021878075/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:20 np0005486759.ooo.test sudo[169790]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:21 np0005486759.ooo.test sudo[169882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyhylshxyqblqmhxqfgzfsxzhjffzlss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434220.7895715-114-110023512027456/AnsiballZ_file.py
Oct 14 09:30:21 np0005486759.ooo.test sudo[169882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:21 np0005486759.ooo.test python3.9[169884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:21 np0005486759.ooo.test sudo[169882]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:21 np0005486759.ooo.test sudo[169974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoywswnzhqpnnhievzzgdsbrfjddajhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434221.5036824-122-114493551390355/AnsiballZ_stat.py
Oct 14 09:30:21 np0005486759.ooo.test sudo[169974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:22 np0005486759.ooo.test python3.9[169976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:22 np0005486759.ooo.test sudo[169974]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:22 np0005486759.ooo.test sudo[170047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cujwwfexcrhxbitkaiegyiqzmmsdshdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434221.5036824-122-114493551390355/AnsiballZ_copy.py
Oct 14 09:30:22 np0005486759.ooo.test sudo[170047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5094 DF PROTO=TCP SPT=41666 DPT=9102 SEQ=1000680039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8534010000000001030307) 
Oct 14 09:30:22 np0005486759.ooo.test python3.9[170049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434221.5036824-122-114493551390355/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:22 np0005486759.ooo.test sudo[170047]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:23 np0005486759.ooo.test sudo[170139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlunqketrkywxmnucjxaddqzbrvrracy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434222.8813808-138-227854752546457/AnsiballZ_file.py
Oct 14 09:30:23 np0005486759.ooo.test sudo[170139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:23 np0005486759.ooo.test python3.9[170141]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:23 np0005486759.ooo.test sudo[170139]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:23 np0005486759.ooo.test sudo[170231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpiwgulrpndpnfxvjbeuslsqrsidhiud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434223.5715475-146-16642476432416/AnsiballZ_stat.py
Oct 14 09:30:23 np0005486759.ooo.test sudo[170231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:24 np0005486759.ooo.test python3.9[170233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:24 np0005486759.ooo.test sudo[170231]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:24 np0005486759.ooo.test sudo[170304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycshwrmircfhgqpdwufokszpgtkfscmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434223.5715475-146-16642476432416/AnsiballZ_copy.py
Oct 14 09:30:24 np0005486759.ooo.test sudo[170304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:24 np0005486759.ooo.test python3.9[170306]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434223.5715475-146-16642476432416/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:24 np0005486759.ooo.test sudo[170304]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:25 np0005486759.ooo.test sudo[170396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkbvrrdwfjzbckatshshqozkxvuvkgod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434224.9108942-162-279089857053324/AnsiballZ_file.py
Oct 14 09:30:25 np0005486759.ooo.test sudo[170396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:25 np0005486759.ooo.test python3.9[170398]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:25 np0005486759.ooo.test sudo[170396]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:25 np0005486759.ooo.test sudo[170488]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lupivlrvzptbtodhjaldzsvmxluindwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434225.605842-170-110084096546998/AnsiballZ_stat.py
Oct 14 09:30:25 np0005486759.ooo.test sudo[170488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:26 np0005486759.ooo.test python3.9[170490]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:26 np0005486759.ooo.test sudo[170488]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:26 np0005486759.ooo.test sudo[170561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chidjlaztqaksikdfbrhfyazmconkepv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434225.605842-170-110084096546998/AnsiballZ_copy.py
Oct 14 09:30:26 np0005486759.ooo.test sudo[170561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5095 DF PROTO=TCP SPT=41666 DPT=9102 SEQ=1000680039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8543C10000000001030307) 
Oct 14 09:30:26 np0005486759.ooo.test python3.9[170563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434225.605842-170-110084096546998/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:26 np0005486759.ooo.test sudo[170561]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:27 np0005486759.ooo.test sudo[170653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slxsyizbsmltibcoeucusjnlqmhxfdwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434226.9766824-186-187558945333064/AnsiballZ_file.py
Oct 14 09:30:27 np0005486759.ooo.test sudo[170653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:27 np0005486759.ooo.test python3.9[170655]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:27 np0005486759.ooo.test sudo[170653]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:27 np0005486759.ooo.test sudo[170745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdiztopzybdwrdwvzgnpgbmnqaryhfxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434227.6702278-194-227670747496265/AnsiballZ_stat.py
Oct 14 09:30:27 np0005486759.ooo.test sudo[170745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:28 np0005486759.ooo.test python3.9[170747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:28 np0005486759.ooo.test sudo[170745]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:28 np0005486759.ooo.test sudo[170818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptifkjzbjvxywqakxlvgihiwmkskowse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434227.6702278-194-227670747496265/AnsiballZ_copy.py
Oct 14 09:30:28 np0005486759.ooo.test sudo[170818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:28 np0005486759.ooo.test python3.9[170820]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434227.6702278-194-227670747496265/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:28 np0005486759.ooo.test sudo[170818]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:29 np0005486759.ooo.test sudo[170910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zprnclfxyezoalkauntxzabsfzxjzist ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434229.0034075-210-274613438574226/AnsiballZ_file.py
Oct 14 09:30:29 np0005486759.ooo.test sudo[170910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:29 np0005486759.ooo.test python3.9[170912]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:29 np0005486759.ooo.test sudo[170910]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:29 np0005486759.ooo.test sudo[171002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxjjfwbkfowxlbirjtqmywvggfuojeez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434229.6533172-218-154295595759942/AnsiballZ_stat.py
Oct 14 09:30:29 np0005486759.ooo.test sudo[171002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:30 np0005486759.ooo.test python3.9[171004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:30 np0005486759.ooo.test sudo[171002]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:30 np0005486759.ooo.test sudo[171075]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fflllhfdusznpzjclyubvovyjmhdjauz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434229.6533172-218-154295595759942/AnsiballZ_copy.py
Oct 14 09:30:30 np0005486759.ooo.test sudo[171075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:30 np0005486759.ooo.test python3.9[171077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434229.6533172-218-154295595759942/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:30 np0005486759.ooo.test sudo[171075]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:32 np0005486759.ooo.test sudo[171167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yydcdfwnlthnoddkzurhyjqynozeqzaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434231.7786763-234-43300142829496/AnsiballZ_file.py
Oct 14 09:30:32 np0005486759.ooo.test sudo[171167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:32 np0005486759.ooo.test python3.9[171169]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:32 np0005486759.ooo.test sudo[171167]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:32 np0005486759.ooo.test sudo[171259]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntivhfrgtklqmrlyikbnavrkqlsrtyuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434232.4736693-242-52822866314259/AnsiballZ_stat.py
Oct 14 09:30:32 np0005486759.ooo.test sudo[171259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:32 np0005486759.ooo.test python3.9[171261]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:32 np0005486759.ooo.test sudo[171259]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:34 np0005486759.ooo.test sudo[171332]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imhhtchdrrqlcloexzekzfkgvtxibfrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434232.4736693-242-52822866314259/AnsiballZ_copy.py
Oct 14 09:30:34 np0005486759.ooo.test sudo[171332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:34 np0005486759.ooo.test python3.9[171334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434232.4736693-242-52822866314259/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=4dac229665d2e79533df620196ec4c755a19cff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:34 np0005486759.ooo.test sudo[171332]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12038 DF PROTO=TCP SPT=50290 DPT=9100 SEQ=3227262872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8561980000000001030307) 
Oct 14 09:30:34 np0005486759.ooo.test sshd[169181]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:30:34 np0005486759.ooo.test systemd-logind[759]: Session 31 logged out. Waiting for processes to exit.
Oct 14 09:30:34 np0005486759.ooo.test systemd[1]: session-31.scope: Deactivated successfully.
Oct 14 09:30:34 np0005486759.ooo.test systemd[1]: session-31.scope: Consumed 11.976s CPU time.
Oct 14 09:30:34 np0005486759.ooo.test systemd-logind[759]: Removed session 31.
Oct 14 09:30:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12039 DF PROTO=TCP SPT=50290 DPT=9100 SEQ=3227262872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8565820000000001030307) 
Oct 14 09:30:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12040 DF PROTO=TCP SPT=50290 DPT=9100 SEQ=3227262872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F856D820000000001030307) 
Oct 14 09:30:41 np0005486759.ooo.test sshd[171349]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:30:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12041 DF PROTO=TCP SPT=50290 DPT=9100 SEQ=3227262872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F857D420000000001030307) 
Oct 14 09:30:41 np0005486759.ooo.test sshd[171349]: Accepted publickey for zuul from 192.168.122.31 port 35208 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:30:41 np0005486759.ooo.test systemd-logind[759]: New session 32 of user zuul.
Oct 14 09:30:41 np0005486759.ooo.test systemd[1]: Started Session 32 of User zuul.
Oct 14 09:30:41 np0005486759.ooo.test sshd[171349]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:30:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33479 DF PROTO=TCP SPT=52988 DPT=9882 SEQ=3189411885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8580E80000000001030307) 
Oct 14 09:30:42 np0005486759.ooo.test python3.9[171442]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:30:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33480 DF PROTO=TCP SPT=52988 DPT=9882 SEQ=3189411885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8585020000000001030307) 
Oct 14 09:30:43 np0005486759.ooo.test sudo[171536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfisjzjjpikbfhpbibggpmcjtcimurfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434242.9874992-34-14430254166636/AnsiballZ_file.py
Oct 14 09:30:43 np0005486759.ooo.test sudo[171536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:43 np0005486759.ooo.test python3.9[171538]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:43 np0005486759.ooo.test sudo[171536]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:44 np0005486759.ooo.test sudo[171628]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmoiirkzjqbukumevummxuljofbdcvls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434243.8305125-34-30096456958236/AnsiballZ_file.py
Oct 14 09:30:44 np0005486759.ooo.test sudo[171628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:44 np0005486759.ooo.test python3.9[171630]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:30:44 np0005486759.ooo.test sudo[171628]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:44 np0005486759.ooo.test python3.9[171720]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:30:45 np0005486759.ooo.test sudo[171810]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boeevbzglibnyrcepkolwaztbdahabhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434245.1566024-57-119473160575182/AnsiballZ_seboolean.py
Oct 14 09:30:45 np0005486759.ooo.test sudo[171810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:45 np0005486759.ooo.test python3.9[171812]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 14 09:30:46 np0005486759.ooo.test sudo[171810]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:46 np0005486759.ooo.test systemd-journald[35787]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Oct 14 09:30:46 np0005486759.ooo.test systemd-journald[35787]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 09:30:46 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:30:46 np0005486759.ooo.test sudo[171903]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnaupvzdyvippescjqdfizzbrsxzwyog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434246.3450613-67-144616149676880/AnsiballZ_setup.py
Oct 14 09:30:46 np0005486759.ooo.test sudo[171903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:46 np0005486759.ooo.test python3.9[171905]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:30:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35718 DF PROTO=TCP SPT=39646 DPT=9105 SEQ=3515621031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8593C10000000001030307) 
Oct 14 09:30:47 np0005486759.ooo.test sudo[171903]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:47 np0005486759.ooo.test sudo[171957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdvwknthxzkoaslrvgxcyrflvpevhyry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434246.3450613-67-144616149676880/AnsiballZ_dnf.py
Oct 14 09:30:47 np0005486759.ooo.test sudo[171957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:47 np0005486759.ooo.test python3.9[171959]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:30:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33482 DF PROTO=TCP SPT=52988 DPT=9882 SEQ=3189411885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F859CC20000000001030307) 
Oct 14 09:30:51 np0005486759.ooo.test sudo[171957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:51 np0005486759.ooo.test auditd[725]: Audit daemon rotating log files
Oct 14 09:30:51 np0005486759.ooo.test sudo[172051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wihvstmmtjtltwukedterlbnppbmwvgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434251.1635997-79-177366255546562/AnsiballZ_systemd.py
Oct 14 09:30:51 np0005486759.ooo.test sudo[172051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:52 np0005486759.ooo.test python3.9[172053]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:30:52 np0005486759.ooo.test sudo[172051]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13378 DF PROTO=TCP SPT=51564 DPT=9102 SEQ=162031691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85A9010000000001030307) 
Oct 14 09:30:52 np0005486759.ooo.test sudo[172146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrmrmdumwwmxncjfgmepbygtlolkhtcq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434252.3821442-87-172437756480198/AnsiballZ_edpm_nftables_snippet.py
Oct 14 09:30:52 np0005486759.ooo.test sudo[172146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:53 np0005486759.ooo.test python3[172148]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                         rule:
                                                           proto: udp
                                                           dport: 4789
                                                       - rule_name: 119 neutron geneve networks
                                                         rule:
                                                           proto: udp
                                                           dport: 6081
                                                           state: ["UNTRACKED"]
                                                       - rule_name: 120 neutron geneve networks no conntrack
                                                         rule:
                                                           proto: udp
                                                           dport: 6081
                                                           table: raw
                                                           chain: OUTPUT
                                                           jump: NOTRACK
                                                           action: append
                                                           state: []
                                                       - rule_name: 121 neutron geneve networks no conntrack
                                                         rule:
                                                           proto: udp
                                                           dport: 6081
                                                           table: raw
                                                           chain: PREROUTING
                                                           jump: NOTRACK
                                                           action: append
                                                           state: []
                                                        dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 14 09:30:53 np0005486759.ooo.test sudo[172146]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:53 np0005486759.ooo.test sudo[172238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abrixpxrunlfglbuxnvxrirmwmewfrsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434253.3760748-96-186412289475146/AnsiballZ_file.py
Oct 14 09:30:53 np0005486759.ooo.test sudo[172238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:53 np0005486759.ooo.test python3.9[172240]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:53 np0005486759.ooo.test sudo[172238]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:54 np0005486759.ooo.test sudo[172330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzdnjrkxtagesruvbruutmfxrqpmtuwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434254.0464983-104-88680173768409/AnsiballZ_stat.py
Oct 14 09:30:54 np0005486759.ooo.test sudo[172330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:54 np0005486759.ooo.test python3.9[172332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:54 np0005486759.ooo.test sudo[172330]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:54 np0005486759.ooo.test sudo[172378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxiyvgpxmvqygsfskkrzctcbzjrvjfdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434254.0464983-104-88680173768409/AnsiballZ_file.py
Oct 14 09:30:54 np0005486759.ooo.test sudo[172378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:55 np0005486759.ooo.test python3.9[172380]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:55 np0005486759.ooo.test sudo[172378]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:55 np0005486759.ooo.test sudo[172470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycculiiziztnoiafpobglnznponpaucj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434255.2818313-116-109770724041647/AnsiballZ_stat.py
Oct 14 09:30:55 np0005486759.ooo.test sudo[172470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:55 np0005486759.ooo.test python3.9[172472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:55 np0005486759.ooo.test sudo[172470]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:56 np0005486759.ooo.test sudo[172518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pggwlnekektzpellguqaqinwrutoevcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434255.2818313-116-109770724041647/AnsiballZ_file.py
Oct 14 09:30:56 np0005486759.ooo.test sudo[172518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:56 np0005486759.ooo.test python3.9[172520]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.47wr64mc recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:56 np0005486759.ooo.test sudo[172518]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13379 DF PROTO=TCP SPT=51564 DPT=9102 SEQ=162031691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85B8C20000000001030307) 
Oct 14 09:30:56 np0005486759.ooo.test sudo[172610]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgarxmxazljstjcogneoqgbfhydalork ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434256.437751-128-26811748562857/AnsiballZ_stat.py
Oct 14 09:30:56 np0005486759.ooo.test sudo[172610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:56 np0005486759.ooo.test python3.9[172612]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:56 np0005486759.ooo.test sudo[172610]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:57 np0005486759.ooo.test sudo[172658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpqsyqoaauqvoydjreervhnxevutnrgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434256.437751-128-26811748562857/AnsiballZ_file.py
Oct 14 09:30:57 np0005486759.ooo.test sudo[172658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:57 np0005486759.ooo.test python3.9[172660]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:30:57 np0005486759.ooo.test sudo[172658]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:57 np0005486759.ooo.test sudo[172750]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpyqizbkznsavituvdjotyuutfadgwwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434257.4906495-141-201863627055032/AnsiballZ_command.py
Oct 14 09:30:57 np0005486759.ooo.test sudo[172750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:58 np0005486759.ooo.test python3.9[172752]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:30:58 np0005486759.ooo.test sudo[172750]: pam_unix(sudo:session): session closed for user root
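Editor's note: `nft -j list ruleset` (invoked above at 09:30:58) emits a JSON document with a top-level `nftables` array of `metainfo`, `table`, `chain`, and `rule` objects. A sketch of filtering that document for chains; the sample payload is illustrative, not captured from this host:

```python
import json

# Illustrative sample of `nft -j list ruleset` output (not from this host).
sample = json.dumps({
    "nftables": [
        {"metainfo": {"version": "1.0.4", "json_schema_version": 1}},
        {"table": {"family": "inet", "name": "filter", "handle": 1}},
        {"chain": {"family": "inet", "table": "filter",
                   "name": "EDPM_INPUT", "handle": 2}},
    ]
})

def chains(ruleset_json):
    """Return the chain objects from an `nft -j list ruleset` document."""
    doc = json.loads(ruleset_json)
    return [item["chain"] for item in doc.get("nftables", []) if "chain" in item]

print([c["name"] for c in chains(sample)])  # → ['EDPM_INPUT']
```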
Oct 14 09:30:58 np0005486759.ooo.test sudo[172843]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewsyamlwsddnjlvtywrbuvmpvbkrliyn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434258.3089883-149-66787474270619/AnsiballZ_edpm_nftables_from_files.py
Oct 14 09:30:58 np0005486759.ooo.test sudo[172843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:58 np0005486759.ooo.test python3[172845]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 14 09:30:58 np0005486759.ooo.test sudo[172843]: pam_unix(sudo:session): session closed for user root
Oct 14 09:30:59 np0005486759.ooo.test sudo[172935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uosicqguygrlznmfscnzrajgckipebkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434259.1100805-157-12779913583725/AnsiballZ_stat.py
Oct 14 09:30:59 np0005486759.ooo.test sudo[172935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:30:59 np0005486759.ooo.test python3.9[172937]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:30:59 np0005486759.ooo.test sudo[172935]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:00 np0005486759.ooo.test sudo[173010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muwrbraelmcgwcijksjkiaochkvabnkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434259.1100805-157-12779913583725/AnsiballZ_copy.py
Oct 14 09:31:00 np0005486759.ooo.test sudo[173010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:00 np0005486759.ooo.test python3.9[173012]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434259.1100805-157-12779913583725/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:00 np0005486759.ooo.test sudo[173010]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:00 np0005486759.ooo.test sudo[173102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjwwfcsrugktfhxvynclkmovgkeshjby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434260.5550506-172-55989672110226/AnsiballZ_stat.py
Oct 14 09:31:00 np0005486759.ooo.test sudo[173102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:01 np0005486759.ooo.test python3.9[173104]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:01 np0005486759.ooo.test sudo[173102]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:01 np0005486759.ooo.test sudo[173177]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkgpfddxeoykywiogkxvccxxhahyojcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434260.5550506-172-55989672110226/AnsiballZ_copy.py
Oct 14 09:31:01 np0005486759.ooo.test sudo[173177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:01 np0005486759.ooo.test python3.9[173179]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434260.5550506-172-55989672110226/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:01 np0005486759.ooo.test sudo[173177]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:02 np0005486759.ooo.test sudo[173269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aghdvdpzzjbewfyjdfzlygzanpgsgmlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434261.7693334-187-96234284751244/AnsiballZ_stat.py
Oct 14 09:31:02 np0005486759.ooo.test sudo[173269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:02 np0005486759.ooo.test python3.9[173271]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:02 np0005486759.ooo.test sudo[173269]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:02 np0005486759.ooo.test sudo[173344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmibmlheakmosssyppwxlbrhdpzorfal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434261.7693334-187-96234284751244/AnsiballZ_copy.py
Oct 14 09:31:02 np0005486759.ooo.test sudo[173344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:02 np0005486759.ooo.test python3.9[173346]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434261.7693334-187-96234284751244/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:02 np0005486759.ooo.test sudo[173344]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:03 np0005486759.ooo.test sudo[173436]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mclikkwxcgnrjhxicrvwekbgtxttujix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434263.07038-202-68139773889999/AnsiballZ_stat.py
Oct 14 09:31:03 np0005486759.ooo.test sudo[173436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:03 np0005486759.ooo.test python3.9[173438]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:03 np0005486759.ooo.test sudo[173436]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:03 np0005486759.ooo.test sudo[173511]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyvakiasjriwmrsfobaxvstjkzacyows ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434263.07038-202-68139773889999/AnsiballZ_copy.py
Oct 14 09:31:03 np0005486759.ooo.test sudo[173511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:04 np0005486759.ooo.test python3.9[173513]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434263.07038-202-68139773889999/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:04 np0005486759.ooo.test sudo[173511]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18847 DF PROTO=TCP SPT=36470 DPT=9100 SEQ=395370742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85D6C80000000001030307) 
Oct 14 09:31:04 np0005486759.ooo.test sudo[173603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhbilajttzhziufbhodxzyimiajhkdad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434264.3887157-217-224443253548375/AnsiballZ_stat.py
Oct 14 09:31:04 np0005486759.ooo.test sudo[173603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:04 np0005486759.ooo.test python3.9[173605]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:04 np0005486759.ooo.test sudo[173603]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18848 DF PROTO=TCP SPT=36470 DPT=9100 SEQ=395370742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85DAC20000000001030307) 
Oct 14 09:31:05 np0005486759.ooo.test sudo[173678]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhbxrnvfeffmutjylqjqjhqwqccxpwoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434264.3887157-217-224443253548375/AnsiballZ_copy.py
Oct 14 09:31:05 np0005486759.ooo.test sudo[173678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:05 np0005486759.ooo.test python3.9[173680]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434264.3887157-217-224443253548375/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:05 np0005486759.ooo.test sudo[173678]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:06 np0005486759.ooo.test sudo[173770]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnalbnorqahocjeagzfkuwyhlcbwvssl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434265.796634-232-234949420735294/AnsiballZ_file.py
Oct 14 09:31:06 np0005486759.ooo.test sudo[173770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:06 np0005486759.ooo.test python3.9[173772]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:06 np0005486759.ooo.test sudo[173770]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:06 np0005486759.ooo.test sudo[173862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqqiivjlzxlkfapwnqjmlxpizajfepdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434266.3925388-240-124737076418263/AnsiballZ_command.py
Oct 14 09:31:06 np0005486759.ooo.test sudo[173862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:06 np0005486759.ooo.test python3.9[173864]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:06 np0005486759.ooo.test sudo[173862]: pam_unix(sudo:session): session closed for user root
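Editor's note: the check at 09:31:06 concatenates the five EDPM rule fragments in a fixed order and dry-runs the result through `nft -c -f -` before anything is applied. The order matters: chains must be declared before they are flushed, populated, and finally jumped to. A sketch of the same ordering logic (file names taken from the logged command; the `nft` binary is assumed present for the check step):

```python
import subprocess
from pathlib import Path

# Same load order as the logged `cat ... | nft -c -f -` command.
EDPM_ORDER = [
    "edpm-chains.nft",
    "edpm-flushes.nft",
    "edpm-rules.nft",
    "edpm-update-jumps.nft",
    "edpm-jumps.nft",
]

def concat_ruleset(directory):
    """Concatenate the EDPM nft fragments in load order."""
    base = Path(directory)
    return "\n".join((base / name).read_text() for name in EDPM_ORDER)

def check_ruleset(directory):
    """Dry-run the combined ruleset with `nft -c -f -` (needs nftables)."""
    proc = subprocess.run(["nft", "-c", "-f", "-"],
                          input=concat_ruleset(directory), text=True)
    return proc.returncode == 0
```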
Oct 14 09:31:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18849 DF PROTO=TCP SPT=36470 DPT=9100 SEQ=395370742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85E2C10000000001030307) 
Oct 14 09:31:08 np0005486759.ooo.test sudo[173957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fawtwdvjbjkovkvurrqalndoglchbtwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434267.0663793-248-123271417302557/AnsiballZ_blockinfile.py
Oct 14 09:31:08 np0005486759.ooo.test sudo[173957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:08 np0005486759.ooo.test python3.9[173959]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/edpm-chains.nft"
                                                         include "/etc/nftables/edpm-rules.nft"
                                                         include "/etc/nftables/edpm-jumps.nft"
                                                          path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:08 np0005486759.ooo.test sudo[173957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:09 np0005486759.ooo.test sudo[174049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hawdauvxdzjoihwunitkcjktkanptlmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434269.1418517-257-254814939497784/AnsiballZ_command.py
Oct 14 09:31:09 np0005486759.ooo.test sudo[174049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:09 np0005486759.ooo.test python3.9[174051]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:09 np0005486759.ooo.test sudo[174049]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:10 np0005486759.ooo.test sudo[174142]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yibzzmojhooahquymlciuyapmvnrshau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434270.4357302-265-199515161677150/AnsiballZ_stat.py
Oct 14 09:31:10 np0005486759.ooo.test sudo[174142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:10 np0005486759.ooo.test python3.9[174144]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:31:10 np0005486759.ooo.test sudo[174142]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18850 DF PROTO=TCP SPT=36470 DPT=9100 SEQ=395370742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85F2820000000001030307) 
Oct 14 09:31:11 np0005486759.ooo.test sudo[174236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-furdtslnpnbpbluukbowpitlfgessmgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434271.149788-273-232886024736851/AnsiballZ_command.py
Oct 14 09:31:11 np0005486759.ooo.test sudo[174236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:11 np0005486759.ooo.test python3.9[174238]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:11 np0005486759.ooo.test sudo[174236]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:12 np0005486759.ooo.test sudo[174331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qobnuheihvwrjmbioldjarsrmdtuylwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434271.8022947-281-50312815730974/AnsiballZ_file.py
Oct 14 09:31:12 np0005486759.ooo.test sudo[174331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21046 DF PROTO=TCP SPT=45926 DPT=9882 SEQ=1407375546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85F6190000000001030307) 
Oct 14 09:31:12 np0005486759.ooo.test python3.9[174333]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:12 np0005486759.ooo.test sudo[174331]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21047 DF PROTO=TCP SPT=45926 DPT=9882 SEQ=1407375546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F85FA410000000001030307) 
Oct 14 09:31:13 np0005486759.ooo.test python3.9[174423]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:31:14 np0005486759.ooo.test sudo[174514]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyrfiggsdjpvxlytgltwbmjiknizjolx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434274.0243459-321-48067428492225/AnsiballZ_command.py
Oct 14 09:31:14 np0005486759.ooo.test sudo[174514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:14 np0005486759.ooo.test python3.9[174516]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005486759.ooo.test external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:2c:0c:de:0a" external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:14 np0005486759.ooo.test ovs-vsctl[174517]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005486759.ooo.test external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:2c:0c:de:0a external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 14 09:31:14 np0005486759.ooo.test sudo[174514]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:14 np0005486759.ooo.test sudo[174607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxkzandomkjogueqvmqcvnmrfaxpyvjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434274.6815171-330-180494994937858/AnsiballZ_command.py
Oct 14 09:31:14 np0005486759.ooo.test sudo[174607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:15 np0005486759.ooo.test python3.9[174609]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                         ovs-vsctl show | grep -q "Manager"
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:15 np0005486759.ooo.test sudo[174607]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:15 np0005486759.ooo.test python3.9[174702]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:31:16 np0005486759.ooo.test sudo[174794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmwbkeflmwvyloehothycwonttyuxvda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434276.0816498-348-277131109234851/AnsiballZ_file.py
Oct 14 09:31:16 np0005486759.ooo.test sudo[174794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:16 np0005486759.ooo.test python3.9[174796]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:31:16 np0005486759.ooo.test sudo[174794]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:16 np0005486759.ooo.test sudo[174886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryiaushtmewsbynvpwdsldnipebvjxaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434276.738082-356-278312633536849/AnsiballZ_stat.py
Oct 14 09:31:16 np0005486759.ooo.test sudo[174886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7970 DF PROTO=TCP SPT=42782 DPT=9105 SEQ=2456401529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8608C10000000001030307) 
Oct 14 09:31:17 np0005486759.ooo.test python3.9[174888]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:17 np0005486759.ooo.test sudo[174886]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:17 np0005486759.ooo.test sudo[174934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsgcusdozbeztazvizmcfnlkdipcuuez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434276.738082-356-278312633536849/AnsiballZ_file.py
Oct 14 09:31:17 np0005486759.ooo.test sudo[174934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:17 np0005486759.ooo.test python3.9[174936]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:31:17 np0005486759.ooo.test sudo[174934]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:17 np0005486759.ooo.test sudo[175026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svnhebtvozkkcqycplatzxamcvsvvkoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434277.7059965-356-110482142267583/AnsiballZ_stat.py
Oct 14 09:31:17 np0005486759.ooo.test sudo[175026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:18 np0005486759.ooo.test python3.9[175028]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:18 np0005486759.ooo.test sudo[175026]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:18 np0005486759.ooo.test sudo[175074]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idjncfsliclayccecwrmkmcsbmtvufcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434277.7059965-356-110482142267583/AnsiballZ_file.py
Oct 14 09:31:18 np0005486759.ooo.test sudo[175074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:18 np0005486759.ooo.test python3.9[175076]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:31:18 np0005486759.ooo.test sudo[175074]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:19 np0005486759.ooo.test sudo[175166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyxgcvlresjwyobwaoxpezswagoddqzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434278.8137202-379-250600648601522/AnsiballZ_file.py
Oct 14 09:31:19 np0005486759.ooo.test sudo[175166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:19 np0005486759.ooo.test python3.9[175168]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:19 np0005486759.ooo.test sudo[175166]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21049 DF PROTO=TCP SPT=45926 DPT=9882 SEQ=1407375546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8612010000000001030307) 
Oct 14 09:31:20 np0005486759.ooo.test sudo[175258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwrxeveshpzfzoxpvvhhybnepsihcnym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434279.522551-387-19251386623369/AnsiballZ_stat.py
Oct 14 09:31:20 np0005486759.ooo.test sudo[175258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:20 np0005486759.ooo.test python3.9[175260]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:20 np0005486759.ooo.test sudo[175258]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:20 np0005486759.ooo.test sudo[175306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyjxretcjkxobvmihahfocnfevphpmkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434279.522551-387-19251386623369/AnsiballZ_file.py
Oct 14 09:31:20 np0005486759.ooo.test sudo[175306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:21 np0005486759.ooo.test python3.9[175308]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:21 np0005486759.ooo.test sudo[175306]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:21 np0005486759.ooo.test sudo[175398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pesvccivqasrkapwcbvkpvafbfawxaun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434281.3502634-399-100166672151050/AnsiballZ_stat.py
Oct 14 09:31:21 np0005486759.ooo.test sudo[175398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:21 np0005486759.ooo.test python3.9[175400]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:21 np0005486759.ooo.test sudo[175398]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7021 DF PROTO=TCP SPT=56132 DPT=9102 SEQ=3476887677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F861E410000000001030307) 
Oct 14 09:31:23 np0005486759.ooo.test sudo[175446]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fffvglmlyumkjwjfzkyfdomlycavwwus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434281.3502634-399-100166672151050/AnsiballZ_file.py
Oct 14 09:31:23 np0005486759.ooo.test sudo[175446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:23 np0005486759.ooo.test python3.9[175448]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:23 np0005486759.ooo.test sudo[175446]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:23 np0005486759.ooo.test sudo[175538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpytnbzrdyrivmfzfmbeotwqzyrdnnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434283.5285757-411-65426736557053/AnsiballZ_systemd.py
Oct 14 09:31:23 np0005486759.ooo.test sudo[175538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:24 np0005486759.ooo.test python3.9[175540]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:31:24 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:31:24 np0005486759.ooo.test systemd-sysv-generator[175566]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:31:24 np0005486759.ooo.test systemd-rc-local-generator[175561]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:31:24 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:31:24 np0005486759.ooo.test sudo[175538]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:24 np0005486759.ooo.test sudo[175668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqchmvlsbagrsncvhmneyslergodcytm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434284.6568596-419-213110959720499/AnsiballZ_stat.py
Oct 14 09:31:24 np0005486759.ooo.test sudo[175668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:25 np0005486759.ooo.test python3.9[175670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:25 np0005486759.ooo.test sudo[175668]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:25 np0005486759.ooo.test sudo[175716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byiicwolftfswrvrvotoibcusgmcgjjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434284.6568596-419-213110959720499/AnsiballZ_file.py
Oct 14 09:31:25 np0005486759.ooo.test sudo[175716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:25 np0005486759.ooo.test python3.9[175718]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:25 np0005486759.ooo.test sudo[175716]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:25 np0005486759.ooo.test sudo[175808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twtdwdepuvwqfcvlqwfdscndarzegtji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434285.7475224-431-117885866804999/AnsiballZ_stat.py
Oct 14 09:31:25 np0005486759.ooo.test sudo[175808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:26 np0005486759.ooo.test python3.9[175810]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:26 np0005486759.ooo.test sudo[175808]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:26 np0005486759.ooo.test sudo[175856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnzrrsnfbnakvwmmllikkifzakjsuczc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434285.7475224-431-117885866804999/AnsiballZ_file.py
Oct 14 09:31:26 np0005486759.ooo.test sudo[175856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7022 DF PROTO=TCP SPT=56132 DPT=9102 SEQ=3476887677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F862E010000000001030307) 
Oct 14 09:31:26 np0005486759.ooo.test python3.9[175858]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:26 np0005486759.ooo.test sudo[175856]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:27 np0005486759.ooo.test sudo[175948]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzvmzjpebkafwvwnkwfiyhpudoxoefdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434286.7693703-443-97804950235709/AnsiballZ_systemd.py
Oct 14 09:31:27 np0005486759.ooo.test sudo[175948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:27 np0005486759.ooo.test python3.9[175950]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:31:27 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:31:27 np0005486759.ooo.test systemd-rc-local-generator[175974]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:31:27 np0005486759.ooo.test systemd-sysv-generator[175981]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:31:27 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:31:27 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:31:27 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:31:27 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:31:27 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:31:27 np0005486759.ooo.test sudo[175948]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:28 np0005486759.ooo.test sudo[176082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwmrkscmogfayyslilcnmohpgqotfhah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434287.9501066-453-226319931566362/AnsiballZ_file.py
Oct 14 09:31:28 np0005486759.ooo.test sudo[176082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:28 np0005486759.ooo.test python3.9[176084]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:31:28 np0005486759.ooo.test sudo[176082]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:28 np0005486759.ooo.test sudo[176174]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tinhxmqnosihswsiihtwknbdehuxiypn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434288.5576165-461-147004893584810/AnsiballZ_stat.py
Oct 14 09:31:28 np0005486759.ooo.test sudo[176174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:29 np0005486759.ooo.test python3.9[176176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:29 np0005486759.ooo.test sudo[176174]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:29 np0005486759.ooo.test sudo[176247]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jemanlrgatolkralexoxjxxknnmszsnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434288.5576165-461-147004893584810/AnsiballZ_copy.py
Oct 14 09:31:29 np0005486759.ooo.test sudo[176247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:29 np0005486759.ooo.test python3.9[176249]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434288.5576165-461-147004893584810/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:31:29 np0005486759.ooo.test sudo[176247]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:30 np0005486759.ooo.test sudo[176339]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eunpiaqyxkgqhaalwhfabpadhbsrnxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434289.8735416-478-127916309225457/AnsiballZ_file.py
Oct 14 09:31:30 np0005486759.ooo.test sudo[176339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:30 np0005486759.ooo.test python3.9[176341]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:31:30 np0005486759.ooo.test sudo[176339]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:30 np0005486759.ooo.test sudo[176431]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksrrrxqdbkjlzziqtcqmwsvtkaygdjsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434290.531113-486-868374332609/AnsiballZ_stat.py
Oct 14 09:31:30 np0005486759.ooo.test sudo[176431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:31 np0005486759.ooo.test python3.9[176433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:31:31 np0005486759.ooo.test sudo[176431]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:31 np0005486759.ooo.test sudo[176506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tahxaczvmsczfzybhmhfonhkjvriqrij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434290.531113-486-868374332609/AnsiballZ_copy.py
Oct 14 09:31:31 np0005486759.ooo.test sudo[176506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:31 np0005486759.ooo.test python3.9[176508]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434290.531113-486-868374332609/.source.json _original_basename=.8h1zr_m5 follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:31 np0005486759.ooo.test sudo[176506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:32 np0005486759.ooo.test sudo[176598]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fehpwefumjudmigswgstvajowzxgnqta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434291.9596775-501-10612359062621/AnsiballZ_file.py
Oct 14 09:31:32 np0005486759.ooo.test sudo[176598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:32 np0005486759.ooo.test python3.9[176600]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:32 np0005486759.ooo.test sudo[176598]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:33 np0005486759.ooo.test sudo[176690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkemwnsblxtfycaczzswysqzlfebbqco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434292.7296453-509-103443502466033/AnsiballZ_stat.py
Oct 14 09:31:33 np0005486759.ooo.test sudo[176690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:33 np0005486759.ooo.test sudo[176690]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62569 DF PROTO=TCP SPT=47038 DPT=9100 SEQ=1833335199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F864BF80000000001030307) 
Oct 14 09:31:34 np0005486759.ooo.test sudo[176763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfxzdopeaoihsbrnifguwpmvwujjdzhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434292.7296453-509-103443502466033/AnsiballZ_copy.py
Oct 14 09:31:34 np0005486759.ooo.test sudo[176763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:34 np0005486759.ooo.test sudo[176763]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62570 DF PROTO=TCP SPT=47038 DPT=9100 SEQ=1833335199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8650010000000001030307) 
Oct 14 09:31:35 np0005486759.ooo.test sudo[176855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljjhvexssaprfjgnikspirgkbvyizdtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434294.9354777-526-270635201355006/AnsiballZ_container_config_data.py
Oct 14 09:31:35 np0005486759.ooo.test sudo[176855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:35 np0005486759.ooo.test python3.9[176857]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 14 09:31:35 np0005486759.ooo.test sudo[176855]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62571 DF PROTO=TCP SPT=47038 DPT=9100 SEQ=1833335199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8658010000000001030307) 
Oct 14 09:31:37 np0005486759.ooo.test sudo[176947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyttrdoeinwhfeiwwxlvwdngpauiwfxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434295.8897882-535-170657603195332/AnsiballZ_container_config_hash.py
Oct 14 09:31:37 np0005486759.ooo.test sudo[176947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:37 np0005486759.ooo.test python3.9[176949]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:31:37 np0005486759.ooo.test sudo[176947]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:38 np0005486759.ooo.test sudo[177039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkpofnxmzpkvequmeewvgqcqdkjnmqtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434297.9359105-544-205845879563739/AnsiballZ_podman_container_info.py
Oct 14 09:31:38 np0005486759.ooo.test sudo[177039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:38 np0005486759.ooo.test python3.9[177041]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:31:39 np0005486759.ooo.test sudo[177039]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:41 np0005486759.ooo.test sudo[177157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdmszahbfkjdbbmuoaipejgvfhnjpqhh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434300.579584-557-261654074083862/AnsiballZ_edpm_container_manage.py
Oct 14 09:31:41 np0005486759.ooo.test sudo[177157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62572 DF PROTO=TCP SPT=47038 DPT=9100 SEQ=1833335199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8667C10000000001030307) 
Oct 14 09:31:41 np0005486759.ooo.test python3[177159]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:31:41 np0005486759.ooo.test python3[177159]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "8808c3fcdd35e5a4eacb6d3f5ed89688361f4338056395008c191e57b6afaf7d",
                                                                 "Digest": "sha256:31464fe4defe28fe4896a946cfe50ee0b001d1a03081174d9f69e4a313b0f21e",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:31464fe4defe28fe4896a946cfe50ee0b001d1a03081174d9f69e4a313b0f21e"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-13T13:00:39.999290816Z",
                                                                 "Config": {
                                                                      "User": "root",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 345598922,
                                                                 "VirtualSize": 345598922,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",
                                                                           "sha256:941d6c62fda0ad5502f66ca2e71ffe6e3f64b2a5a0db75dac0075fa750a883f2",
                                                                           "sha256:a82e45bff332403f46d24749948c917d1a37ea0b8ab922688da4f6038dc99c66"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "root",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843286399Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843354051Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843394192Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843417133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843442193Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843461914Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:43.236856724Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:17.539596691Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.007092512Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.334560883Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.713915587Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.426474494Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.742526819Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.072068096Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.376327744Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.639696917Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.946940986Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.329166855Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.709072452Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.066214819Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.407947122Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.744473297Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.044338828Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.376253048Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:29.890793292Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.186632274Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.418527973Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:31.913162322Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817436155Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817485046Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817496507Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817505987Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:34.821748777Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:04.283215401Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:32:11.363313509Z",
                                                                           "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:32:28.220191415Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:59:59.449268085Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T13:00:39.99718509Z",
                                                                           "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T13:00:41.283456225Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct 14 09:31:41 np0005486759.ooo.test podman[177210]: 2025-10-14 09:31:41.743217929 +0000 UTC m=+0.079664973 container remove c602b8ad828f0873d520121fb45f096ecd1218871a3f52cf118e3dabdc3046c6 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, container_name=ovn_controller, vcs-ref=f1f0bbd48091f4ceb6d7f5422dfd17725d070245, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ovn-controller/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, build-date=2025-07-21T13:28:44, io.openshift.tags=rhosp osp openstack osp-17.1, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Oct 14 09:31:41 np0005486759.ooo.test python3[177159]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Oct 14 09:31:41 np0005486759.ooo.test podman[177224]: 
Oct 14 09:31:41 np0005486759.ooo.test podman[177224]: 2025-10-14 09:31:41.823086948 +0000 UTC m=+0.065949328 container create 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct 14 09:31:41 np0005486759.ooo.test podman[177224]: 2025-10-14 09:31:41.785933636 +0000 UTC m=+0.028796026 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct 14 09:31:41 np0005486759.ooo.test python3[177159]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct 14 09:31:41 np0005486759.ooo.test sudo[177157]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24179 DF PROTO=TCP SPT=34068 DPT=9882 SEQ=910511605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F866B480000000001030307) 
Oct 14 09:31:42 np0005486759.ooo.test sudo[177350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejgxjujwqgspttvlcrjiwcvoztznixrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434302.15191-565-61740067721045/AnsiballZ_stat.py
Oct 14 09:31:42 np0005486759.ooo.test sudo[177350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:42 np0005486759.ooo.test python3.9[177352]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:31:42 np0005486759.ooo.test sudo[177350]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:43 np0005486759.ooo.test sudo[177444]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifffsmgftmfmkaitmcltwdpecckdwaqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434302.9279466-574-101061882104959/AnsiballZ_file.py
Oct 14 09:31:43 np0005486759.ooo.test sudo[177444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24180 DF PROTO=TCP SPT=34068 DPT=9882 SEQ=910511605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F866F410000000001030307) 
Oct 14 09:31:43 np0005486759.ooo.test python3.9[177446]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:43 np0005486759.ooo.test sudo[177444]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:43 np0005486759.ooo.test sudo[177490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrrrqvmileoeerrdvyynoepaqomqbuhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434302.9279466-574-101061882104959/AnsiballZ_stat.py
Oct 14 09:31:43 np0005486759.ooo.test sudo[177490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:43 np0005486759.ooo.test python3.9[177492]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:31:43 np0005486759.ooo.test sudo[177490]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:44 np0005486759.ooo.test sudo[177581]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuqmtgjanukapujfriejdnrqmnnbhhgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434304.0154555-574-67009788773486/AnsiballZ_copy.py
Oct 14 09:31:44 np0005486759.ooo.test sudo[177581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:44 np0005486759.ooo.test python3.9[177583]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434304.0154555-574-67009788773486/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:31:44 np0005486759.ooo.test sudo[177581]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:44 np0005486759.ooo.test sudo[177627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrvhhkdizwtkgrwrhhebwoxztshdixxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434304.0154555-574-67009788773486/AnsiballZ_systemd.py
Oct 14 09:31:44 np0005486759.ooo.test sudo[177627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:45 np0005486759.ooo.test python3.9[177629]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:31:45 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:31:45 np0005486759.ooo.test systemd-rc-local-generator[177650]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:31:45 np0005486759.ooo.test systemd-sysv-generator[177655]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:31:45 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:31:45 np0005486759.ooo.test sudo[177627]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:45 np0005486759.ooo.test sudo[177708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvozqpzteugdekweyfdofzvgtfybzwtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434304.0154555-574-67009788773486/AnsiballZ_systemd.py
Oct 14 09:31:45 np0005486759.ooo.test sudo[177708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:46 np0005486759.ooo.test python3.9[177710]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:31:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56533 DF PROTO=TCP SPT=53162 DPT=9105 SEQ=3030501641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F867E010000000001030307) 
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:31:47 np0005486759.ooo.test systemd-rc-local-generator[177739]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:31:47 np0005486759.ooo.test systemd-sysv-generator[177743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Starting ovn_controller container...
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:31:47 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222ee45f389fe57e819fa78df7171c155a5600d0fa30b02f5d79ae23021c0c5b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:31:47 np0005486759.ooo.test podman[177752]: 2025-10-14 09:31:47.757935149 +0000 UTC m=+0.168164489 container init 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 09:31:47 np0005486759.ooo.test ovn_controller[177766]: + sudo -E kolla_set_configs
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:31:47 np0005486759.ooo.test podman[177752]: 2025-10-14 09:31:47.805682282 +0000 UTC m=+0.215911652 container start 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:31:47 np0005486759.ooo.test edpm-start-podman-container[177752]: ovn_controller
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 0.
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 14 09:31:47 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 0...
Oct 14 09:31:47 np0005486759.ooo.test systemd[177795]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:31:47 np0005486759.ooo.test podman[177773]: 2025-10-14 09:31:47.92260595 +0000 UTC m=+0.109342264 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct 14 09:31:47 np0005486759.ooo.test edpm-start-podman-container[177751]: Creating additional drop-in dependency for "ovn_controller" (1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b)
Oct 14 09:31:48 np0005486759.ooo.test podman[177773]: 2025-10-14 09:31:48.004339256 +0000 UTC m=+0.191075610 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:31:48 np0005486759.ooo.test podman[177773]: unhealthy
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Queued start job for default target Main User Target.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Created slice User Application Slice.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Reached target Paths.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Reached target Timers.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Starting D-Bus User Message Bus Socket...
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Starting Create User's Volatile Files and Directories...
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Listening on D-Bus User Message Bus Socket.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Finished Create User's Volatile Files and Directories.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Reached target Sockets.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Reached target Basic System.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Reached target Main User Target.
Oct 14 09:31:48 np0005486759.ooo.test systemd[177795]: Startup finished in 145ms.
Oct 14 09:31:48 np0005486759.ooo.test systemd-rc-local-generator[177853]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:31:48 np0005486759.ooo.test systemd-sysv-generator[177856]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: Started User Manager for UID 0.
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: Started ovn_controller container.
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Failed with result 'exit-code'.
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: Started Session c13 of User root.
Oct 14 09:31:48 np0005486759.ooo.test sudo[177708]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: INFO:__main__:Validating config file
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: INFO:__main__:Writing out command to execute
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: session-c13.scope: Deactivated successfully.
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: ++ cat /run_command
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + ARGS=
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + sudo kolla_copy_cacerts
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: Started Session c14 of User root.
Oct 14 09:31:48 np0005486759.ooo.test systemd[1]: session-c14.scope: Deactivated successfully.
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + [[ ! -n '' ]]
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + . kolla_extend_start
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + umask 0022
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00011|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00012|features|INFO|OVS Feature: ct_flush, state: supported
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00020|binding|INFO|Claiming lport eee08de8-f983-4ebe-a654-f67f48659e50 for this chassis.
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00021|binding|INFO|eee08de8-f983-4ebe-a654-f67f48659e50: Claiming fa:16:3e:8e:cf:16 192.168.0.173
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00022|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00023|binding|INFO|Removing lport eee08de8-f983-4ebe-a654-f67f48659e50 ovn-installed in OVS
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 14 09:31:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:48Z|00026|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:31:48 np0005486759.ooo.test sudo[177961]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unhwnuvrzmdvwsobbsgwckpvtmbfgjwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434308.466525-602-158510593474454/AnsiballZ_command.py
Oct 14 09:31:48 np0005486759.ooo.test sudo[177961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:48 np0005486759.ooo.test python3.9[177963]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:48 np0005486759.ooo.test ovs-vsctl[177964]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 14 09:31:49 np0005486759.ooo.test sudo[177961]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24182 DF PROTO=TCP SPT=34068 DPT=9882 SEQ=910511605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8687010000000001030307) 
Oct 14 09:31:49 np0005486759.ooo.test sudo[178054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-octtmvfsbvbovkfngvgjyxgbixhialux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434309.1602428-610-8095686583226/AnsiballZ_command.py
Oct 14 09:31:49 np0005486759.ooo.test sudo[178054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:49 np0005486759.ooo.test python3.9[178056]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:49 np0005486759.ooo.test sudo[178054]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:50 np0005486759.ooo.test sudo[178149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umjdlfpdmedkrowgsdwijhmubvgxphei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434310.639491-624-243464799083741/AnsiballZ_command.py
Oct 14 09:31:50 np0005486759.ooo.test sudo[178149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:31:51 np0005486759.ooo.test python3.9[178151]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:31:51 np0005486759.ooo.test ovs-vsctl[178152]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 14 09:31:51 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:51Z|00027|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:31:51 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:51Z|00028|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:31:51 np0005486759.ooo.test sudo[178149]: pam_unix(sudo:session): session closed for user root
Oct 14 09:31:51 np0005486759.ooo.test sshd[171349]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:31:51 np0005486759.ooo.test systemd[1]: session-32.scope: Deactivated successfully.
Oct 14 09:31:51 np0005486759.ooo.test systemd[1]: session-32.scope: Consumed 40.452s CPU time.
Oct 14 09:31:51 np0005486759.ooo.test systemd-logind[759]: Session 32 logged out. Waiting for processes to exit.
Oct 14 09:31:51 np0005486759.ooo.test systemd-logind[759]: Removed session 32.
Oct 14 09:31:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60633 DF PROTO=TCP SPT=46310 DPT=9102 SEQ=196505894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8693810000000001030307) 
Oct 14 09:31:56 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:56Z|00029|binding|INFO|Setting lport eee08de8-f983-4ebe-a654-f67f48659e50 ovn-installed in OVS
Oct 14 09:31:56 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:56Z|00030|binding|INFO|Setting lport eee08de8-f983-4ebe-a654-f67f48659e50 up in Southbound
Oct 14 09:31:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60634 DF PROTO=TCP SPT=46310 DPT=9102 SEQ=196505894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86A3410000000001030307) 
Oct 14 09:31:57 np0005486759.ooo.test sshd[178168]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:31:57 np0005486759.ooo.test sshd[178168]: Accepted publickey for zuul from 192.168.122.31 port 54314 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:31:57 np0005486759.ooo.test systemd-logind[759]: New session 34 of user zuul.
Oct 14 09:31:57 np0005486759.ooo.test systemd[1]: Started Session 34 of User zuul.
Oct 14 09:31:57 np0005486759.ooo.test sshd[178168]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:31:58 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:58Z|00031|memory|INFO|20904 kB peak resident set size after 10.0 seconds
Oct 14 09:31:58 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:31:58Z|00032|memory|INFO|idl-cells-OVN_Southbound:4091 idl-cells-Open_vSwitch:874 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:84 lflow-cache-entries-cache-matches:196 lflow-cache-size-KB:306 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:153 ofctrl_installed_flow_usage-KB:112 ofctrl_sb_flow_ref_usage-KB:67
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 0...
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Activating special unit Exit the Session...
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Stopped target Main User Target.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Stopped target Basic System.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Stopped target Paths.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Stopped target Sockets.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Stopped target Timers.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Closed D-Bus User Message Bus Socket.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Stopped Create User's Volatile Files and Directories.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Removed slice User Application Slice.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Reached target Shutdown.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Finished Exit the Session.
Oct 14 09:31:58 np0005486759.ooo.test systemd[177795]: Reached target Exit the Session.
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: user@0.service: Deactivated successfully.
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 0.
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 09:31:58 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 0.
Oct 14 09:31:58 np0005486759.ooo.test python3.9[178263]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:32:00 np0005486759.ooo.test sudo[178357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibmfcjtynoeutnnjemwcnnthkzswmvvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434319.5584497-34-29982592860850/AnsiballZ_file.py
Oct 14 09:32:00 np0005486759.ooo.test sudo[178357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:00 np0005486759.ooo.test python3.9[178359]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:00 np0005486759.ooo.test sudo[178357]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:00 np0005486759.ooo.test sudo[178449]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvojeyvauupjoxfwntdqbhulyzsstskc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434320.3444376-34-151055099375006/AnsiballZ_file.py
Oct 14 09:32:00 np0005486759.ooo.test sudo[178449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:00 np0005486759.ooo.test python3.9[178451]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:00 np0005486759.ooo.test sudo[178449]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:01 np0005486759.ooo.test sudo[178541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmacyqjsxwxciobiebdtopdwqpttfbnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434320.9822714-34-270149540868946/AnsiballZ_file.py
Oct 14 09:32:01 np0005486759.ooo.test sudo[178541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:01 np0005486759.ooo.test python3.9[178543]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:01 np0005486759.ooo.test sudo[178541]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:01 np0005486759.ooo.test sudo[178633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvkzkxqummspeoztvdfyuspiiwjxoxzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434321.6066558-34-229666767852646/AnsiballZ_file.py
Oct 14 09:32:01 np0005486759.ooo.test sudo[178633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:02 np0005486759.ooo.test python3.9[178635]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:02 np0005486759.ooo.test sudo[178633]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:02 np0005486759.ooo.test sudo[178725]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oywqdvkhdjmlbmwiwmgpvgrwrloejnot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434322.2492926-34-161327796080479/AnsiballZ_file.py
Oct 14 09:32:02 np0005486759.ooo.test sudo[178725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:02 np0005486759.ooo.test python3.9[178727]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:02 np0005486759.ooo.test sudo[178725]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:03 np0005486759.ooo.test python3.9[178817]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:32:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16153 DF PROTO=TCP SPT=48424 DPT=9100 SEQ=1867090387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86C1280000000001030307) 
Oct 14 09:32:04 np0005486759.ooo.test sudo[178907]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dibacuifagmbjtjdqbjeqyjbitkbpxve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434323.6100795-78-226266900347361/AnsiballZ_seboolean.py
Oct 14 09:32:04 np0005486759.ooo.test sudo[178907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:04 np0005486759.ooo.test python3.9[178909]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 14 09:32:04 np0005486759.ooo.test sudo[178907]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16154 DF PROTO=TCP SPT=48424 DPT=9100 SEQ=1867090387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86C5420000000001030307) 
Oct 14 09:32:05 np0005486759.ooo.test python3.9[178999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:06 np0005486759.ooo.test python3.9[179072]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434325.1105309-86-30807729797695/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16155 DF PROTO=TCP SPT=48424 DPT=9100 SEQ=1867090387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86CD420000000001030307) 
Oct 14 09:32:08 np0005486759.ooo.test python3.9[179163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:08 np0005486759.ooo.test python3.9[179236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434327.6427224-101-263790058176621/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:09 np0005486759.ooo.test sudo[179326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzbkhbaijjcdqfpexhrrpfccvvtuujhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434328.8209522-118-238391329591200/AnsiballZ_setup.py
Oct 14 09:32:09 np0005486759.ooo.test sudo[179326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:09 np0005486759.ooo.test python3.9[179328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:32:09 np0005486759.ooo.test sudo[179326]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:10 np0005486759.ooo.test sudo[179380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzebiihpzlojlzncmqyzavwjsnmpkwpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434328.8209522-118-238391329591200/AnsiballZ_dnf.py
Oct 14 09:32:10 np0005486759.ooo.test sudo[179380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:10 np0005486759.ooo.test python3.9[179382]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:32:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16156 DF PROTO=TCP SPT=48424 DPT=9100 SEQ=1867090387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86DD010000000001030307) 
Oct 14 09:32:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38218 DF PROTO=TCP SPT=33740 DPT=9882 SEQ=3489208767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86E0780000000001030307) 
Oct 14 09:32:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38219 DF PROTO=TCP SPT=33740 DPT=9882 SEQ=3489208767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86E4810000000001030307) 
Oct 14 09:32:13 np0005486759.ooo.test sudo[179380]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:14 np0005486759.ooo.test sudo[179474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enlntzyxmlgrhjvyxmwcdczofvckqlaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434333.7024832-130-118044111681873/AnsiballZ_systemd.py
Oct 14 09:32:14 np0005486759.ooo.test sudo[179474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:14 np0005486759.ooo.test python3.9[179476]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:32:14 np0005486759.ooo.test sudo[179474]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:15 np0005486759.ooo.test python3.9[179569]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:15 np0005486759.ooo.test python3.9[179640]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434334.8480806-138-233808292258664/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:32:16Z|00033|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:32:16 np0005486759.ooo.test python3.9[179730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:16 np0005486759.ooo.test python3.9[179801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434335.91684-138-242153557948362/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56416 DF PROTO=TCP SPT=44962 DPT=9105 SEQ=1500431860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86F3420000000001030307) 
Oct 14 09:32:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:32:18 np0005486759.ooo.test podman[179875]: 2025-10-14 09:32:18.450597367 +0000 UTC m=+0.079091574 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller)
Oct 14 09:32:18 np0005486759.ooo.test podman[179875]: 2025-10-14 09:32:18.551263336 +0000 UTC m=+0.179757483 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:32:18 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:32:18 np0005486759.ooo.test python3.9[179918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:32:19Z|00034|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:32:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38221 DF PROTO=TCP SPT=33740 DPT=9882 SEQ=3489208767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F86FC410000000001030307) 
Oct 14 09:32:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:32:19Z|00035|ovn_bfd|INFO|Disabled BFD on interface ovn-7f1701-0
Oct 14 09:32:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:32:19Z|00036|ovn_bfd|INFO|Disabled BFD on interface ovn-0d47b9-0
Oct 14 09:32:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:32:19Z|00037|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:32:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:32:19Z|00038|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:32:19 np0005486759.ooo.test python3.9[179989]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434337.7035708-182-168204647994542/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:20 np0005486759.ooo.test python3.9[180080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:20 np0005486759.ooo.test python3.9[180151]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434339.6036563-182-177968111735210/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:21 np0005486759.ooo.test python3.9[180241]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:32:22 np0005486759.ooo.test sudo[180333]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uznvgoufenrrisqntybocsvcenuxjzqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434342.1281931-220-12137096657788/AnsiballZ_file.py
Oct 14 09:32:22 np0005486759.ooo.test sudo[180333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:22 np0005486759.ooo.test python3.9[180335]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:22 np0005486759.ooo.test sudo[180333]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18787 DF PROTO=TCP SPT=45094 DPT=9102 SEQ=1145019587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8708C10000000001030307) 
Oct 14 09:32:22 np0005486759.ooo.test sudo[180425]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccajduwoyqtfzypszhfpgvljmagythkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434342.6694574-228-193886258505512/AnsiballZ_stat.py
Oct 14 09:32:22 np0005486759.ooo.test sudo[180425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:23 np0005486759.ooo.test python3.9[180427]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:23 np0005486759.ooo.test sudo[180425]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:23 np0005486759.ooo.test sudo[180473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucbzlrneixkokjyjwddpsgpcmipbnbfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434342.6694574-228-193886258505512/AnsiballZ_file.py
Oct 14 09:32:23 np0005486759.ooo.test sudo[180473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:23 np0005486759.ooo.test python3.9[180475]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:23 np0005486759.ooo.test sudo[180473]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:23 np0005486759.ooo.test sudo[180565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clnzrcizbxaspsuhrnmbrysuiaulabep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434343.6756256-228-165529162083926/AnsiballZ_stat.py
Oct 14 09:32:23 np0005486759.ooo.test sudo[180565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:24 np0005486759.ooo.test python3.9[180567]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:24 np0005486759.ooo.test sudo[180565]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:24 np0005486759.ooo.test sudo[180613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unrhpegcistgvnsbifuzttgsiyuixshj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434343.6756256-228-165529162083926/AnsiballZ_file.py
Oct 14 09:32:24 np0005486759.ooo.test sudo[180613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:24 np0005486759.ooo.test python3.9[180615]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:24 np0005486759.ooo.test sudo[180613]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:25 np0005486759.ooo.test sudo[180705]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkbjptmblbqkldqhtdcyhlydczesgyji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434344.772994-251-46577636737040/AnsiballZ_file.py
Oct 14 09:32:25 np0005486759.ooo.test sudo[180705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:25 np0005486759.ooo.test python3.9[180707]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:25 np0005486759.ooo.test sudo[180705]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:25 np0005486759.ooo.test sudo[180797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mquormqnpgzqetdagitcticiaqjuxmdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434345.442092-259-237974566568082/AnsiballZ_stat.py
Oct 14 09:32:25 np0005486759.ooo.test sudo[180797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:25 np0005486759.ooo.test python3.9[180799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:25 np0005486759.ooo.test sudo[180797]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:26 np0005486759.ooo.test sudo[180845]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvvboybvzyqwplcsfzgotkyihblzyumc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434345.442092-259-237974566568082/AnsiballZ_file.py
Oct 14 09:32:26 np0005486759.ooo.test sudo[180845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:26 np0005486759.ooo.test python3.9[180847]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:26 np0005486759.ooo.test sudo[180845]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:26 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:32:26Z|00039|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 14 09:32:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18788 DF PROTO=TCP SPT=45094 DPT=9102 SEQ=1145019587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8718810000000001030307) 
Oct 14 09:32:26 np0005486759.ooo.test sudo[180937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkyyqljubkoojnpgxxdoifnxqaqjmrch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434346.568137-271-83826861880595/AnsiballZ_stat.py
Oct 14 09:32:26 np0005486759.ooo.test sudo[180937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:27 np0005486759.ooo.test python3.9[180939]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:27 np0005486759.ooo.test sudo[180937]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:27 np0005486759.ooo.test sudo[180985]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnmjeirvrphnjljkperktosagqprlicb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434346.568137-271-83826861880595/AnsiballZ_file.py
Oct 14 09:32:27 np0005486759.ooo.test sudo[180985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:27 np0005486759.ooo.test python3.9[180987]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:27 np0005486759.ooo.test sudo[180985]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:27 np0005486759.ooo.test sudo[181077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibydcdedobgnmlnacbyvkxuetzssvwga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434347.6948256-283-49208075735665/AnsiballZ_systemd.py
Oct 14 09:32:27 np0005486759.ooo.test sudo[181077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:28 np0005486759.ooo.test python3.9[181079]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:32:28 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:32:28 np0005486759.ooo.test systemd-rc-local-generator[181104]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:32:28 np0005486759.ooo.test systemd-sysv-generator[181108]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:32:28 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:32:28 np0005486759.ooo.test sudo[181077]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:29 np0005486759.ooo.test sudo[181206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnsbabtzotqitiiqzcemtmrjnfuwpmfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434348.7333558-291-115019258924727/AnsiballZ_stat.py
Oct 14 09:32:29 np0005486759.ooo.test sudo[181206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:29 np0005486759.ooo.test python3.9[181208]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:29 np0005486759.ooo.test sudo[181206]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:29 np0005486759.ooo.test sudo[181254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjqjdisznmcrvqjyevlfwsbkonqhlglf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434348.7333558-291-115019258924727/AnsiballZ_file.py
Oct 14 09:32:29 np0005486759.ooo.test sudo[181254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:29 np0005486759.ooo.test python3.9[181256]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:29 np0005486759.ooo.test sudo[181254]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:30 np0005486759.ooo.test sudo[181346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kguexfmxxqugquwjlxhejzlxivvlanev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434349.8991356-303-67734341452985/AnsiballZ_stat.py
Oct 14 09:32:30 np0005486759.ooo.test sudo[181346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:30 np0005486759.ooo.test python3.9[181348]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:30 np0005486759.ooo.test sudo[181346]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:30 np0005486759.ooo.test sudo[181394]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aadnxjwaksmevzemfifqgsxlcspakhmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434349.8991356-303-67734341452985/AnsiballZ_file.py
Oct 14 09:32:30 np0005486759.ooo.test sudo[181394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:30 np0005486759.ooo.test python3.9[181396]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:30 np0005486759.ooo.test sudo[181394]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:31 np0005486759.ooo.test sudo[181486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmmggfksrjgdvhzfmqfdwspqgckpctbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434351.0046756-315-273527134571196/AnsiballZ_systemd.py
Oct 14 09:32:31 np0005486759.ooo.test sudo[181486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:31 np0005486759.ooo.test python3.9[181488]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:32:31 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:32:31 np0005486759.ooo.test systemd-rc-local-generator[181511]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:32:31 np0005486759.ooo.test systemd-sysv-generator[181519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:32:31 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:32:32 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:32:32 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:32:32 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:32:32 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:32:32 np0005486759.ooo.test sudo[181486]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:33 np0005486759.ooo.test sudo[181620]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqweormavsitaybsaxttbnzgavieokew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434353.4295337-325-125134036354436/AnsiballZ_file.py
Oct 14 09:32:33 np0005486759.ooo.test sudo[181620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:33 np0005486759.ooo.test python3.9[181622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:33 np0005486759.ooo.test sudo[181620]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44731 DF PROTO=TCP SPT=33948 DPT=9100 SEQ=95119183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8736570000000001030307) 
Oct 14 09:32:34 np0005486759.ooo.test sudo[181712]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnirljhkacdpofoyabpuumcisxjabuii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434354.141719-333-159938347484847/AnsiballZ_stat.py
Oct 14 09:32:34 np0005486759.ooo.test sudo[181712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:34 np0005486759.ooo.test python3.9[181714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:34 np0005486759.ooo.test sudo[181712]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:34 np0005486759.ooo.test sudo[181785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tziriljtfpzsnoczbuoqwcbcysvcbjcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434354.141719-333-159938347484847/AnsiballZ_copy.py
Oct 14 09:32:34 np0005486759.ooo.test sudo[181785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:35 np0005486759.ooo.test python3.9[181787]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434354.141719-333-159938347484847/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:35 np0005486759.ooo.test sudo[181785]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44732 DF PROTO=TCP SPT=33948 DPT=9100 SEQ=95119183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F873A410000000001030307) 
Oct 14 09:32:36 np0005486759.ooo.test sudo[181877]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mawuraayogmuweyosfbblwatcqynfdfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434356.1085262-350-58481637474252/AnsiballZ_file.py
Oct 14 09:32:36 np0005486759.ooo.test sudo[181877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:36 np0005486759.ooo.test python3.9[181879]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:32:36 np0005486759.ooo.test sudo[181877]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:37 np0005486759.ooo.test sudo[181969]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzramgucmncxcywvvjmqzqmsmdtqrjps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434356.8271909-358-22470911001455/AnsiballZ_stat.py
Oct 14 09:32:37 np0005486759.ooo.test sudo[181969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:37 np0005486759.ooo.test python3.9[181971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:32:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44733 DF PROTO=TCP SPT=33948 DPT=9100 SEQ=95119183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8742410000000001030307) 
Oct 14 09:32:37 np0005486759.ooo.test sudo[181969]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:37 np0005486759.ooo.test sudo[182044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djkpaeibgruoijtrekglddxbxmeyprsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434356.8271909-358-22470911001455/AnsiballZ_copy.py
Oct 14 09:32:37 np0005486759.ooo.test sudo[182044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:37 np0005486759.ooo.test python3.9[182046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434356.8271909-358-22470911001455/.source.json _original_basename=.qnhww_my follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:37 np0005486759.ooo.test sudo[182044]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:38 np0005486759.ooo.test sudo[182136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibovpaszvxianpbbtmpqqywftkgtjqto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434357.9857569-373-26371473009869/AnsiballZ_file.py
Oct 14 09:32:38 np0005486759.ooo.test sudo[182136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:38 np0005486759.ooo.test python3.9[182138]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:38 np0005486759.ooo.test sudo[182136]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:39 np0005486759.ooo.test sudo[182228]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfgegveyzsxfixvnpkujrompybrrxywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434358.794512-381-20030873719662/AnsiballZ_stat.py
Oct 14 09:32:39 np0005486759.ooo.test sudo[182228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:39 np0005486759.ooo.test sudo[182228]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:39 np0005486759.ooo.test sudo[182301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asxaicycitejzvzueafakcfhsucvxqpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434358.794512-381-20030873719662/AnsiballZ_copy.py
Oct 14 09:32:39 np0005486759.ooo.test sudo[182301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:39 np0005486759.ooo.test sudo[182301]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:40 np0005486759.ooo.test sudo[182393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obuhghwdzesxdhzpbmpaohlklyztubqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434360.231619-398-233780689852598/AnsiballZ_container_config_data.py
Oct 14 09:32:40 np0005486759.ooo.test sudo[182393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:40 np0005486759.ooo.test python3.9[182395]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 14 09:32:40 np0005486759.ooo.test sudo[182393]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44734 DF PROTO=TCP SPT=33948 DPT=9100 SEQ=95119183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8752020000000001030307) 
Oct 14 09:32:41 np0005486759.ooo.test sudo[182485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwdrwcsauwjjvmhzwdpbzqqjwhdhkwra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434361.130899-407-185808596302590/AnsiballZ_container_config_hash.py
Oct 14 09:32:41 np0005486759.ooo.test sudo[182485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:41 np0005486759.ooo.test python3.9[182487]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:32:41 np0005486759.ooo.test sudo[182485]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8350 DF PROTO=TCP SPT=59800 DPT=9882 SEQ=3718615013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8755A80000000001030307) 
Oct 14 09:32:42 np0005486759.ooo.test sudo[182577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpsltebjtuusdihnnxhdjyslrpuvqknb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434362.0697415-416-198563708508267/AnsiballZ_podman_container_info.py
Oct 14 09:32:42 np0005486759.ooo.test sudo[182577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:42 np0005486759.ooo.test python3.9[182579]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:32:43 np0005486759.ooo.test sudo[182577]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8351 DF PROTO=TCP SPT=59800 DPT=9882 SEQ=3718615013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8759C10000000001030307) 
Oct 14 09:32:45 np0005486759.ooo.test sudo[182695]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqqjtdiupvichpxpzxzglpibomzowvow ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434364.7317355-429-163173910784241/AnsiballZ_edpm_container_manage.py
Oct 14 09:32:45 np0005486759.ooo.test sudo[182695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:45 np0005486759.ooo.test python3[182697]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:32:45 np0005486759.ooo.test python3[182697]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "c6d1b3e4cccd28b7c818995b8e8c01f80bc6d31844f018079ac974a1bc7ff587",
                                                                 "Digest": "sha256:cc78c4a7fbd7c7348d3ee41420dd7c42d83eb1e76a8db6bb94a538a5d2f2c424",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:cc78c4a7fbd7c7348d3ee41420dd7c42d83eb1e76a8db6bb94a538a5d2f2c424"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-13T12:47:50.032440747Z",
                                                                 "Config": {
                                                                      "User": "neutron",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 783982852,
                                                                 "VirtualSize": 783982852,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41/diff:/var/lib/containers/storage/overlay/a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",
                                                                           "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",
                                                                           "sha256:921303cda5c9d8779e6603d3888ac24385c443b872bec9c3138835df3416e3df",
                                                                           "sha256:c059b89efb40f3097e4f1e24153e4ed15b8a660accccb7f6b341c8900767b90e",
                                                                           "sha256:e4b986e48b4f8d2e3d4ecc6d2e17b8ac252dfafd4e4fec6074bd29e67b374a2f"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "neutron",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843286399Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843354051Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843394192Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843417133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843442193Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843461914Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:43.236856724Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:17.539596691Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.007092512Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.334560883Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.713915587Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.426474494Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.742526819Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.072068096Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.376327744Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.639696917Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.946940986Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.329166855Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.709072452Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.066214819Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.407947122Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.744473297Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.044338828Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.376253048Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:29.890793292Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.186632274Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.418527973Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:31.913162322Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817436155Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817485046Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817496507Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817505987Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:34.821748777Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:00.340362183Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:40.80916313Z",
                                                                           "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:43.984050021Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:02.624564487Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:02.975848346Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:17.307835722Z",
                                                                           "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:34.859068882Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:43.973792387Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:55.721936346Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:45:47.824671415Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:01.756087342Z",
                                                                           "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:05.881553206Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:09.471669299Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:50.030005091Z",
                                                                           "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:50.030148305Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:53.340205795Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 09:32:45 np0005486759.ooo.test podman[182743]: 2025-10-14 09:32:45.941877302 +0000 UTC m=+0.109757957 container remove 46106dc856940cd65f1d3770d5b9a62b62508f82037048dd79b30b7135059307 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20250721.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhosp osp openstack osp-17.1, managed_by=tripleo_ansible, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-07-21T16:28:53, distribution-scope=public, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '3fc36489e0095da197228558d2f007a2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Oct 14 09:32:45 np0005486759.ooo.test python3[182697]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Oct 14 09:32:46 np0005486759.ooo.test podman[182757]: 
Oct 14 09:32:46 np0005486759.ooo.test podman[182757]: 2025-10-14 09:32:46.036875555 +0000 UTC m=+0.083433298 container create d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 09:32:46 np0005486759.ooo.test podman[182757]: 2025-10-14 09:32:45.981731128 +0000 UTC m=+0.028288861 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 09:32:46 np0005486759.ooo.test python3[182697]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume 
/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 09:32:46 np0005486759.ooo.test sudo[182695]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:46 np0005486759.ooo.test sudo[182883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epeehlxzisazvskxspraahasixwoecij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434366.3827267-437-87365632597178/AnsiballZ_stat.py
Oct 14 09:32:46 np0005486759.ooo.test sudo[182883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:46 np0005486759.ooo.test python3.9[182885]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:32:46 np0005486759.ooo.test sudo[182883]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49048 DF PROTO=TCP SPT=47042 DPT=9105 SEQ=1983680305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8768810000000001030307) 
Oct 14 09:32:48 np0005486759.ooo.test sudo[182977]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnhlgcrtbrapsahhunlcubnqighwjtos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434367.8425496-446-260447798166353/AnsiballZ_file.py
Oct 14 09:32:48 np0005486759.ooo.test sudo[182977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:48 np0005486759.ooo.test python3.9[182979]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:48 np0005486759.ooo.test sudo[182977]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:48 np0005486759.ooo.test sudo[183023]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voxdgfezlcggmyxyffqdcwnevzpgalis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434367.8425496-446-260447798166353/AnsiballZ_stat.py
Oct 14 09:32:48 np0005486759.ooo.test sudo[183023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:48 np0005486759.ooo.test python3.9[183025]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:32:48 np0005486759.ooo.test sudo[183023]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:32:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8353 DF PROTO=TCP SPT=59800 DPT=9882 SEQ=3718615013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8771820000000001030307) 
Oct 14 09:32:49 np0005486759.ooo.test systemd[1]: tmp-crun.I4Q2nt.mount: Deactivated successfully.
Oct 14 09:32:49 np0005486759.ooo.test podman[183071]: 2025-10-14 09:32:49.462600764 +0000 UTC m=+0.086828732 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 09:32:49 np0005486759.ooo.test podman[183071]: 2025-10-14 09:32:49.531858195 +0000 UTC m=+0.156086113 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:32:49 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:32:50 np0005486759.ooo.test sudo[183139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nilpnvjewsvwhktuyruvpzfgidshhjbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434368.8539505-446-39811797404162/AnsiballZ_copy.py
Oct 14 09:32:50 np0005486759.ooo.test sudo[183139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:50 np0005486759.ooo.test python3.9[183141]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434368.8539505-446-39811797404162/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:32:50 np0005486759.ooo.test sudo[183139]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:50 np0005486759.ooo.test sudo[183185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxtpuptvovpvvkkbulyhsyqzxdjbgzcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434368.8539505-446-39811797404162/AnsiballZ_systemd.py
Oct 14 09:32:50 np0005486759.ooo.test sudo[183185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:50 np0005486759.ooo.test python3.9[183187]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:32:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:32:51 np0005486759.ooo.test systemd-sysv-generator[183218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:32:51 np0005486759.ooo.test systemd-rc-local-generator[183213]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:32:51 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:32:51 np0005486759.ooo.test sudo[183185]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:51 np0005486759.ooo.test sudo[183267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbhbbkgbcuddzfjgqblsgkyugjrobjez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434368.8539505-446-39811797404162/AnsiballZ_systemd.py
Oct 14 09:32:51 np0005486759.ooo.test sudo[183267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:32:51 np0005486759.ooo.test python3.9[183269]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:32:51 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:32:52 np0005486759.ooo.test systemd-sysv-generator[183299]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:32:52 np0005486759.ooo.test systemd-rc-local-generator[183294]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: Starting ovn_metadata_agent container...
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:32:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5313e13dbe8fa0d121c6505d8092eee3ddc965ebfd0d18fa117e4df3e4d93ae9/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 14 09:32:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5313e13dbe8fa0d121c6505d8092eee3ddc965ebfd0d18fa117e4df3e4d93ae9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:32:52 np0005486759.ooo.test podman[183310]: 2025-10-14 09:32:52.382347109 +0000 UTC m=+0.144100923 container init d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + sudo -E kolla_set_configs
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:32:52 np0005486759.ooo.test podman[183310]: 2025-10-14 09:32:52.416104998 +0000 UTC m=+0.177858812 container start d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 14 09:32:52 np0005486759.ooo.test edpm-start-podman-container[183310]: ovn_metadata_agent
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Validating config file
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Copying service configuration files
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Writing out command to execute
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: ++ cat /run_command
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + CMD=neutron-ovn-metadata-agent
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + ARGS=
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + sudo kolla_copy_cacerts
Oct 14 09:32:52 np0005486759.ooo.test podman[183330]: 2025-10-14 09:32:52.477917099 +0000 UTC m=+0.058223712 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + [[ ! -n '' ]]
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + . kolla_extend_start
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: Running command: 'neutron-ovn-metadata-agent'
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + umask 0022
Oct 14 09:32:52 np0005486759.ooo.test ovn_metadata_agent[183323]: + exec neutron-ovn-metadata-agent
Oct 14 09:32:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22773 DF PROTO=TCP SPT=42656 DPT=9102 SEQ=327093946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F877DC10000000001030307) 
Oct 14 09:32:52 np0005486759.ooo.test podman[183330]: 2025-10-14 09:32:52.561663855 +0000 UTC m=+0.141970508 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:32:52 np0005486759.ooo.test edpm-start-podman-container[183309]: Creating additional drop-in dependency for "ovn_metadata_agent" (d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c)
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:32:52 np0005486759.ooo.test systemd-rc-local-generator[183394]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:32:52 np0005486759.ooo.test systemd-sysv-generator[183397]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:32:52 np0005486759.ooo.test systemd[1]: Started ovn_metadata_agent container.
Oct 14 09:32:52 np0005486759.ooo.test sudo[183267]: pam_unix(sudo:session): session closed for user root
Oct 14 09:32:53 np0005486759.ooo.test sshd[178168]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:32:53 np0005486759.ooo.test systemd[1]: session-34.scope: Deactivated successfully.
Oct 14 09:32:53 np0005486759.ooo.test systemd[1]: session-34.scope: Consumed 31.545s CPU time.
Oct 14 09:32:53 np0005486759.ooo.test systemd-logind[759]: Session 34 logged out. Waiting for processes to exit.
Oct 14 09:32:53 np0005486759.ooo.test systemd-logind[759]: Removed session 34.
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.091 183328 INFO neutron.common.config [-] Logging enabled!
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.092 183328 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.092 183328 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.092 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.092 183328 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.092 183328 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.093 183328 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.094 183328 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.095 183328 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.096 183328 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.096 183328 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.096 183328 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.096 183328 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.096 183328 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.096 183328 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.096 183328 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.097 183328 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.098 183328 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.099 183328 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.100 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.101 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.101 183328 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.101 183328 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.101 183328 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.101 183328 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.101 183328 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.101 183328 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.102 183328 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.103 183328 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.104 183328 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.104 183328 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.104 183328 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.104 183328 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.104 183328 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.104 183328 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.105 183328 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.105 183328 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.105 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.105 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.105 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.105 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.105 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.106 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.106 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.106 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.106 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.106 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.106 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.107 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.108 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.108 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.108 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.108 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.108 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.108 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.108 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.109 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.109 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.109 183328 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.109 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.109 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.109 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.109 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.110 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.111 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.112 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.113 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.114 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.115 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.116 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.117 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.118 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.119 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.120 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.120 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.120 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.120 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.120 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.120 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.120 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.121 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.122 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.123 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.123 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.123 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.123 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.123 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.123 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.124 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.124 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.124 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.124 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.124 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.124 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.124 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.125 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.125 183328 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.125 183328 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.125 183328 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.125 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.125 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.126 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.126 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.126 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.126 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.126 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.126 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.126 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.127 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.128 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.128 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.128 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.128 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.128 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.128 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.128 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.129 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.130 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.130 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.130 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.130 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.130 183328 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.130 183328 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.140 183328 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.140 183328 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.140 183328 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.141 183328 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.141 183328 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.158 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 93d451ec-9a31-4880-9638-030ff3f86e88 (UUID: 93d451ec-9a31-4880-9638-030ff3f86e88) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.185 183328 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.186 183328 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.186 183328 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.186 183328 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.188 183328 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.190 183328 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.199 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:cf:16 192.168.0.173'], port_security=['fa:16:3e:8e:cf:16 192.168.0.173'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.173/24', 'neutron:device_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005486759.ooo.test', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9197abc5-07db-4abf-9578-9360b49aea49', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'faabc66a-aada-4f6b-bec3-989808c74b8e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62cfbaba-fb96-4812-8b41-6ad8964122a3, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=eee08de8-f983-4ebe-a654-f67f48659e50) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.200 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '93d451ec-9a31-4880-9638-030ff3f86e88'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], external_ids={'neutron:ovn-metadata-id': 'e5117c5a-8055-5a7f-96ac-165928de2a73', 'neutron:ovn-metadata-sb-cfg': '3'}, name=93d451ec-9a31-4880-9638-030ff3f86e88, nb_cfg_timestamp=1760434318488, nb_cfg=5) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.200 183328 INFO neutron.agent.ovn.metadata.agent [-] Port eee08de8-f983-4ebe-a654-f67f48659e50 in datapath 9197abc5-07db-4abf-9578-9360b49aea49 bound to our chassis on insert
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.201 183328 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f7754a7fa00>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.202 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.202 183328 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.202 183328 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.203 183328 INFO oslo_service.service [-] Starting 1 workers
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.206 183328 DEBUG oslo_service.service [-] Started child 183428 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.208 183428 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-432310'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.208 183328 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.210 183328 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpfjtgrwp7/privsep.sock']
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.226 183428 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.227 183428 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.227 183428 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.229 183428 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.230 183428 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.238 183428 INFO eventlet.wsgi.server [-] (183428) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.795 183328 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.796 183328 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfjtgrwp7/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.684 183433 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.687 183433 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.690 183433 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.690 183433 INFO oslo.privsep.daemon [-] privsep daemon running as pid 183433
Oct 14 09:32:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:54.800 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[579ef5ea-6bf4-4852-b132-7da5ac75123f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:55.235 183433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:32:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:55.236 183433 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:32:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:55.236 183433 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:32:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:55.747 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[ae97e0d0-3913-4bc2-a0a5-b54f1149bbc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:55.748 183328 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpl3qvay34/privsep.sock']
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.367 183328 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.368 183328 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpl3qvay34/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.234 183444 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.239 183444 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.243 183444 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.243 183444 INFO oslo.privsep.daemon [-] privsep daemon running as pid 183444
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.372 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[e31c931e-8be3-474a-bf30-aa4387a8bdea]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22774 DF PROTO=TCP SPT=42656 DPT=9102 SEQ=327093946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F878D810000000001030307) 
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.812 183444 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.812 183444 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:32:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:56.812 183444 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.305 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[799e6667-cf21-4b42-b99c-e6e611fdf40b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.307 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[f5067850-152f-492d-9b16-a5af0976fe7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.324 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[207a9b47-e843-4af3-9b00-249bf1f33f4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.340 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[f61d1ed9-46ca-42c2-b107-fa3ee031aafa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9197abc5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d6:b0:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 83, 'rx_bytes': 8926, 'tx_bytes': 8133, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 83, 'rx_bytes': 8926, 'tx_bytes': 8133, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754103, 'reachable_time': 19104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 183454, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.354 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[88a024b8-1734-4c9e-bb32-6c6a7698d420]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap9197abc5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 754112, 'tstamp': 754112}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 183455, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9197abc5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 754114, 'tstamp': 754114}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 183455, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 754110, 'tstamp': 754110}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 183455, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:b0ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 754103, 'tstamp': 754103}], 
['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 183455, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.402 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[8b891cbd-61f3-482f-a710-e69cd30d57a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.403 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9197abc5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.407 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9197abc5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.407 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.408 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9197abc5-00, col_values=(('external_ids', {'iface-id': '25844137-067c-4137-b11d-9fc6e75f59fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.408 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:32:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.412 183328 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpu865m0wg/privsep.sock']
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.063 183328 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.064 183328 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpu865m0wg/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.922 183464 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.929 183464 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.932 183464 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:57.932 183464 INFO oslo.privsep.daemon [-] privsep daemon running as pid 183464
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.067 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[1e5860ce-d01b-4085-b43e-cd562fed7e16]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.496 183464 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.496 183464 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.496 183464 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:32:58 np0005486759.ooo.test sshd[183469]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:32:58 np0005486759.ooo.test sshd[183469]: Accepted publickey for zuul from 192.168.122.31 port 42492 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:32:58 np0005486759.ooo.test systemd-logind[759]: New session 35 of user zuul.
Oct 14 09:32:58 np0005486759.ooo.test systemd[1]: Started Session 35 of User zuul.
Oct 14 09:32:58 np0005486759.ooo.test sshd[183469]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.947 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[15cee2a4-aef4-47cc-a940-289e1c3b352c]: (4, ['ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.951 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, column=external_ids, values=({'neutron:ovn-metadata-id': 'e5117c5a-8055-5a7f-96ac-165928de2a73'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.952 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.953 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.967 183328 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.968 183328 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.968 183328 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.968 183328 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.968 183328 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.968 183328 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.969 183328 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.969 183328 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.969 183328 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.969 183328 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.970 183328 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.970 183328 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.970 183328 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.970 183328 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.971 183328 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.971 183328 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.971 183328 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.972 183328 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.972 183328 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.972 183328 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.972 183328 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.973 183328 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.973 183328 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.973 183328 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.973 183328 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.974 183328 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.974 183328 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.974 183328 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.975 183328 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.975 183328 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.975 183328 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.975 183328 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.976 183328 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.976 183328 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.976 183328 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.976 183328 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.977 183328 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.977 183328 DEBUG oslo_service.service [-] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.977 183328 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.978 183328 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.978 183328 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.978 183328 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.978 183328 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.979 183328 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.979 183328 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.979 183328 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.980 183328 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.980 183328 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.980 183328 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.980 183328 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.980 183328 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.980 183328 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.981 183328 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.982 183328 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.983 183328 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.984 183328 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.985 183328 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.986 183328 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.987 183328 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.988 183328 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.988 183328 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.988 183328 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.988 183328 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.988 183328 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.988 183328 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.988 183328 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.989 183328 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.990 183328 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.991 183328 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.992 183328 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.993 183328 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.994 183328 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.995 183328 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.996 183328 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.997 183328 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.998 183328 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:58.999 183328 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.000 183328 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.000 183328 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.000 183328 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.000 183328 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.000 183328 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.000 183328 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.000 183328 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.001 183328 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.002 183328 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.003 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.004 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.005 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.006 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.007 183328 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.007 183328 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.007 183328 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.007 183328 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.007 183328 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:32:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:32:59.007 183328 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:32:59 np0005486759.ooo.test python3.9[183562]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:33:01 np0005486759.ooo.test sudo[183656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngogrelxxmwthqqljkfkrgqmctgjgavb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434381.1758776-34-222841164777317/AnsiballZ_command.py
Oct 14 09:33:01 np0005486759.ooo.test sudo[183656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:01 np0005486759.ooo.test python3.9[183658]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:01 np0005486759.ooo.test sudo[183656]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:02 np0005486759.ooo.test sudo[183761]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewyfscyrtpqsyhkmmynycdjmnojuprit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434382.031082-42-232349721947922/AnsiballZ_command.py
Oct 14 09:33:02 np0005486759.ooo.test sudo[183761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:02 np0005486759.ooo.test python3.9[183763]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:02 np0005486759.ooo.test systemd[1]: libpod-405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03.scope: Deactivated successfully.
Oct 14 09:33:02 np0005486759.ooo.test podman[183764]: 2025-10-14 09:33:02.539299639 +0000 UTC m=+0.057876251 container died 405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, version=17.1.9, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Oct 14 09:33:02 np0005486759.ooo.test systemd[1]: tmp-crun.qgSDvX.mount: Deactivated successfully.
Oct 14 09:33:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03-userdata-shm.mount: Deactivated successfully.
Oct 14 09:33:02 np0005486759.ooo.test podman[183764]: 2025-10-14 09:33:02.579535056 +0000 UTC m=+0.098111668 container cleanup 405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, batch=17.1_20250721.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, vendor=Red Hat, Inc., build-date=2025-07-21T14:56:59, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1, io.buildah.version=1.33.12, release=2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.9, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/agreements)
Oct 14 09:33:02 np0005486759.ooo.test sudo[183761]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:02 np0005486759.ooo.test podman[183778]: 2025-10-14 09:33:02.651559448 +0000 UTC m=+0.103960323 container remove 405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, tcib_managed=true, vcs-ref=809f31d3cd93a9e04341110fb85686656c754dc0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1, version=17.1.9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-libvirt/images/17.1.9-2, vcs-type=git, release=2, distribution-scope=public, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, build-date=2025-07-21T14:56:59, io.buildah.version=1.33.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container)
Oct 14 09:33:02 np0005486759.ooo.test systemd[1]: libpod-conmon-405407de268d11ba080566a26cc3ae8d0394c26fa774a539bb024680f4edbf03.scope: Deactivated successfully.
Oct 14 09:33:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ca4da532e9e4321ff6fa78b70e2d8e6834968b1021a5a882c80b2624d34637c6-merged.mount: Deactivated successfully.
Oct 14 09:33:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54255 DF PROTO=TCP SPT=56748 DPT=9100 SEQ=3152084009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87AB870000000001030307) 
Oct 14 09:33:04 np0005486759.ooo.test sudo[183882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsvddbhiurvozisqxwpxzhpnfwduabqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434383.718284-52-205126928899804/AnsiballZ_systemd_service.py
Oct 14 09:33:04 np0005486759.ooo.test sudo[183882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:04 np0005486759.ooo.test python3.9[183884]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:33:04 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:33:04 np0005486759.ooo.test systemd-rc-local-generator[183908]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:33:04 np0005486759.ooo.test systemd-sysv-generator[183915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:33:04 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:33:04 np0005486759.ooo.test sudo[183882]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54256 DF PROTO=TCP SPT=56748 DPT=9100 SEQ=3152084009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87AF810000000001030307) 
Oct 14 09:33:05 np0005486759.ooo.test python3.9[184010]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:33:05 np0005486759.ooo.test network[184027]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:33:05 np0005486759.ooo.test network[184028]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:33:05 np0005486759.ooo.test network[184029]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:33:06 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:33:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54257 DF PROTO=TCP SPT=56748 DPT=9100 SEQ=3152084009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87B7820000000001030307) 
Oct 14 09:33:09 np0005486759.ooo.test sudo[184228]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-soojhahhpjicrwxwnfwrhgkketpucwpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434389.2720628-71-112173174291664/AnsiballZ_systemd_service.py
Oct 14 09:33:09 np0005486759.ooo.test sudo[184228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:09 np0005486759.ooo.test python3.9[184230]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:33:10 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:33:11 np0005486759.ooo.test systemd-sysv-generator[184263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:33:11 np0005486759.ooo.test systemd-rc-local-generator[184260]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:33:11 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:33:11 np0005486759.ooo.test systemd[1]: Stopped target tripleo_nova_libvirt.target.
Oct 14 09:33:11 np0005486759.ooo.test sudo[184228]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54258 DF PROTO=TCP SPT=56748 DPT=9100 SEQ=3152084009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87C7410000000001030307) 
Oct 14 09:33:11 np0005486759.ooo.test sudo[184360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krhfbbyzpmpfydtnppyjbietkjahepny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434391.3654597-71-76895956561498/AnsiballZ_systemd_service.py
Oct 14 09:33:11 np0005486759.ooo.test sudo[184360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:11 np0005486759.ooo.test python3.9[184362]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:33:12 np0005486759.ooo.test sudo[184360]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59969 DF PROTO=TCP SPT=44258 DPT=9882 SEQ=4022424985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87CAD80000000001030307) 
Oct 14 09:33:12 np0005486759.ooo.test sudo[184453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rypjmrtvgcgqyscreozmijyectkeedep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434392.1242745-71-203963855812347/AnsiballZ_systemd_service.py
Oct 14 09:33:12 np0005486759.ooo.test sudo[184453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:12 np0005486759.ooo.test python3.9[184455]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:33:12 np0005486759.ooo.test sudo[184453]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:13 np0005486759.ooo.test sudo[184546]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wutfwemhxtecmjvzsvovuxkkyvuhpuke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434392.8673081-71-241143584148737/AnsiballZ_systemd_service.py
Oct 14 09:33:13 np0005486759.ooo.test sudo[184546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59970 DF PROTO=TCP SPT=44258 DPT=9882 SEQ=4022424985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87CEC10000000001030307) 
Oct 14 09:33:13 np0005486759.ooo.test python3.9[184548]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:33:13 np0005486759.ooo.test sudo[184546]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:13 np0005486759.ooo.test sudo[184639]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rijfncqcwhrujaxwrcvcnodleudnylif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434393.6055505-71-11935435966943/AnsiballZ_systemd_service.py
Oct 14 09:33:13 np0005486759.ooo.test sudo[184639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:14 np0005486759.ooo.test python3.9[184641]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:33:14 np0005486759.ooo.test sudo[184639]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:14 np0005486759.ooo.test sudo[184732]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlpbumqfzbgkpiplkpzuvjpwihxmzrez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434394.2994823-71-191006696174091/AnsiballZ_systemd_service.py
Oct 14 09:33:14 np0005486759.ooo.test sudo[184732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:14 np0005486759.ooo.test python3.9[184734]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:33:14 np0005486759.ooo.test sudo[184732]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:15 np0005486759.ooo.test sudo[184825]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzdanbdtancrvvolwdjnleexrebometb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434395.6313818-71-6890209400949/AnsiballZ_systemd_service.py
Oct 14 09:33:15 np0005486759.ooo.test sudo[184825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:16 np0005486759.ooo.test python3.9[184827]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:33:16 np0005486759.ooo.test sudo[184825]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:17 np0005486759.ooo.test sudo[184918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mogmhocetrcemvjyfhieqyzanvkxltsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434396.5320332-123-157553935134119/AnsiballZ_file.py
Oct 14 09:33:17 np0005486759.ooo.test sudo[184918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13513 DF PROTO=TCP SPT=49682 DPT=9105 SEQ=3464154220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87DD810000000001030307) 
Oct 14 09:33:17 np0005486759.ooo.test python3.9[184920]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:17 np0005486759.ooo.test sudo[184918]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:17 np0005486759.ooo.test sudo[185010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yokudpmosilmyitpqhcyzagcyricwvox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434397.3714278-123-259328191661624/AnsiballZ_file.py
Oct 14 09:33:17 np0005486759.ooo.test sudo[185010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:17 np0005486759.ooo.test python3.9[185012]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:17 np0005486759.ooo.test sudo[185010]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:18 np0005486759.ooo.test sudo[185102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfuptmecwvepsvriuazejgllmbsbhhrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434398.5111675-123-193460319057847/AnsiballZ_file.py
Oct 14 09:33:18 np0005486759.ooo.test sudo[185102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:18 np0005486759.ooo.test python3.9[185104]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:18 np0005486759.ooo.test sudo[185102]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59972 DF PROTO=TCP SPT=44258 DPT=9882 SEQ=4022424985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87E6810000000001030307) 
Oct 14 09:33:19 np0005486759.ooo.test sudo[185194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnetccpirlwcyggmssqhwwktrnytwraj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434399.1112096-123-223373063331856/AnsiballZ_file.py
Oct 14 09:33:19 np0005486759.ooo.test sudo[185194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:19 np0005486759.ooo.test python3.9[185196]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:19 np0005486759.ooo.test sudo[185194]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:19 np0005486759.ooo.test sudo[185286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exauzndhvekrrsuuwuyslbocjfyckalr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434399.7073445-123-252468040208531/AnsiballZ_file.py
Oct 14 09:33:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:33:19 np0005486759.ooo.test sudo[185286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:20 np0005486759.ooo.test podman[185288]: 2025-10-14 09:33:20.053394561 +0000 UTC m=+0.053932585 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:33:20 np0005486759.ooo.test podman[185288]: 2025-10-14 09:33:20.128629365 +0000 UTC m=+0.129167359 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:33:20 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:33:20 np0005486759.ooo.test python3.9[185289]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:20 np0005486759.ooo.test sudo[185286]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:20 np0005486759.ooo.test sudo[185403]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehuqkhzpfdwlwmynewuseygfjfjktzih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434400.3805463-123-74901437945539/AnsiballZ_file.py
Oct 14 09:33:20 np0005486759.ooo.test sudo[185403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:20 np0005486759.ooo.test python3.9[185405]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:20 np0005486759.ooo.test sudo[185403]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:21 np0005486759.ooo.test sudo[185495]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twmllfsjfmbxybivytdiidsdonqluidf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434401.015532-123-171172392332485/AnsiballZ_file.py
Oct 14 09:33:21 np0005486759.ooo.test sudo[185495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:21 np0005486759.ooo.test python3.9[185497]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:21 np0005486759.ooo.test sudo[185495]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:21 np0005486759.ooo.test sudo[185587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zowupprbplltadspnrpnmajofmcjhtrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434401.681526-173-156321977952572/AnsiballZ_file.py
Oct 14 09:33:21 np0005486759.ooo.test sudo[185587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:22 np0005486759.ooo.test python3.9[185589]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:22 np0005486759.ooo.test sudo[185587]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:22 np0005486759.ooo.test sudo[185679]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cblklursiliohtixtsklvbewgkhmjhgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434402.2573912-173-106709869509552/AnsiballZ_file.py
Oct 14 09:33:22 np0005486759.ooo.test sudo[185679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21660 DF PROTO=TCP SPT=45044 DPT=9102 SEQ=3289787114 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F87F3010000000001030307) 
Oct 14 09:33:22 np0005486759.ooo.test python3.9[185681]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:22 np0005486759.ooo.test sudo[185679]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:23 np0005486759.ooo.test sudo[185771]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlwyvsezskitxtignsxpdrbgtgymhjst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434402.8733962-173-11788203595719/AnsiballZ_file.py
Oct 14 09:33:23 np0005486759.ooo.test sudo[185771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:33:23 np0005486759.ooo.test podman[185773]: 2025-10-14 09:33:23.235829006 +0000 UTC m=+0.059742870 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:33:23 np0005486759.ooo.test podman[185773]: 2025-10-14 09:33:23.266120015 +0000 UTC m=+0.090033919 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 14 09:33:23 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:33:23 np0005486759.ooo.test python3.9[185774]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:23 np0005486759.ooo.test sudo[185771]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:23 np0005486759.ooo.test sudo[185881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmqsbgpmbxwfhdzoghkwotjpiulawlpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434403.504241-173-230800777762132/AnsiballZ_file.py
Oct 14 09:33:23 np0005486759.ooo.test sudo[185881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:24 np0005486759.ooo.test python3.9[185883]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:24 np0005486759.ooo.test sudo[185881]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:24 np0005486759.ooo.test sudo[185973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjyqphepssntwejgkwfpiiovxoqxvxcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434404.1641698-173-109732587052588/AnsiballZ_file.py
Oct 14 09:33:24 np0005486759.ooo.test sudo[185973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:24 np0005486759.ooo.test python3.9[185975]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:24 np0005486759.ooo.test sudo[185973]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:24 np0005486759.ooo.test sudo[186065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkifiazkukvxgmrtqhslhfcrzrtdfyiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434404.716998-173-98395160430281/AnsiballZ_file.py
Oct 14 09:33:24 np0005486759.ooo.test sudo[186065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:25 np0005486759.ooo.test python3.9[186067]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:25 np0005486759.ooo.test sudo[186065]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:25 np0005486759.ooo.test sudo[186157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eaurmvmahpjpevpjjdiyckvfrzokzlwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434405.2930515-173-107575351400735/AnsiballZ_file.py
Oct 14 09:33:25 np0005486759.ooo.test sudo[186157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:25 np0005486759.ooo.test python3.9[186159]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:33:25 np0005486759.ooo.test sudo[186157]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:26 np0005486759.ooo.test sudo[186249]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljtbihjrchpkreinfhxfulnfsbxrhfln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434406.0088952-224-161488542709008/AnsiballZ_command.py
Oct 14 09:33:26 np0005486759.ooo.test sudo[186249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:26 np0005486759.ooo.test python3.9[186251]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                           systemctl disable --now certmonger.service
                                                           test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                         fi
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:26 np0005486759.ooo.test sudo[186249]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21661 DF PROTO=TCP SPT=45044 DPT=9102 SEQ=3289787114 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8802C10000000001030307) 
Oct 14 09:33:27 np0005486759.ooo.test python3.9[186343]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:33:27 np0005486759.ooo.test sudo[186433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqdeilwoezlmuxgqiawrbipuspccudjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434407.612347-242-136208882412125/AnsiballZ_systemd_service.py
Oct 14 09:33:27 np0005486759.ooo.test sudo[186433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:28 np0005486759.ooo.test python3.9[186435]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:33:28 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:33:28 np0005486759.ooo.test systemd-rc-local-generator[186463]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:33:28 np0005486759.ooo.test systemd-sysv-generator[186466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:33:28 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:33:28 np0005486759.ooo.test sudo[186433]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:29 np0005486759.ooo.test sudo[186561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdrmvzsoyibieznhnlxmchtfpejsdmjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434409.4200392-250-216924637999433/AnsiballZ_command.py
Oct 14 09:33:29 np0005486759.ooo.test sudo[186561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:29 np0005486759.ooo.test python3.9[186563]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:29 np0005486759.ooo.test sudo[186561]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:30 np0005486759.ooo.test sudo[186654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxavuoxghiosfprhvljyknhhxwmticgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434409.9658678-250-210979936115374/AnsiballZ_command.py
Oct 14 09:33:30 np0005486759.ooo.test sudo[186654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:30 np0005486759.ooo.test python3.9[186656]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:30 np0005486759.ooo.test sudo[186654]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:31 np0005486759.ooo.test sudo[186747]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwxilzfejruaowuednowiafhvzhrnwsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434411.3580754-250-67374972786533/AnsiballZ_command.py
Oct 14 09:33:31 np0005486759.ooo.test sudo[186747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:31 np0005486759.ooo.test python3.9[186749]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:31 np0005486759.ooo.test sudo[186747]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:32 np0005486759.ooo.test sudo[186840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usjuhufgawmgepjltxtwherxxggkepqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434411.9750855-250-160398372942128/AnsiballZ_command.py
Oct 14 09:33:32 np0005486759.ooo.test sudo[186840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:32 np0005486759.ooo.test python3.9[186842]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:32 np0005486759.ooo.test sudo[186840]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:32 np0005486759.ooo.test sudo[186933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjtfwftbpakvcqjjzycgjtlvrjpdlyow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434412.5768962-250-258694119648668/AnsiballZ_command.py
Oct 14 09:33:32 np0005486759.ooo.test sudo[186933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:33 np0005486759.ooo.test python3.9[186935]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:33 np0005486759.ooo.test sudo[186933]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:33 np0005486759.ooo.test sudo[187026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twvsosyvchiyxyxszfubbtjwdxkfbwxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434413.2816465-250-56618879411325/AnsiballZ_command.py
Oct 14 09:33:33 np0005486759.ooo.test sudo[187026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:33 np0005486759.ooo.test python3.9[187028]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:33 np0005486759.ooo.test sudo[187026]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:34 np0005486759.ooo.test sudo[187119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfannwihexlumiqehypkymgaidmsgdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434413.8862705-250-217745887336721/AnsiballZ_command.py
Oct 14 09:33:34 np0005486759.ooo.test sudo[187119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16117 DF PROTO=TCP SPT=44680 DPT=9100 SEQ=4134968808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8820B70000000001030307) 
Oct 14 09:33:34 np0005486759.ooo.test python3.9[187121]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:33:34 np0005486759.ooo.test sudo[187119]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:35 np0005486759.ooo.test sudo[187212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmhcdqhquxrhhkooklvotyqubhdepijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434414.7789211-304-184478317377260/AnsiballZ_getent.py
Oct 14 09:33:35 np0005486759.ooo.test sudo[187212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16118 DF PROTO=TCP SPT=44680 DPT=9100 SEQ=4134968808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8824C10000000001030307) 
Oct 14 09:33:35 np0005486759.ooo.test python3.9[187214]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 14 09:33:35 np0005486759.ooo.test sudo[187212]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:36 np0005486759.ooo.test sudo[187305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jttogwlsflhattiohikvcjuxbrfabgho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434415.6784549-312-32554674442631/AnsiballZ_group.py
Oct 14 09:33:36 np0005486759.ooo.test sudo[187305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:36 np0005486759.ooo.test python3.9[187307]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 14 09:33:36 np0005486759.ooo.test groupadd[187308]: group added to /etc/group: name=libvirt, GID=42473
Oct 14 09:33:36 np0005486759.ooo.test groupadd[187308]: group added to /etc/gshadow: name=libvirt
Oct 14 09:33:36 np0005486759.ooo.test groupadd[187308]: new group: name=libvirt, GID=42473
Oct 14 09:33:36 np0005486759.ooo.test sudo[187305]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:37 np0005486759.ooo.test sudo[187403]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dufzqokorjouluevavzrwkrqirnfqdox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434416.6530132-320-161815340025089/AnsiballZ_user.py
Oct 14 09:33:37 np0005486759.ooo.test sudo[187403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:37 np0005486759.ooo.test python3.9[187405]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486759.ooo.test update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 14 09:33:37 np0005486759.ooo.test useradd[187407]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Oct 14 09:33:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16119 DF PROTO=TCP SPT=44680 DPT=9100 SEQ=4134968808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F882CC10000000001030307) 
Oct 14 09:33:37 np0005486759.ooo.test sudo[187403]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:38 np0005486759.ooo.test sudo[187503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxwyzxyfrhhcpdgbhwxneqdnlmxehbdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434417.7824936-331-13565142845985/AnsiballZ_setup.py
Oct 14 09:33:38 np0005486759.ooo.test sudo[187503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:38 np0005486759.ooo.test python3.9[187505]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:33:38 np0005486759.ooo.test sudo[187503]: pam_unix(sudo:session): session closed for user root
Oct 14 09:33:39 np0005486759.ooo.test sudo[187557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibgaucbsqusjznhyqhlzprxnnsckfbhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434417.7824936-331-13565142845985/AnsiballZ_dnf.py
Oct 14 09:33:39 np0005486759.ooo.test sudo[187557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:33:39 np0005486759.ooo.test python3.9[187559]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:33:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16120 DF PROTO=TCP SPT=44680 DPT=9100 SEQ=4134968808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F883C810000000001030307) 
Oct 14 09:33:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16486 DF PROTO=TCP SPT=33334 DPT=9882 SEQ=3319907510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8840080000000001030307) 
Oct 14 09:33:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16487 DF PROTO=TCP SPT=33334 DPT=9882 SEQ=3319907510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8844010000000001030307) 
Oct 14 09:33:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24854 DF PROTO=TCP SPT=32944 DPT=9105 SEQ=3108630786 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8852C10000000001030307) 
Oct 14 09:33:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16489 DF PROTO=TCP SPT=33334 DPT=9882 SEQ=3319907510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F885BC20000000001030307) 
Oct 14 09:33:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:33:50 np0005486759.ooo.test podman[187634]: 2025-10-14 09:33:50.457805826 +0000 UTC m=+0.084588427 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:33:50 np0005486759.ooo.test podman[187634]: 2025-10-14 09:33:50.492336622 +0000 UTC m=+0.119119223 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, container_name=ovn_controller)
Oct 14 09:33:50 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:33:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9350 DF PROTO=TCP SPT=48480 DPT=9102 SEQ=3782574173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8868410000000001030307) 
Oct 14 09:33:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:33:53 np0005486759.ooo.test podman[187659]: 2025-10-14 09:33:53.463452547 +0000 UTC m=+0.088793783 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:33:53 np0005486759.ooo.test podman[187659]: 2025-10-14 09:33:53.498528303 +0000 UTC m=+0.123869519 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 14 09:33:53 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:33:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:33:54.132 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:33:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:33:54.133 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:33:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:33:54.134 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:33:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9351 DF PROTO=TCP SPT=48480 DPT=9102 SEQ=3782574173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8878010000000001030307) 
Oct 14 09:34:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27756 DF PROTO=TCP SPT=52608 DPT=9100 SEQ=709034240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8895E80000000001030307) 
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  Converting 2750 SID table entries...
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:34:04 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:34:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27757 DF PROTO=TCP SPT=52608 DPT=9100 SEQ=709034240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F889A020000000001030307) 
Oct 14 09:34:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27758 DF PROTO=TCP SPT=52608 DPT=9100 SEQ=709034240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88A2010000000001030307) 
Oct 14 09:34:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27759 DF PROTO=TCP SPT=52608 DPT=9100 SEQ=709034240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88B1C10000000001030307) 
Oct 14 09:34:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39602 DF PROTO=TCP SPT=47952 DPT=9882 SEQ=4043832771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88B5380000000001030307) 
Oct 14 09:34:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39603 DF PROTO=TCP SPT=47952 DPT=9882 SEQ=4043832771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88B9410000000001030307) 
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  Converting 2753 SID table entries...
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:34:14 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:34:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50302 DF PROTO=TCP SPT=34174 DPT=9105 SEQ=3975119881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88C8020000000001030307) 
Oct 14 09:34:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39605 DF PROTO=TCP SPT=47952 DPT=9882 SEQ=4043832771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88D1020000000001030307) 
Oct 14 09:34:21 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Oct 14 09:34:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:34:21 np0005486759.ooo.test podman[188628]: 2025-10-14 09:34:21.439130131 +0000 UTC m=+0.055729574 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:34:21 np0005486759.ooo.test podman[188628]: 2025-10-14 09:34:21.523745245 +0000 UTC m=+0.140344698 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:34:21 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:34:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37749 DF PROTO=TCP SPT=39304 DPT=9102 SEQ=960811475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88DD810000000001030307) 
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  Converting 2753 SID table entries...
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:34:23 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:34:24 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=21 res=1
Oct 14 09:34:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:34:24 np0005486759.ooo.test podman[188662]: 2025-10-14 09:34:24.43798175 +0000 UTC m=+0.059918156 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 09:34:24 np0005486759.ooo.test podman[188662]: 2025-10-14 09:34:24.447360908 +0000 UTC m=+0.069297404 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:34:24 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:34:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37750 DF PROTO=TCP SPT=39304 DPT=9102 SEQ=960811475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F88ED420000000001030307) 
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  Converting 2753 SID table entries...
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:34:31 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:34:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13876 DF PROTO=TCP SPT=50248 DPT=9100 SEQ=1572009248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F890B170000000001030307) 
Oct 14 09:34:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13877 DF PROTO=TCP SPT=50248 DPT=9100 SEQ=1572009248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F890F010000000001030307) 
Oct 14 09:34:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13878 DF PROTO=TCP SPT=50248 DPT=9100 SEQ=1572009248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8917010000000001030307) 
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  Converting 2753 SID table entries...
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:34:41 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:34:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13879 DF PROTO=TCP SPT=50248 DPT=9100 SEQ=1572009248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8926C10000000001030307) 
Oct 14 09:34:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29944 DF PROTO=TCP SPT=35840 DPT=9882 SEQ=4217603023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F892A680000000001030307) 
Oct 14 09:34:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29945 DF PROTO=TCP SPT=35840 DPT=9882 SEQ=4217603023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F892E820000000001030307) 
Oct 14 09:34:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16973 DF PROTO=TCP SPT=54118 DPT=9105 SEQ=336379236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F893D410000000001030307) 
Oct 14 09:34:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29947 DF PROTO=TCP SPT=35840 DPT=9882 SEQ=4217603023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8946420000000001030307) 
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  Converting 2753 SID table entries...
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:34:49 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:34:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:34:50 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=24 res=1
Oct 14 09:34:50 np0005486759.ooo.test systemd-rc-local-generator[188734]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:34:50 np0005486759.ooo.test systemd-sysv-generator[188739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:34:50 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:34:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:34:50 np0005486759.ooo.test systemd-sysv-generator[188778]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:34:50 np0005486759.ooo.test systemd-rc-local-generator[188775]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:34:50 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:34:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:34:52 np0005486759.ooo.test podman[188791]: 2025-10-14 09:34:52.441412253 +0000 UTC m=+0.069251403 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:34:52 np0005486759.ooo.test podman[188791]: 2025-10-14 09:34:52.4895129 +0000 UTC m=+0.117352070 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 09:34:52 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:34:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55265 DF PROTO=TCP SPT=59892 DPT=9102 SEQ=1049027377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8952C10000000001030307) 
Oct 14 09:34:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:34:54.134 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:34:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:34:54.135 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:34:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:34:54.137 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:34:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:34:55 np0005486759.ooo.test systemd[1]: tmp-crun.PcGGPJ.mount: Deactivated successfully.
Oct 14 09:34:55 np0005486759.ooo.test podman[188815]: 2025-10-14 09:34:55.477329472 +0000 UTC m=+0.106016685 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 09:34:55 np0005486759.ooo.test podman[188815]: 2025-10-14 09:34:55.486401402 +0000 UTC m=+0.115088635 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:34:55 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:34:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55266 DF PROTO=TCP SPT=59892 DPT=9102 SEQ=1049027377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8962820000000001030307) 
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  Converting 2754 SID table entries...
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  policy capability network_peer_controls=1
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  policy capability open_perms=1
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  policy capability extended_socket_class=1
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  policy capability always_check_network=0
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 14 09:34:59 np0005486759.ooo.test kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 14 09:35:00 np0005486759.ooo.test groupadd[188841]: group added to /etc/group: name=clevis, GID=985
Oct 14 09:35:00 np0005486759.ooo.test groupadd[188841]: group added to /etc/gshadow: name=clevis
Oct 14 09:35:00 np0005486759.ooo.test groupadd[188841]: new group: name=clevis, GID=985
Oct 14 09:35:00 np0005486759.ooo.test useradd[188848]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Oct 14 09:35:00 np0005486759.ooo.test usermod[188858]: add 'clevis' to group 'tss'
Oct 14 09:35:00 np0005486759.ooo.test usermod[188858]: add 'clevis' to shadow group 'tss'
Oct 14 09:35:03 np0005486759.ooo.test groupadd[188880]: group added to /etc/group: name=dnsmasq, GID=984
Oct 14 09:35:03 np0005486759.ooo.test groupadd[188880]: group added to /etc/gshadow: name=dnsmasq
Oct 14 09:35:03 np0005486759.ooo.test groupadd[188880]: new group: name=dnsmasq, GID=984
Oct 14 09:35:03 np0005486759.ooo.test useradd[188887]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Oct 14 09:35:03 np0005486759.ooo.test dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Oct 14 09:35:03 np0005486759.ooo.test dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=25 res=1
Oct 14 09:35:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16403 DF PROTO=TCP SPT=40394 DPT=9100 SEQ=233235581 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8980470000000001030307) 
Oct 14 09:35:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16404 DF PROTO=TCP SPT=40394 DPT=9100 SEQ=233235581 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8984410000000001030307) 
Oct 14 09:35:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16405 DF PROTO=TCP SPT=40394 DPT=9100 SEQ=233235581 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F898C410000000001030307) 
Oct 14 09:35:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16406 DF PROTO=TCP SPT=40394 DPT=9100 SEQ=233235581 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F899C010000000001030307) 
Oct 14 09:35:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8073 DF PROTO=TCP SPT=57632 DPT=9882 SEQ=2775018305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F899F980000000001030307) 
Oct 14 09:35:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8074 DF PROTO=TCP SPT=57632 DPT=9882 SEQ=2775018305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F89A3810000000001030307) 
Oct 14 09:35:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5812 DF PROTO=TCP SPT=57878 DPT=9105 SEQ=1052324724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F89B2410000000001030307) 
Oct 14 09:35:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8076 DF PROTO=TCP SPT=57632 DPT=9882 SEQ=2775018305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F89BB410000000001030307) 
Oct 14 09:35:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16574 DF PROTO=TCP SPT=35366 DPT=9102 SEQ=555718247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F89C7C10000000001030307) 
Oct 14 09:35:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:35:23 np0005486759.ooo.test systemd[1]: tmp-crun.4o5WKM.mount: Deactivated successfully.
Oct 14 09:35:23 np0005486759.ooo.test podman[195003]: 2025-10-14 09:35:23.484004859 +0000 UTC m=+0.099911297 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:35:23 np0005486759.ooo.test podman[195003]: 2025-10-14 09:35:23.581556475 +0000 UTC m=+0.197462873 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 14 09:35:23 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:35:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:35:26 np0005486759.ooo.test podman[197688]: 2025-10-14 09:35:26.45748339 +0000 UTC m=+0.076389737 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:35:26 np0005486759.ooo.test podman[197688]: 2025-10-14 09:35:26.491383878 +0000 UTC m=+0.110290235 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 09:35:26 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:35:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16575 DF PROTO=TCP SPT=35366 DPT=9102 SEQ=555718247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F89D7810000000001030307) 
Oct 14 09:35:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6121 DF PROTO=TCP SPT=41574 DPT=9100 SEQ=2360505952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F89F5770000000001030307) 
Oct 14 09:35:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6122 DF PROTO=TCP SPT=41574 DPT=9100 SEQ=2360505952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F89F9810000000001030307) 
Oct 14 09:35:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6123 DF PROTO=TCP SPT=41574 DPT=9100 SEQ=2360505952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A01820000000001030307) 
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Reloading rules
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Collecting garbage unconditionally...
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Loading rules from directory /etc/polkit-1/rules.d
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Finished loading, compiling and executing 5 rules
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Reloading rules
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Collecting garbage unconditionally...
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Loading rules from directory /etc/polkit-1/rules.d
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 14 09:35:38 np0005486759.ooo.test polkitd[1035]: Finished loading, compiling and executing 5 rules
Oct 14 09:35:39 np0005486759.ooo.test groupadd[205863]: group added to /etc/group: name=ceph, GID=167
Oct 14 09:35:39 np0005486759.ooo.test groupadd[205863]: group added to /etc/gshadow: name=ceph
Oct 14 09:35:39 np0005486759.ooo.test groupadd[205863]: new group: name=ceph, GID=167
Oct 14 09:35:39 np0005486759.ooo.test useradd[205869]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Oct 14 09:35:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6124 DF PROTO=TCP SPT=41574 DPT=9100 SEQ=2360505952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A11410000000001030307) 
Oct 14 09:35:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1390 DF PROTO=TCP SPT=46428 DPT=9882 SEQ=951206866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A14CF0000000001030307) 
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: Stopping OpenSSH server daemon...
Oct 14 09:35:42 np0005486759.ooo.test sshd[144249]: Received signal 15; terminating.
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: sshd.service: Deactivated successfully.
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: Stopped OpenSSH server daemon.
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: Stopped target sshd-keygen.target.
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: Stopping sshd-keygen.target...
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: Reached target sshd-keygen.target.
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: Starting OpenSSH server daemon...
Oct 14 09:35:42 np0005486759.ooo.test sshd[206480]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:35:42 np0005486759.ooo.test sshd[206480]: Server listening on 0.0.0.0 port 22.
Oct 14 09:35:42 np0005486759.ooo.test sshd[206480]: Server listening on :: port 22.
Oct 14 09:35:42 np0005486759.ooo.test systemd[1]: Started OpenSSH server daemon.
Oct 14 09:35:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1391 DF PROTO=TCP SPT=46428 DPT=9882 SEQ=951206866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A18C10000000001030307) 
Oct 14 09:35:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 09:35:44 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 09:35:44 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:44 np0005486759.ooo.test systemd-rc-local-generator[206710]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:44 np0005486759.ooo.test systemd-sysv-generator[206715]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:44 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:44 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 09:35:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 09:35:47 np0005486759.ooo.test sudo[187557]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53349 DF PROTO=TCP SPT=52104 DPT=9105 SEQ=342507981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A27810000000001030307) 
Oct 14 09:35:47 np0005486759.ooo.test sudo[211312]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvzdtlaavlsrbckqbzmwlqcgtvkanerk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434547.1712165-343-26337721034464/AnsiballZ_systemd.py
Oct 14 09:35:47 np0005486759.ooo.test sudo[211312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:48 np0005486759.ooo.test python3.9[211334]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:35:48 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:48 np0005486759.ooo.test systemd-rc-local-generator[211721]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:48 np0005486759.ooo.test systemd-sysv-generator[211726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:48 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:48 np0005486759.ooo.test sudo[211312]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:48 np0005486759.ooo.test sudo[212481]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idvjmysksfjlommuurmbekvxanmaqekm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434548.6205003-343-273701929478457/AnsiballZ_systemd.py
Oct 14 09:35:48 np0005486759.ooo.test sudo[212481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:49 np0005486759.ooo.test python3.9[212504]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:35:49 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:49 np0005486759.ooo.test systemd-rc-local-generator[212708]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:49 np0005486759.ooo.test systemd-sysv-generator[212711]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1393 DF PROTO=TCP SPT=46428 DPT=9882 SEQ=951206866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A30810000000001030307) 
Oct 14 09:35:49 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:49 np0005486759.ooo.test sudo[212481]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:49 np0005486759.ooo.test sudo[213017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwfqgbpiungmxwcobsnzcxlyyzesseqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434549.6893017-343-37610573285930/AnsiballZ_systemd.py
Oct 14 09:35:49 np0005486759.ooo.test sudo[213017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:50 np0005486759.ooo.test python3.9[213043]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:35:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:50 np0005486759.ooo.test systemd-rc-local-generator[213394]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:50 np0005486759.ooo.test systemd-sysv-generator[213399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:50 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:50 np0005486759.ooo.test sudo[213017]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:51 np0005486759.ooo.test sudo[213813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtnxteujkhgukbsucyqsknkwvmfutuam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434550.8001628-343-101410967666360/AnsiballZ_systemd.py
Oct 14 09:35:51 np0005486759.ooo.test sudo[213813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:51 np0005486759.ooo.test python3.9[213831]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:35:51 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:51 np0005486759.ooo.test systemd-sysv-generator[214155]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:51 np0005486759.ooo.test systemd-rc-local-generator[214151]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:51 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:51 np0005486759.ooo.test sudo[213813]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:52 np0005486759.ooo.test sudo[214665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sushqfshbkcuhdkwnxgjkkzyhebooqqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434551.9073973-372-230925590125724/AnsiballZ_systemd.py
Oct 14 09:35:52 np0005486759.ooo.test sudo[214665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:52 np0005486759.ooo.test python3.9[214683]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:35:52 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2700 DF PROTO=TCP SPT=37496 DPT=9102 SEQ=165082532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A3D020000000001030307) 
Oct 14 09:35:52 np0005486759.ooo.test systemd-sysv-generator[215056]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:52 np0005486759.ooo.test systemd-rc-local-generator[215050]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:52 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:52 np0005486759.ooo.test sudo[214665]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:53 np0005486759.ooo.test sudo[215554]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfneyjsbcvubstculfblkexamordmxvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434552.9956548-372-268723813259421/AnsiballZ_systemd.py
Oct 14 09:35:53 np0005486759.ooo.test sudo[215554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:53 np0005486759.ooo.test python3.9[215581]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:35:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:35:53 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:53 np0005486759.ooo.test podman[215776]: 2025-10-14 09:35:53.709059268 +0000 UTC m=+0.077158022 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, container_name=ovn_controller)
Oct 14 09:35:53 np0005486759.ooo.test systemd-sysv-generator[215891]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:53 np0005486759.ooo.test systemd-rc-local-generator[215885]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:53 np0005486759.ooo.test podman[215776]: 2025-10-14 09:35:53.786388645 +0000 UTC m=+0.154487369 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller)
Oct 14 09:35:53 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:53 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:35:53 np0005486759.ooo.test sudo[215554]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:35:54.136 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:35:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:35:54.136 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:35:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:35:54.137 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:35:54 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 09:35:54 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 09:35:54 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Consumed 11.988s CPU time.
Oct 14 09:35:54 np0005486759.ooo.test systemd[1]: run-r71aa3f410ddf4e968c63180e6a198fb1.service: Deactivated successfully.
Oct 14 09:35:54 np0005486759.ooo.test systemd[1]: run-rb807e188c20245e4a9933d5c7c547c06.service: Deactivated successfully.
Oct 14 09:35:54 np0005486759.ooo.test sudo[216229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibfaymdwczpcjnvxdwehinmtymsizmuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434554.094037-372-160568430686363/AnsiballZ_systemd.py
Oct 14 09:35:54 np0005486759.ooo.test sudo[216229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:54 np0005486759.ooo.test python3.9[216231]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:35:54 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:54 np0005486759.ooo.test systemd-sysv-generator[216266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:54 np0005486759.ooo.test systemd-rc-local-generator[216262]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:54 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:55 np0005486759.ooo.test sudo[216229]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:55 np0005486759.ooo.test sudo[216378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orcjsnmathdewwyyzytchiduchkqzjpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434555.1514914-372-245842711797806/AnsiballZ_systemd.py
Oct 14 09:35:55 np0005486759.ooo.test sudo[216378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:55 np0005486759.ooo.test python3.9[216380]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:35:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2701 DF PROTO=TCP SPT=37496 DPT=9102 SEQ=165082532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A4CC20000000001030307) 
Oct 14 09:35:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:35:56 np0005486759.ooo.test sudo[216378]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:56 np0005486759.ooo.test podman[216384]: 2025-10-14 09:35:56.916408202 +0000 UTC m=+0.087580612 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 09:35:56 np0005486759.ooo.test podman[216384]: 2025-10-14 09:35:56.924295427 +0000 UTC m=+0.095467817 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 09:35:56 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:35:57 np0005486759.ooo.test sudo[216509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqmivsbwelcvrzyxnpkfhbtiymxqxcih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434557.010401-372-247842125265011/AnsiballZ_systemd.py
Oct 14 09:35:57 np0005486759.ooo.test sudo[216509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:57 np0005486759.ooo.test python3.9[216511]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:35:57 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:35:57 np0005486759.ooo.test systemd-rc-local-generator[216539]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:35:57 np0005486759.ooo.test systemd-sysv-generator[216546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:35:57 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:35:58 np0005486759.ooo.test sudo[216509]: pam_unix(sudo:session): session closed for user root
Oct 14 09:35:59 np0005486759.ooo.test sudo[216658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obcodjcxmsvmspjtjfzfyuldsbteqraz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434558.8975692-408-53208099813036/AnsiballZ_systemd.py
Oct 14 09:35:59 np0005486759.ooo.test sudo[216658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:35:59 np0005486759.ooo.test python3.9[216660]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:36:00 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:36:00 np0005486759.ooo.test systemd-sysv-generator[216694]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:36:00 np0005486759.ooo.test systemd-rc-local-generator[216689]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:36:00 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:36:01 np0005486759.ooo.test sudo[216658]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:01 np0005486759.ooo.test sudo[216807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnxkgxqevnrrtihxddtkhsjeimtgtsna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434561.191852-416-273212190111063/AnsiballZ_systemd.py
Oct 14 09:36:01 np0005486759.ooo.test sudo[216807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:01 np0005486759.ooo.test python3.9[216809]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:01 np0005486759.ooo.test sudo[216807]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:02 np0005486759.ooo.test sudo[216920]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pulcvniwjzoxpgvvsvjarxzhpavuzytm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434562.035465-416-67983878759600/AnsiballZ_systemd.py
Oct 14 09:36:02 np0005486759.ooo.test sudo[216920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:02 np0005486759.ooo.test python3.9[216922]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:02 np0005486759.ooo.test sudo[216920]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:03 np0005486759.ooo.test sudo[217033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kokdwwfaezyyqmvakfkzxlsxaduejmpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434563.1197505-416-184045016452377/AnsiballZ_systemd.py
Oct 14 09:36:03 np0005486759.ooo.test sudo[217033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:03 np0005486759.ooo.test python3.9[217035]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59615 DF PROTO=TCP SPT=39652 DPT=9100 SEQ=1860130077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A6AA80000000001030307) 
Oct 14 09:36:04 np0005486759.ooo.test sudo[217033]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:05 np0005486759.ooo.test sudo[217146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oexfnwnhhdpydxqxhwlcbqtmyatwvlvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434564.9416502-416-114037323857334/AnsiballZ_systemd.py
Oct 14 09:36:05 np0005486759.ooo.test sudo[217146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59616 DF PROTO=TCP SPT=39652 DPT=9100 SEQ=1860130077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A6EC20000000001030307) 
Oct 14 09:36:05 np0005486759.ooo.test python3.9[217148]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:05 np0005486759.ooo.test sudo[217146]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:06 np0005486759.ooo.test sudo[217259]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnpbbwjfzqqglcakwdnuyxgrdzedoyds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434565.7137651-416-190602593774889/AnsiballZ_systemd.py
Oct 14 09:36:06 np0005486759.ooo.test sudo[217259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:06 np0005486759.ooo.test python3.9[217261]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:06 np0005486759.ooo.test sudo[217259]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:06 np0005486759.ooo.test sudo[217372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtzmqkjtforlytstdbenqritumvrgpiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434566.4893265-416-263568978590737/AnsiballZ_systemd.py
Oct 14 09:36:06 np0005486759.ooo.test sudo[217372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:07 np0005486759.ooo.test python3.9[217374]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:07 np0005486759.ooo.test sudo[217372]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59617 DF PROTO=TCP SPT=39652 DPT=9100 SEQ=1860130077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A76C20000000001030307) 
Oct 14 09:36:07 np0005486759.ooo.test sudo[217485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddfidniczmwtevsrkmlmygvkhbvzqylg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434567.298499-416-210504293251062/AnsiballZ_systemd.py
Oct 14 09:36:07 np0005486759.ooo.test sudo[217485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:07 np0005486759.ooo.test python3.9[217487]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:08 np0005486759.ooo.test sudo[217485]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:08 np0005486759.ooo.test sudo[217598]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzzathkcxerygsfurrpsgyqmpyuhyqbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434568.1499863-416-141036121652434/AnsiballZ_systemd.py
Oct 14 09:36:08 np0005486759.ooo.test sudo[217598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:08 np0005486759.ooo.test python3.9[217600]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:08 np0005486759.ooo.test sudo[217598]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:09 np0005486759.ooo.test sudo[217711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbmurlqzvtfchjtkatprnrgugqxuzrvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434568.9632437-416-138505938565933/AnsiballZ_systemd.py
Oct 14 09:36:09 np0005486759.ooo.test sudo[217711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:09 np0005486759.ooo.test python3.9[217713]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:09 np0005486759.ooo.test sudo[217711]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:10 np0005486759.ooo.test sudo[217824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntbyxtvftjmmmczkswysapwrsbclwhev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434569.831635-416-892532226123/AnsiballZ_systemd.py
Oct 14 09:36:10 np0005486759.ooo.test sudo[217824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:10 np0005486759.ooo.test python3.9[217826]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:10 np0005486759.ooo.test sudo[217824]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:10 np0005486759.ooo.test sudo[217937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzrprhepofrfronmqcojzqvijspeubmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434570.6086125-416-78028243965500/AnsiballZ_systemd.py
Oct 14 09:36:10 np0005486759.ooo.test sudo[217937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:11 np0005486759.ooo.test python3.9[217939]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:11 np0005486759.ooo.test sudo[217937]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59618 DF PROTO=TCP SPT=39652 DPT=9100 SEQ=1860130077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A86810000000001030307) 
Oct 14 09:36:11 np0005486759.ooo.test sudo[218050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqgyqhkmggheeemrfuguscqfevhxqvbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434571.359652-416-97801726719449/AnsiballZ_systemd.py
Oct 14 09:36:11 np0005486759.ooo.test sudo[218050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:11 np0005486759.ooo.test python3.9[218052]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:12 np0005486759.ooo.test sudo[218050]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8370 DF PROTO=TCP SPT=36072 DPT=9882 SEQ=3356181423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A89F80000000001030307) 
Oct 14 09:36:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8371 DF PROTO=TCP SPT=36072 DPT=9882 SEQ=3356181423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A8E010000000001030307) 
Oct 14 09:36:13 np0005486759.ooo.test sudo[218163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahopyqkcopssifeklnmatsazzzsbuvnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434572.1088448-416-38723396659054/AnsiballZ_systemd.py
Oct 14 09:36:13 np0005486759.ooo.test sudo[218163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:13 np0005486759.ooo.test python3.9[218165]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:13 np0005486759.ooo.test sudo[218163]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:14 np0005486759.ooo.test sudo[218276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weojmwfefvaxrlqjpqbougozijgyswlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434573.8941562-416-261056803018564/AnsiballZ_systemd.py
Oct 14 09:36:14 np0005486759.ooo.test sudo[218276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:14 np0005486759.ooo.test python3.9[218278]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 14 09:36:14 np0005486759.ooo.test sudo[218276]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:16 np0005486759.ooo.test sudo[218389]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssumrwkbluqiplytzhgjblmqxuviiefq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434575.8571923-518-90352734385513/AnsiballZ_file.py
Oct 14 09:36:16 np0005486759.ooo.test sudo[218389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:16 np0005486759.ooo.test python3.9[218391]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:36:16 np0005486759.ooo.test sudo[218389]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:16 np0005486759.ooo.test sudo[218499]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bznkxgrkajfoddjqdygttlnksykthhpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434576.532397-518-32685416924022/AnsiballZ_file.py
Oct 14 09:36:16 np0005486759.ooo.test sudo[218499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:17 np0005486759.ooo.test python3.9[218501]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:36:17 np0005486759.ooo.test sudo[218499]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34760 DF PROTO=TCP SPT=33470 DPT=9105 SEQ=4137800916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8A9CC10000000001030307) 
Oct 14 09:36:17 np0005486759.ooo.test sudo[218609]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfihjnmqdtdvuzntcyplveekumoeojkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434577.1831276-518-123434309903427/AnsiballZ_file.py
Oct 14 09:36:17 np0005486759.ooo.test sudo[218609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:17 np0005486759.ooo.test python3.9[218611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:36:17 np0005486759.ooo.test sudo[218609]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:18 np0005486759.ooo.test sudo[218719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tobqcatipppnrurfwrxrsyqtiwmweurd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434577.8445098-518-214011142872536/AnsiballZ_file.py
Oct 14 09:36:18 np0005486759.ooo.test sudo[218719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:18 np0005486759.ooo.test python3.9[218721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:36:18 np0005486759.ooo.test sudo[218719]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:18 np0005486759.ooo.test sudo[218829]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfybqwuygcznkirxarpdyxzhbpdgbiod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434578.4620001-518-86699010455725/AnsiballZ_file.py
Oct 14 09:36:18 np0005486759.ooo.test sudo[218829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:18 np0005486759.ooo.test python3.9[218831]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:36:18 np0005486759.ooo.test sudo[218829]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:19 np0005486759.ooo.test sudo[218939]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptbzuwnxkhknuemlaiydceimwbbcvbzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434579.0649884-518-205419265965457/AnsiballZ_file.py
Oct 14 09:36:19 np0005486759.ooo.test sudo[218939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8373 DF PROTO=TCP SPT=36072 DPT=9882 SEQ=3356181423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8AA5C20000000001030307) 
Oct 14 09:36:19 np0005486759.ooo.test python3.9[218941]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:36:19 np0005486759.ooo.test sudo[218939]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:20 np0005486759.ooo.test sudo[219049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqrdhfxndszjungygubramhjjfhuujfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434579.763644-561-32877515238153/AnsiballZ_stat.py
Oct 14 09:36:20 np0005486759.ooo.test sudo[219049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:20 np0005486759.ooo.test python3.9[219051]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:20 np0005486759.ooo.test sudo[219049]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:20 np0005486759.ooo.test sudo[219139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgqgyneqcweenmypwwzbjqkpveiuwqnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434579.763644-561-32877515238153/AnsiballZ_copy.py
Oct 14 09:36:20 np0005486759.ooo.test sudo[219139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:21 np0005486759.ooo.test python3.9[219141]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434579.763644-561-32877515238153/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:21 np0005486759.ooo.test sudo[219139]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:21 np0005486759.ooo.test sudo[219249]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gminguyfieylfauqehfirilsxppgivrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434581.2862723-561-14161508580900/AnsiballZ_stat.py
Oct 14 09:36:21 np0005486759.ooo.test sudo[219249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:21 np0005486759.ooo.test python3.9[219251]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:21 np0005486759.ooo.test sudo[219249]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:22 np0005486759.ooo.test sudo[219339]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssytaspundnstqralubtlakbnnhspwjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434581.2862723-561-14161508580900/AnsiballZ_copy.py
Oct 14 09:36:22 np0005486759.ooo.test sudo[219339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:22 np0005486759.ooo.test python3.9[219341]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434581.2862723-561-14161508580900/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:22 np0005486759.ooo.test sudo[219339]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11003 DF PROTO=TCP SPT=32988 DPT=9102 SEQ=4224516742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8AB2410000000001030307) 
Oct 14 09:36:22 np0005486759.ooo.test sudo[219449]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgkfaioyqiccjieujafvvvbezgucigdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434582.473924-561-128589208541255/AnsiballZ_stat.py
Oct 14 09:36:22 np0005486759.ooo.test sudo[219449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:23 np0005486759.ooo.test python3.9[219451]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:23 np0005486759.ooo.test sudo[219449]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:23 np0005486759.ooo.test sudo[219539]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuplkgvjsiaebqvdrqxlceiydmbiyhqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434582.473924-561-128589208541255/AnsiballZ_copy.py
Oct 14 09:36:23 np0005486759.ooo.test sudo[219539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:23 np0005486759.ooo.test python3.9[219541]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434582.473924-561-128589208541255/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:23 np0005486759.ooo.test sudo[219539]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:24 np0005486759.ooo.test sudo[219649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pywiqawjxmrmmcfdyvnatftkfvnfhonn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434583.751727-561-173127982920468/AnsiballZ_stat.py
Oct 14 09:36:24 np0005486759.ooo.test sudo[219649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:36:24 np0005486759.ooo.test systemd[1]: tmp-crun.YXOmxU.mount: Deactivated successfully.
Oct 14 09:36:24 np0005486759.ooo.test podman[219651]: 2025-10-14 09:36:24.148112026 +0000 UTC m=+0.059533865 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller)
Oct 14 09:36:24 np0005486759.ooo.test podman[219651]: 2025-10-14 09:36:24.218262789 +0000 UTC m=+0.129684618 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 09:36:24 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:36:24 np0005486759.ooo.test python3.9[219652]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:24 np0005486759.ooo.test sudo[219649]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:24 np0005486759.ooo.test sudo[219763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iihnquydarjcvnbxajqurmfqicgafewn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434583.751727-561-173127982920468/AnsiballZ_copy.py
Oct 14 09:36:24 np0005486759.ooo.test sudo[219763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:24 np0005486759.ooo.test python3.9[219765]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434583.751727-561-173127982920468/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:24 np0005486759.ooo.test sudo[219763]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:25 np0005486759.ooo.test sudo[219873]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrghuzxlakoguvsbrfnitigxqkwvkfoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434585.0412543-561-103301471692330/AnsiballZ_stat.py
Oct 14 09:36:25 np0005486759.ooo.test sudo[219873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:25 np0005486759.ooo.test python3.9[219875]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:25 np0005486759.ooo.test sudo[219873]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:25 np0005486759.ooo.test sudo[219963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idpizuhoehdwnfhnbxpxybrpibabyajw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434585.0412543-561-103301471692330/AnsiballZ_copy.py
Oct 14 09:36:25 np0005486759.ooo.test sudo[219963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:26 np0005486759.ooo.test python3.9[219965]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434585.0412543-561-103301471692330/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:26 np0005486759.ooo.test sudo[219963]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:26 np0005486759.ooo.test sudo[220073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxuipqivfrycutzrtlywaxhzerchldxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434586.1304517-561-221648775443963/AnsiballZ_stat.py
Oct 14 09:36:26 np0005486759.ooo.test sudo[220073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:26 np0005486759.ooo.test python3.9[220075]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11004 DF PROTO=TCP SPT=32988 DPT=9102 SEQ=4224516742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8AC2010000000001030307) 
Oct 14 09:36:26 np0005486759.ooo.test sudo[220073]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:26 np0005486759.ooo.test sudo[220163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnleavglyjrwsprlbyzoxxhxewqmfllp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434586.1304517-561-221648775443963/AnsiballZ_copy.py
Oct 14 09:36:26 np0005486759.ooo.test sudo[220163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:36:27 np0005486759.ooo.test podman[220166]: 2025-10-14 09:36:27.070696979 +0000 UTC m=+0.089799660 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:36:27 np0005486759.ooo.test podman[220166]: 2025-10-14 09:36:27.075798731 +0000 UTC m=+0.094901352 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 14 09:36:27 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:36:27 np0005486759.ooo.test python3.9[220165]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434586.1304517-561-221648775443963/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:27 np0005486759.ooo.test sudo[220163]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:27 np0005486759.ooo.test sudo[220292]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzroqaozelevcwykhbhtnwnybasqjicp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434587.2650938-561-175749431725016/AnsiballZ_stat.py
Oct 14 09:36:27 np0005486759.ooo.test sudo[220292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:27 np0005486759.ooo.test python3.9[220294]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:27 np0005486759.ooo.test sudo[220292]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:28 np0005486759.ooo.test sudo[220380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxakipilbdkfvgzfeypxvgrlvwhingwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434587.2650938-561-175749431725016/AnsiballZ_copy.py
Oct 14 09:36:28 np0005486759.ooo.test sudo[220380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:28 np0005486759.ooo.test python3.9[220382]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434587.2650938-561-175749431725016/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:28 np0005486759.ooo.test sudo[220380]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:29 np0005486759.ooo.test sudo[220490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vemzvavxyyeugbefrfiwasbadhzxtopb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434588.9369333-561-225168310138105/AnsiballZ_stat.py
Oct 14 09:36:29 np0005486759.ooo.test sudo[220490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:29 np0005486759.ooo.test python3.9[220492]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:29 np0005486759.ooo.test sudo[220490]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:29 np0005486759.ooo.test sudo[220580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idpdebwvmdgkgcptkgzrqvnajzrmddai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434588.9369333-561-225168310138105/AnsiballZ_copy.py
Oct 14 09:36:29 np0005486759.ooo.test sudo[220580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:29 np0005486759.ooo.test python3.9[220582]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760434588.9369333-561-225168310138105/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:29 np0005486759.ooo.test sudo[220580]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:31 np0005486759.ooo.test sudo[220690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tetyysznncapeusphesvxvrevmnkvfev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434590.884561-675-262994034732838/AnsiballZ_file.py
Oct 14 09:36:31 np0005486759.ooo.test sudo[220690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:31 np0005486759.ooo.test python3.9[220692]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:31 np0005486759.ooo.test sudo[220690]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:31 np0005486759.ooo.test sudo[220800]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uitnrubprhokeiylhvljypfbqwuslwpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434591.5635717-683-140018902646177/AnsiballZ_file.py
Oct 14 09:36:31 np0005486759.ooo.test sudo[220800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:32 np0005486759.ooo.test python3.9[220802]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:32 np0005486759.ooo.test sudo[220800]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:32 np0005486759.ooo.test sudo[220910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kczprhkdvnxnoppootwdyowjqcwzhund ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434592.1164327-683-36661155452185/AnsiballZ_file.py
Oct 14 09:36:32 np0005486759.ooo.test sudo[220910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:32 np0005486759.ooo.test python3.9[220912]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:32 np0005486759.ooo.test sudo[220910]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:32 np0005486759.ooo.test sudo[221020]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfptpiqlqkgcvflnzkbrembedzgifodw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434592.664364-683-60295785228008/AnsiballZ_file.py
Oct 14 09:36:32 np0005486759.ooo.test sudo[221020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:33 np0005486759.ooo.test python3.9[221022]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:33 np0005486759.ooo.test sudo[221020]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:33 np0005486759.ooo.test sudo[221130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utztvdvtgxwsgueorgukhtqphmyiapsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434593.3143888-683-85027288411106/AnsiballZ_file.py
Oct 14 09:36:33 np0005486759.ooo.test sudo[221130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:33 np0005486759.ooo.test python3.9[221132]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:33 np0005486759.ooo.test sudo[221130]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:34 np0005486759.ooo.test sudo[221240]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbtsnsnfhpvquwhwrzlaxwnikqjixubo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434593.879248-683-158236207048509/AnsiballZ_file.py
Oct 14 09:36:34 np0005486759.ooo.test sudo[221240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57073 DF PROTO=TCP SPT=49242 DPT=9100 SEQ=2966530074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8ADFD80000000001030307) 
Oct 14 09:36:34 np0005486759.ooo.test python3.9[221242]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:34 np0005486759.ooo.test sudo[221240]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:34 np0005486759.ooo.test sudo[221350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vljqomtxmmnavicyqptlwfexcqrmveut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434594.5015943-683-183862816055050/AnsiballZ_file.py
Oct 14 09:36:34 np0005486759.ooo.test sudo[221350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:34 np0005486759.ooo.test python3.9[221352]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:35 np0005486759.ooo.test sudo[221350]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57074 DF PROTO=TCP SPT=49242 DPT=9100 SEQ=2966530074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8AE3C10000000001030307) 
Oct 14 09:36:35 np0005486759.ooo.test sudo[221460]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyouqsedukddwqfgoyataonmwkancflg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434595.1332865-683-87120874262990/AnsiballZ_file.py
Oct 14 09:36:35 np0005486759.ooo.test sudo[221460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:35 np0005486759.ooo.test python3.9[221462]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:35 np0005486759.ooo.test sudo[221460]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:35 np0005486759.ooo.test sudo[221570]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghqkiicskigveucmdxqlfleiputoowxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434595.6885388-683-101631737002214/AnsiballZ_file.py
Oct 14 09:36:35 np0005486759.ooo.test sudo[221570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:36 np0005486759.ooo.test python3.9[221572]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:36 np0005486759.ooo.test sudo[221570]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:36 np0005486759.ooo.test sudo[221680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsjgqlqyvgbcdsjjypwwxbjrkiyvselq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434596.2978473-683-20563784153561/AnsiballZ_file.py
Oct 14 09:36:36 np0005486759.ooo.test sudo[221680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:36 np0005486759.ooo.test python3.9[221682]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:36 np0005486759.ooo.test sudo[221680]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:37 np0005486759.ooo.test sudo[221790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxyazsxuaxajudlrxgquywqqeeigiwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434596.932057-683-198065718020747/AnsiballZ_file.py
Oct 14 09:36:37 np0005486759.ooo.test sudo[221790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57075 DF PROTO=TCP SPT=49242 DPT=9100 SEQ=2966530074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8AEBC20000000001030307) 
Oct 14 09:36:37 np0005486759.ooo.test python3.9[221792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:37 np0005486759.ooo.test sudo[221790]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:37 np0005486759.ooo.test sudo[221900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juxzxdpeofsinrzszixjfaczppcvonpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434597.5621367-683-13650638942526/AnsiballZ_file.py
Oct 14 09:36:37 np0005486759.ooo.test sudo[221900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:38 np0005486759.ooo.test python3.9[221902]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:38 np0005486759.ooo.test sudo[221900]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:38 np0005486759.ooo.test sudo[222010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgjjdaeryaaomimamocnstlpriezufxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434598.1714773-683-151392707626895/AnsiballZ_file.py
Oct 14 09:36:38 np0005486759.ooo.test sudo[222010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:38 np0005486759.ooo.test python3.9[222012]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:38 np0005486759.ooo.test sudo[222010]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:39 np0005486759.ooo.test sudo[222120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhybxpnusylmsarzzpwvewmkmeqjgwyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434598.7572658-683-189251415644889/AnsiballZ_file.py
Oct 14 09:36:39 np0005486759.ooo.test sudo[222120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:39 np0005486759.ooo.test python3.9[222122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:39 np0005486759.ooo.test sudo[222120]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:39 np0005486759.ooo.test sudo[222230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-strbsrnxizdlliumhisukebqnaqtaxjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434599.406102-683-135056792866569/AnsiballZ_file.py
Oct 14 09:36:39 np0005486759.ooo.test sudo[222230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:39 np0005486759.ooo.test python3.9[222232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:39 np0005486759.ooo.test sudo[222230]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:40 np0005486759.ooo.test sudo[222340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqpranxuzhgdxrgkuntnypmnpslaghna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434600.1019018-782-166921674187892/AnsiballZ_stat.py
Oct 14 09:36:40 np0005486759.ooo.test sudo[222340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:40 np0005486759.ooo.test python3.9[222342]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:40 np0005486759.ooo.test sudo[222340]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:40 np0005486759.ooo.test sudo[222428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuecqkmisdwzeydnkveexmzkuxhoieux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434600.1019018-782-166921674187892/AnsiballZ_copy.py
Oct 14 09:36:40 np0005486759.ooo.test sudo[222428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:41 np0005486759.ooo.test python3.9[222430]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434600.1019018-782-166921674187892/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:41 np0005486759.ooo.test sudo[222428]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57076 DF PROTO=TCP SPT=49242 DPT=9100 SEQ=2966530074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8AFB810000000001030307) 
Oct 14 09:36:41 np0005486759.ooo.test sudo[222538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezvmdnqttpvwpanicovpwflqnhtbkuny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434601.2402253-782-231619674152651/AnsiballZ_stat.py
Oct 14 09:36:41 np0005486759.ooo.test sudo[222538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:41 np0005486759.ooo.test python3.9[222540]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:41 np0005486759.ooo.test sudo[222538]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52948 DF PROTO=TCP SPT=50168 DPT=9882 SEQ=3085373200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8AFF280000000001030307) 
Oct 14 09:36:43 np0005486759.ooo.test sudo[222626]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uukmcbmlwehmbpbbvwjophqabsjpozuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434601.2402253-782-231619674152651/AnsiballZ_copy.py
Oct 14 09:36:43 np0005486759.ooo.test sudo[222626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:43 np0005486759.ooo.test python3.9[222628]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434601.2402253-782-231619674152651/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:43 np0005486759.ooo.test sudo[222626]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52949 DF PROTO=TCP SPT=50168 DPT=9882 SEQ=3085373200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B03410000000001030307) 
Oct 14 09:36:43 np0005486759.ooo.test sudo[222736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-przpwilcfkfviitvwnfgoqxmwtsvpyxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434603.3837845-782-48372204410515/AnsiballZ_stat.py
Oct 14 09:36:43 np0005486759.ooo.test sudo[222736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:43 np0005486759.ooo.test python3.9[222738]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:43 np0005486759.ooo.test sudo[222736]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:44 np0005486759.ooo.test sudo[222824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exnnsbnyuiyzvbxdhxwngeqvcmpkkqje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434603.3837845-782-48372204410515/AnsiballZ_copy.py
Oct 14 09:36:44 np0005486759.ooo.test sudo[222824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:44 np0005486759.ooo.test python3.9[222826]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434603.3837845-782-48372204410515/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:44 np0005486759.ooo.test sudo[222824]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:44 np0005486759.ooo.test sudo[222934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntfwvexyktxzebqkwbsygkczliyvrcpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434604.6268318-782-72247233510218/AnsiballZ_stat.py
Oct 14 09:36:44 np0005486759.ooo.test sudo[222934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:45 np0005486759.ooo.test python3.9[222936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:45 np0005486759.ooo.test sudo[222934]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:46 np0005486759.ooo.test sudo[223022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urtllfxfdxkpxnadqkgkmbbunbzxwgza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434604.6268318-782-72247233510218/AnsiballZ_copy.py
Oct 14 09:36:46 np0005486759.ooo.test sudo[223022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:46 np0005486759.ooo.test python3.9[223024]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434604.6268318-782-72247233510218/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:46 np0005486759.ooo.test sudo[223022]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:46 np0005486759.ooo.test sudo[223132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnrgpzaxghxysuphxgtcequrjxxoiayo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434606.4642882-782-208630863130429/AnsiballZ_stat.py
Oct 14 09:36:46 np0005486759.ooo.test sudo[223132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:46 np0005486759.ooo.test python3.9[223134]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:46 np0005486759.ooo.test sudo[223132]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11839 DF PROTO=TCP SPT=49546 DPT=9105 SEQ=2683372630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B12010000000001030307) 
Oct 14 09:36:47 np0005486759.ooo.test sudo[223220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiochsscxsfyexfirlezsxancifbxfkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434606.4642882-782-208630863130429/AnsiballZ_copy.py
Oct 14 09:36:47 np0005486759.ooo.test sudo[223220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:47 np0005486759.ooo.test python3.9[223222]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434606.4642882-782-208630863130429/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:47 np0005486759.ooo.test sudo[223220]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:47 np0005486759.ooo.test sudo[223330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoikpdsdtrqkfykdxushqjsjsxxrwcrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434607.6257153-782-34594373400092/AnsiballZ_stat.py
Oct 14 09:36:47 np0005486759.ooo.test sudo[223330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:48 np0005486759.ooo.test python3.9[223332]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:48 np0005486759.ooo.test sudo[223330]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:48 np0005486759.ooo.test sudo[223418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mganipvgwfrfackgbpzmqbgbslllobfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434607.6257153-782-34594373400092/AnsiballZ_copy.py
Oct 14 09:36:48 np0005486759.ooo.test sudo[223418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:48 np0005486759.ooo.test python3.9[223420]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434607.6257153-782-34594373400092/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:48 np0005486759.ooo.test sudo[223418]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:48 np0005486759.ooo.test sudo[223528]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpbfnkvcidhokqcavhgfujobjawqqeoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434608.7007551-782-249829356591298/AnsiballZ_stat.py
Oct 14 09:36:48 np0005486759.ooo.test sudo[223528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:49 np0005486759.ooo.test python3.9[223530]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:49 np0005486759.ooo.test sudo[223528]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:49 np0005486759.ooo.test sudo[223616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emoqpceucflhfkhmacbihefplitbafec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434608.7007551-782-249829356591298/AnsiballZ_copy.py
Oct 14 09:36:49 np0005486759.ooo.test sudo[223616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52951 DF PROTO=TCP SPT=50168 DPT=9882 SEQ=3085373200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B1B020000000001030307) 
Oct 14 09:36:49 np0005486759.ooo.test python3.9[223618]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434608.7007551-782-249829356591298/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:49 np0005486759.ooo.test sudo[223616]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:50 np0005486759.ooo.test sudo[223726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtkqnovqdbagjyiraxeuiyjmonhenkne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434609.7489443-782-188655833065866/AnsiballZ_stat.py
Oct 14 09:36:50 np0005486759.ooo.test sudo[223726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:50 np0005486759.ooo.test python3.9[223728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:50 np0005486759.ooo.test sudo[223726]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:50 np0005486759.ooo.test sudo[223814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gigxakbmrsgevzkmsjbtbkjsqilfuiai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434609.7489443-782-188655833065866/AnsiballZ_copy.py
Oct 14 09:36:50 np0005486759.ooo.test sudo[223814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:50 np0005486759.ooo.test python3.9[223816]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434609.7489443-782-188655833065866/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:50 np0005486759.ooo.test sudo[223814]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:51 np0005486759.ooo.test sudo[223924]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvmfkevcjtagtgwctjxquifxklyoyhnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434610.9479098-782-34608949931418/AnsiballZ_stat.py
Oct 14 09:36:51 np0005486759.ooo.test sudo[223924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:51 np0005486759.ooo.test python3.9[223926]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:51 np0005486759.ooo.test sudo[223924]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:51 np0005486759.ooo.test sudo[224012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awhtcmkhjbsnhsxqkzwqpdoimubepsgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434610.9479098-782-34608949931418/AnsiballZ_copy.py
Oct 14 09:36:51 np0005486759.ooo.test sudo[224012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:52 np0005486759.ooo.test python3.9[224014]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434610.9479098-782-34608949931418/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:52 np0005486759.ooo.test sudo[224012]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:52 np0005486759.ooo.test sudo[224122]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaafhhxgmatchpblhwfdtonvrtyrguso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434612.1531198-782-23224415124573/AnsiballZ_stat.py
Oct 14 09:36:52 np0005486759.ooo.test sudo[224122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21616 DF PROTO=TCP SPT=53934 DPT=9102 SEQ=1136016550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B27410000000001030307) 
Oct 14 09:36:52 np0005486759.ooo.test python3.9[224124]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:52 np0005486759.ooo.test sudo[224122]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:53 np0005486759.ooo.test sudo[224210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cktpcmywzlxupktlidnpfamtkbpxfakw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434612.1531198-782-23224415124573/AnsiballZ_copy.py
Oct 14 09:36:53 np0005486759.ooo.test sudo[224210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:53 np0005486759.ooo.test python3.9[224212]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434612.1531198-782-23224415124573/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:53 np0005486759.ooo.test sudo[224210]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:53 np0005486759.ooo.test sudo[224320]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aealweencrjwnoqrtydlomqwjgonoaun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434613.395285-782-236937069894715/AnsiballZ_stat.py
Oct 14 09:36:53 np0005486759.ooo.test sudo[224320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:53 np0005486759.ooo.test python3.9[224322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:53 np0005486759.ooo.test sudo[224320]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:36:54.137 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:36:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:36:54.137 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:36:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:36:54.139 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:36:54 np0005486759.ooo.test sudo[224408]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nowljnnprioughrrbvyaphiwaveefrqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434613.395285-782-236937069894715/AnsiballZ_copy.py
Oct 14 09:36:54 np0005486759.ooo.test sudo[224408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:36:54 np0005486759.ooo.test podman[224411]: 2025-10-14 09:36:54.337698634 +0000 UTC m=+0.078906812 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 14 09:36:54 np0005486759.ooo.test podman[224411]: 2025-10-14 09:36:54.430427651 +0000 UTC m=+0.171635839 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 09:36:54 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:36:54 np0005486759.ooo.test python3.9[224410]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434613.395285-782-236937069894715/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:54 np0005486759.ooo.test sudo[224408]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:54 np0005486759.ooo.test sudo[224541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjwhscenbmibofwkfupveekvstyomuti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434614.613099-782-171741034729869/AnsiballZ_stat.py
Oct 14 09:36:54 np0005486759.ooo.test sudo[224541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:55 np0005486759.ooo.test python3.9[224543]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:55 np0005486759.ooo.test sudo[224541]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:55 np0005486759.ooo.test sudo[224629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmktyyiinvpmutpiwyiofthnbcxstivg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434614.613099-782-171741034729869/AnsiballZ_copy.py
Oct 14 09:36:55 np0005486759.ooo.test sudo[224629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:55 np0005486759.ooo.test python3.9[224631]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434614.613099-782-171741034729869/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:55 np0005486759.ooo.test sudo[224629]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:56 np0005486759.ooo.test sudo[224739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbeefvftkdlhdheqaxpxymlkfeczrijg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434615.7546544-782-4832145283211/AnsiballZ_stat.py
Oct 14 09:36:56 np0005486759.ooo.test sudo[224739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:56 np0005486759.ooo.test python3.9[224741]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:56 np0005486759.ooo.test sudo[224739]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21617 DF PROTO=TCP SPT=53934 DPT=9102 SEQ=1136016550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B37010000000001030307) 
Oct 14 09:36:56 np0005486759.ooo.test sudo[224827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewxjpljcozmuecqatrrgdvseryzojzla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434615.7546544-782-4832145283211/AnsiballZ_copy.py
Oct 14 09:36:56 np0005486759.ooo.test sudo[224827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:56 np0005486759.ooo.test python3.9[224829]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434615.7546544-782-4832145283211/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:56 np0005486759.ooo.test sudo[224827]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:57 np0005486759.ooo.test sudo[224937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxdqehdcuyrmzklotexayhyajqsoclib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434616.9866984-782-149535281502736/AnsiballZ_stat.py
Oct 14 09:36:57 np0005486759.ooo.test sudo[224937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:36:57 np0005486759.ooo.test systemd[1]: tmp-crun.dPo54E.mount: Deactivated successfully.
Oct 14 09:36:57 np0005486759.ooo.test podman[224940]: 2025-10-14 09:36:57.341144383 +0000 UTC m=+0.080124432 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:36:57 np0005486759.ooo.test podman[224940]: 2025-10-14 09:36:57.370567498 +0000 UTC m=+0.109547547 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:36:57 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:36:57 np0005486759.ooo.test python3.9[224939]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:36:57 np0005486759.ooo.test sudo[224937]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:58 np0005486759.ooo.test sudo[225042]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrsdqlnvipdgmergeuztvrkhzoyxiefh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434616.9866984-782-149535281502736/AnsiballZ_copy.py
Oct 14 09:36:58 np0005486759.ooo.test sudo[225042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:36:58 np0005486759.ooo.test python3.9[225044]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434616.9866984-782-149535281502736/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:36:58 np0005486759.ooo.test sudo[225042]: pam_unix(sudo:session): session closed for user root
Oct 14 09:36:59 np0005486759.ooo.test python3.9[225152]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                         ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:37:00 np0005486759.ooo.test sudo[225263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtgfrvghofdmlagighsfzrqutvcrmwbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434619.6493206-988-58624316522948/AnsiballZ_seboolean.py
Oct 14 09:37:00 np0005486759.ooo.test sudo[225263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:00 np0005486759.ooo.test python3.9[225265]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 14 09:37:00 np0005486759.ooo.test sudo[225263]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:02 np0005486759.ooo.test sudo[225373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxmsvywngiwbfaksybyowzjpsmezbabe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434621.7117553-998-156433307209133/AnsiballZ_systemd.py
Oct 14 09:37:02 np0005486759.ooo.test sudo[225373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:02 np0005486759.ooo.test python3.9[225375]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:02 np0005486759.ooo.test systemd-rc-local-generator[225392]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:02 np0005486759.ooo.test systemd-sysv-generator[225398]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: Starting libvirt logging daemon socket...
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: Listening on libvirt logging daemon socket.
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: Starting libvirt logging daemon admin socket...
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: Starting libvirt logging daemon...
Oct 14 09:37:02 np0005486759.ooo.test systemd[1]: Started libvirt logging daemon.
Oct 14 09:37:02 np0005486759.ooo.test sudo[225373]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:03 np0005486759.ooo.test sudo[225524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpdzjhbnktkfuobklelipbdyfzlrcrdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434622.9674625-998-52976156378111/AnsiballZ_systemd.py
Oct 14 09:37:03 np0005486759.ooo.test sudo[225524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:03 np0005486759.ooo.test python3.9[225526]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:03 np0005486759.ooo.test systemd-rc-local-generator[225549]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:03 np0005486759.ooo.test systemd-sysv-generator[225554]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Starting libvirt nodedev daemon socket...
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Starting libvirt nodedev daemon...
Oct 14 09:37:03 np0005486759.ooo.test systemd[1]: Started libvirt nodedev daemon.
Oct 14 09:37:04 np0005486759.ooo.test sudo[225524]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:04 np0005486759.ooo.test systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 14 09:37:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18082 DF PROTO=TCP SPT=47994 DPT=9100 SEQ=2559266407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B55070000000001030307) 
Oct 14 09:37:04 np0005486759.ooo.test systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 14 09:37:04 np0005486759.ooo.test setroubleshoot[225589]: Deleting alert 395af390-a324-4a43-96e7-e57a24fdcf92, it is allowed in current policy
Oct 14 09:37:04 np0005486759.ooo.test sudo[225699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajzrfqpajdijfqkmvrnapvxcxchqcvbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434624.150709-998-90749202977321/AnsiballZ_systemd.py
Oct 14 09:37:04 np0005486759.ooo.test sudo[225699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:04 np0005486759.ooo.test systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Oct 14 09:37:04 np0005486759.ooo.test python3.9[225703]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:37:04 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:04 np0005486759.ooo.test systemd-rc-local-generator[225734]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:04 np0005486759.ooo.test systemd-sysv-generator[225737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:04 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Starting libvirt proxy daemon socket...
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Listening on libvirt proxy daemon socket.
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Starting libvirt proxy daemon...
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Started libvirt proxy daemon.
Oct 14 09:37:05 np0005486759.ooo.test sudo[225699]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18083 DF PROTO=TCP SPT=47994 DPT=9100 SEQ=2559266407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B59010000000001030307) 
Oct 14 09:37:05 np0005486759.ooo.test setroubleshoot[225589]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 97adb2f5-3db1-448a-aa12-4d2a93e77915
Oct 14 09:37:05 np0005486759.ooo.test setroubleshoot[225589]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                              
                                                              *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                              
                                                              If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                              Then turn on full auditing to get path information about the offending file and generate the error again.
                                                              Do
                                                              
                                                              Turn on full auditing
                                                              # auditctl -w /etc/shadow -p w
                                                              Try to recreate AVC. Then execute
                                                              # ausearch -m avc -ts recent
                                                              If you see PATH record check ownership/permissions on file, and fix it,
                                                              otherwise report as a bugzilla.
                                                              
                                                              *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                              
                                                              If you believe that virtlogd should have the dac_read_search capability by default.
                                                              Then you should report this as a bug.
                                                              You can generate a local policy module to allow this access.
                                                              Do
                                                              allow this access for now by executing:
                                                              # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                              # semodule -X 300 -i my-virtlogd.pp
                                                              
Oct 14 09:37:05 np0005486759.ooo.test sudo[225878]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqwtoxxbehkjberloprrtcrvoselonso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434625.3136842-998-108267494463480/AnsiballZ_systemd.py
Oct 14 09:37:05 np0005486759.ooo.test sudo[225878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:05 np0005486759.ooo.test python3.9[225880]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:37:05 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:06 np0005486759.ooo.test systemd-sysv-generator[225905]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:06 np0005486759.ooo.test systemd-rc-local-generator[225900]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Listening on libvirt locking daemon socket.
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Starting libvirt QEMU daemon socket...
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Starting libvirt QEMU daemon...
Oct 14 09:37:06 np0005486759.ooo.test systemd[1]: Started libvirt QEMU daemon.
Oct 14 09:37:06 np0005486759.ooo.test sudo[225878]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:06 np0005486759.ooo.test sudo[226059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziwarglbgybvhcmgjeyfzmpdgxfxlwcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434626.4568691-998-84332611551640/AnsiballZ_systemd.py
Oct 14 09:37:06 np0005486759.ooo.test sudo[226059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:07 np0005486759.ooo.test python3.9[226061]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:07 np0005486759.ooo.test systemd-rc-local-generator[226096]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:07 np0005486759.ooo.test systemd-sysv-generator[226101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18084 DF PROTO=TCP SPT=47994 DPT=9100 SEQ=2559266407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B61020000000001030307) 
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Starting libvirt secret daemon socket...
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Listening on libvirt secret daemon socket.
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Starting libvirt secret daemon admin socket...
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Starting libvirt secret daemon...
Oct 14 09:37:07 np0005486759.ooo.test systemd[1]: Started libvirt secret daemon.
Oct 14 09:37:07 np0005486759.ooo.test sudo[226059]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:08 np0005486759.ooo.test sudo[226240]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhbedvssdhrzmlzfbaccvddjblzakuae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434627.7773929-1035-146046247148329/AnsiballZ_file.py
Oct 14 09:37:08 np0005486759.ooo.test sudo[226240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:08 np0005486759.ooo.test python3.9[226242]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:08 np0005486759.ooo.test sudo[226240]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:08 np0005486759.ooo.test sudo[226350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykozbipowoydhfwbyrnlfcypikefxyqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434628.4820428-1043-4802192804472/AnsiballZ_find.py
Oct 14 09:37:08 np0005486759.ooo.test sudo[226350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:08 np0005486759.ooo.test python3.9[226352]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:37:08 np0005486759.ooo.test sudo[226350]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:09 np0005486759.ooo.test sudo[226460]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmnrrevjpfpnvvmbplsnegudaaevnsfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434629.3835595-1057-195930233244674/AnsiballZ_stat.py
Oct 14 09:37:09 np0005486759.ooo.test sudo[226460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:09 np0005486759.ooo.test python3.9[226462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:09 np0005486759.ooo.test sudo[226460]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:10 np0005486759.ooo.test sudo[226548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwojwvcuonxrsyohnjeflswgyrewtdan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434629.3835595-1057-195930233244674/AnsiballZ_copy.py
Oct 14 09:37:10 np0005486759.ooo.test sudo[226548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:10 np0005486759.ooo.test python3.9[226550]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434629.3835595-1057-195930233244674/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:10 np0005486759.ooo.test sudo[226548]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:11 np0005486759.ooo.test sudo[226658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwiqmudvxudyeewylulqowtfzfyffoly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434630.7331007-1073-47324583725645/AnsiballZ_file.py
Oct 14 09:37:11 np0005486759.ooo.test sudo[226658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:11 np0005486759.ooo.test python3.9[226660]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:11 np0005486759.ooo.test sudo[226658]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18085 DF PROTO=TCP SPT=47994 DPT=9100 SEQ=2559266407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B70C10000000001030307) 
Oct 14 09:37:11 np0005486759.ooo.test sudo[226768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igqemwaltgzebdwjijvefypweozbixcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434631.3581958-1081-25842229787237/AnsiballZ_stat.py
Oct 14 09:37:11 np0005486759.ooo.test sudo[226768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:11 np0005486759.ooo.test python3.9[226770]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:11 np0005486759.ooo.test sudo[226768]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:12 np0005486759.ooo.test sudo[226825]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppmuosecydvpakbfgxzqzvhzpfruvlyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434631.3581958-1081-25842229787237/AnsiballZ_file.py
Oct 14 09:37:12 np0005486759.ooo.test sudo[226825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47728 DF PROTO=TCP SPT=51820 DPT=9882 SEQ=393619919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B74580000000001030307) 
Oct 14 09:37:12 np0005486759.ooo.test python3.9[226827]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:12 np0005486759.ooo.test sudo[226825]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:12 np0005486759.ooo.test sudo[226935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujocbbdlnwcboiqdrbdwcdkfksinolqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434632.4983308-1093-170901864143939/AnsiballZ_stat.py
Oct 14 09:37:12 np0005486759.ooo.test sudo[226935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:13 np0005486759.ooo.test python3.9[226937]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:13 np0005486759.ooo.test sudo[226935]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47729 DF PROTO=TCP SPT=51820 DPT=9882 SEQ=393619919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B78410000000001030307) 
Oct 14 09:37:14 np0005486759.ooo.test sudo[226992]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ervkvlqxcdeezoggiazzdwnzogomsiud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434632.4983308-1093-170901864143939/AnsiballZ_file.py
Oct 14 09:37:14 np0005486759.ooo.test sudo[226992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:14 np0005486759.ooo.test python3.9[226994]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ivay_4xa recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:14 np0005486759.ooo.test sudo[226992]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:14 np0005486759.ooo.test sudo[227102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otrvsijvsylzrgkzcsqtcnhyscsnbsgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434634.5310788-1105-184890041963102/AnsiballZ_stat.py
Oct 14 09:37:14 np0005486759.ooo.test sudo[227102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:15 np0005486759.ooo.test python3.9[227104]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:15 np0005486759.ooo.test sudo[227102]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:15 np0005486759.ooo.test sudo[227159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wioixhkrwmcykdodvrkcceamlkxbeisk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434634.5310788-1105-184890041963102/AnsiballZ_file.py
Oct 14 09:37:15 np0005486759.ooo.test sudo[227159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:15 np0005486759.ooo.test python3.9[227161]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:15 np0005486759.ooo.test systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Oct 14 09:37:15 np0005486759.ooo.test sudo[227159]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:15 np0005486759.ooo.test systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 14 09:37:16 np0005486759.ooo.test sudo[227269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvyapgdiuhfnnzpjeecgixsuowvkcwuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434635.7186928-1118-228192245732312/AnsiballZ_command.py
Oct 14 09:37:16 np0005486759.ooo.test sudo[227269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63924 DF PROTO=TCP SPT=44000 DPT=9105 SEQ=2488329665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B87040000000001030307) 
Oct 14 09:37:17 np0005486759.ooo.test python3.9[227271]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:37:17 np0005486759.ooo.test sudo[227269]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:17 np0005486759.ooo.test sudo[227380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcxonuysmwmtiwarnpqpksfzndowawhy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434637.2716281-1126-197989630436778/AnsiballZ_edpm_nftables_from_files.py
Oct 14 09:37:17 np0005486759.ooo.test sudo[227380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:17 np0005486759.ooo.test python3[227382]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 14 09:37:17 np0005486759.ooo.test sudo[227380]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:18 np0005486759.ooo.test sudo[227490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfjnuoaizsbzciyjrdrurcrocmavcofo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434638.104536-1134-76642090350060/AnsiballZ_stat.py
Oct 14 09:37:18 np0005486759.ooo.test sudo[227490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:18 np0005486759.ooo.test python3.9[227492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:18 np0005486759.ooo.test sudo[227490]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:18 np0005486759.ooo.test sudo[227547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyxxpitvhjraxxxmufrgkaifxepvfnpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434638.104536-1134-76642090350060/AnsiballZ_file.py
Oct 14 09:37:18 np0005486759.ooo.test sudo[227547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:19 np0005486759.ooo.test python3.9[227549]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:19 np0005486759.ooo.test sudo[227547]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47731 DF PROTO=TCP SPT=51820 DPT=9882 SEQ=393619919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B90010000000001030307) 
Oct 14 09:37:19 np0005486759.ooo.test sudo[227657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhlbwmspfqypvrxfezrbrmgcirhfyguj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434639.3606412-1146-10352747457288/AnsiballZ_stat.py
Oct 14 09:37:19 np0005486759.ooo.test sudo[227657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:19 np0005486759.ooo.test python3.9[227659]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:19 np0005486759.ooo.test sudo[227657]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:20 np0005486759.ooo.test sudo[227714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ronxbocwoitmuqrcyhjcplghnznltvsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434639.3606412-1146-10352747457288/AnsiballZ_file.py
Oct 14 09:37:20 np0005486759.ooo.test sudo[227714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:20 np0005486759.ooo.test python3.9[227716]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:20 np0005486759.ooo.test sudo[227714]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:20 np0005486759.ooo.test sudo[227824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxjkupwswegjzfvtisyqnkinyddbiisi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434640.5376034-1158-272348303783502/AnsiballZ_stat.py
Oct 14 09:37:20 np0005486759.ooo.test sudo[227824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:20 np0005486759.ooo.test python3.9[227826]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:21 np0005486759.ooo.test sudo[227824]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:21 np0005486759.ooo.test sudo[227881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxgvklxtzfzdhuivddjyfzvvqibtmxci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434640.5376034-1158-272348303783502/AnsiballZ_file.py
Oct 14 09:37:21 np0005486759.ooo.test sudo[227881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:21 np0005486759.ooo.test python3.9[227883]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:21 np0005486759.ooo.test sudo[227881]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:21 np0005486759.ooo.test sudo[227991]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcaukqwnjmuaamphdsvkmdcvsonklepg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434641.6030264-1170-29204555929237/AnsiballZ_stat.py
Oct 14 09:37:21 np0005486759.ooo.test sudo[227991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:22 np0005486759.ooo.test python3.9[227993]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:22 np0005486759.ooo.test sudo[227991]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:22 np0005486759.ooo.test sudo[228048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxoyhcpqnsfuwjjjyqapkhzxhjjqlfjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434641.6030264-1170-29204555929237/AnsiballZ_file.py
Oct 14 09:37:22 np0005486759.ooo.test sudo[228048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21526 DF PROTO=TCP SPT=37560 DPT=9102 SEQ=1551731678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8B9C820000000001030307) 
Oct 14 09:37:22 np0005486759.ooo.test python3.9[228050]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:22 np0005486759.ooo.test sudo[228048]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:23 np0005486759.ooo.test sudo[228158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocfvumjahpkpdbvnhnxwkeatggjqfxoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434642.763207-1182-190546011630882/AnsiballZ_stat.py
Oct 14 09:37:23 np0005486759.ooo.test sudo[228158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:23 np0005486759.ooo.test python3.9[228160]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:23 np0005486759.ooo.test sudo[228158]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:23 np0005486759.ooo.test sudo[228248]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiynbdfwtgzwkqgvkcitcitqtgzighxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434642.763207-1182-190546011630882/AnsiballZ_copy.py
Oct 14 09:37:23 np0005486759.ooo.test sudo[228248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:24 np0005486759.ooo.test python3.9[228250]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760434642.763207-1182-190546011630882/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:24 np0005486759.ooo.test sudo[228248]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:24 np0005486759.ooo.test sudo[228358]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkkqjeyywumplwywijcinloomscjyqyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434644.235698-1197-99369061887513/AnsiballZ_file.py
Oct 14 09:37:24 np0005486759.ooo.test sudo[228358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:37:24 np0005486759.ooo.test systemd[1]: tmp-crun.p1DF27.mount: Deactivated successfully.
Oct 14 09:37:24 np0005486759.ooo.test podman[228361]: 2025-10-14 09:37:24.629732056 +0000 UTC m=+0.081397310 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:37:24 np0005486759.ooo.test podman[228361]: 2025-10-14 09:37:24.715134874 +0000 UTC m=+0.166800098 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Oct 14 09:37:24 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:37:24 np0005486759.ooo.test python3.9[228360]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:24 np0005486759.ooo.test sudo[228358]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:25 np0005486759.ooo.test sudo[228493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwmbshxztzavccufdzrhidliiyclksou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434644.927893-1205-112302958587252/AnsiballZ_command.py
Oct 14 09:37:25 np0005486759.ooo.test sudo[228493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:25 np0005486759.ooo.test python3.9[228495]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:37:25 np0005486759.ooo.test sudo[228493]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:26 np0005486759.ooo.test sudo[228606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asahjlywttqdumhefczacsrsnjmpsrgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434645.5652907-1213-148981285717638/AnsiballZ_blockinfile.py
Oct 14 09:37:26 np0005486759.ooo.test sudo[228606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:26 np0005486759.ooo.test python3.9[228608]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/edpm-chains.nft"
                                                         include "/etc/nftables/edpm-rules.nft"
                                                         include "/etc/nftables/edpm-jumps.nft"
                                                          path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:26 np0005486759.ooo.test sudo[228606]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21527 DF PROTO=TCP SPT=37560 DPT=9102 SEQ=1551731678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BAC410000000001030307) 
Oct 14 09:37:26 np0005486759.ooo.test sudo[228716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yahywwstuqmyzzvrhkrazyyxjtwletvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434646.459392-1222-232609106511278/AnsiballZ_command.py
Oct 14 09:37:26 np0005486759.ooo.test sudo[228716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:26 np0005486759.ooo.test python3.9[228718]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:37:27 np0005486759.ooo.test sudo[228716]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:27 np0005486759.ooo.test sudo[228827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izfiieejugghocpddyhxtuhuqktihvau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434647.136493-1230-231756748960534/AnsiballZ_stat.py
Oct 14 09:37:27 np0005486759.ooo.test sudo[228827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:37:27 np0005486759.ooo.test systemd[1]: tmp-crun.QnJHyJ.mount: Deactivated successfully.
Oct 14 09:37:27 np0005486759.ooo.test podman[228830]: 2025-10-14 09:37:27.487362739 +0000 UTC m=+0.083897127 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:37:27 np0005486759.ooo.test podman[228830]: 2025-10-14 09:37:27.492676635 +0000 UTC m=+0.089211033 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:37:27 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:37:27 np0005486759.ooo.test python3.9[228829]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:37:27 np0005486759.ooo.test sudo[228827]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:27 np0005486759.ooo.test sudo[228957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jendlenyqqkcphpjkodyjbayosxixfpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434647.7215793-1238-6093665133910/AnsiballZ_command.py
Oct 14 09:37:27 np0005486759.ooo.test sudo[228957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:28 np0005486759.ooo.test python3.9[228959]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:37:28 np0005486759.ooo.test sudo[228957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:28 np0005486759.ooo.test sudo[229070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdmeyqkasqgbitrlkgynndxlpdmnckuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434648.3399262-1246-194804470453397/AnsiballZ_file.py
Oct 14 09:37:28 np0005486759.ooo.test sudo[229070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:28 np0005486759.ooo.test python3.9[229072]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:28 np0005486759.ooo.test sudo[229070]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:29 np0005486759.ooo.test sudo[229180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuhlazoodnbtwjushuomyiojkbhovlzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434648.9849946-1254-174752747807017/AnsiballZ_stat.py
Oct 14 09:37:29 np0005486759.ooo.test sudo[229180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:29 np0005486759.ooo.test python3.9[229182]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:29 np0005486759.ooo.test sudo[229180]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:29 np0005486759.ooo.test sudo[229268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsgmgwiiudswkmdybddrqpprjhhnymho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434648.9849946-1254-174752747807017/AnsiballZ_copy.py
Oct 14 09:37:29 np0005486759.ooo.test sudo[229268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:30 np0005486759.ooo.test python3.9[229270]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434648.9849946-1254-174752747807017/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:30 np0005486759.ooo.test sudo[229268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:30 np0005486759.ooo.test sudo[229378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxyxgpxrbgpnuizpmvswenxryzyrvxim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434650.18311-1269-32223971890181/AnsiballZ_stat.py
Oct 14 09:37:30 np0005486759.ooo.test sudo[229378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:30 np0005486759.ooo.test python3.9[229380]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:30 np0005486759.ooo.test sudo[229378]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:31 np0005486759.ooo.test sudo[229466]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzbppipzceqdfwftmkwxwoevwtyeehxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434650.18311-1269-32223971890181/AnsiballZ_copy.py
Oct 14 09:37:31 np0005486759.ooo.test sudo[229466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:31 np0005486759.ooo.test python3.9[229468]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434650.18311-1269-32223971890181/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:31 np0005486759.ooo.test sudo[229466]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:31 np0005486759.ooo.test sudo[229576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rowwqjhlomnkqsbaxoxzfpmzlrvxbbrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434651.4103699-1284-190277054948613/AnsiballZ_stat.py
Oct 14 09:37:31 np0005486759.ooo.test sudo[229576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:31 np0005486759.ooo.test python3.9[229578]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:37:31 np0005486759.ooo.test sudo[229576]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:32 np0005486759.ooo.test sudo[229664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mspeqwlsupiypxjjgvyaragdvounlgwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434651.4103699-1284-190277054948613/AnsiballZ_copy.py
Oct 14 09:37:32 np0005486759.ooo.test sudo[229664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:32 np0005486759.ooo.test python3.9[229666]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434651.4103699-1284-190277054948613/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:32 np0005486759.ooo.test sudo[229664]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:32 np0005486759.ooo.test sudo[229774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqactpuyssbjntxorvhrpipomtlrsunt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434652.689039-1299-134717186745675/AnsiballZ_systemd.py
Oct 14 09:37:33 np0005486759.ooo.test sudo[229774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:33 np0005486759.ooo.test python3.9[229776]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:37:33 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:33 np0005486759.ooo.test systemd-rc-local-generator[229804]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:33 np0005486759.ooo.test systemd-sysv-generator[229808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:33 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:33 np0005486759.ooo.test systemd[1]: Reached target edpm_libvirt.target.
Oct 14 09:37:33 np0005486759.ooo.test sudo[229774]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:34 np0005486759.ooo.test sudo[229923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcvttnonqsnxtsqplvuedvoyrlgemsam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434653.863014-1307-64853193754393/AnsiballZ_systemd.py
Oct 14 09:37:34 np0005486759.ooo.test sudo[229923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58510 DF PROTO=TCP SPT=35694 DPT=9100 SEQ=1420128205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BCA370000000001030307) 
Oct 14 09:37:34 np0005486759.ooo.test python3.9[229925]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 14 09:37:34 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:34 np0005486759.ooo.test systemd-rc-local-generator[229950]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:34 np0005486759.ooo.test systemd-sysv-generator[229954]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:34 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:34 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:34 np0005486759.ooo.test systemd-rc-local-generator[229987]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:34 np0005486759.ooo.test systemd-sysv-generator[229993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:35 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:35 np0005486759.ooo.test sudo[229923]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58511 DF PROTO=TCP SPT=35694 DPT=9100 SEQ=1420128205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BCE410000000001030307) 
Oct 14 09:37:35 np0005486759.ooo.test sshd[183469]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:37:35 np0005486759.ooo.test systemd[1]: session-35.scope: Deactivated successfully.
Oct 14 09:37:35 np0005486759.ooo.test systemd[1]: session-35.scope: Consumed 3min 26.531s CPU time.
Oct 14 09:37:35 np0005486759.ooo.test systemd-logind[759]: Session 35 logged out. Waiting for processes to exit.
Oct 14 09:37:35 np0005486759.ooo.test systemd-logind[759]: Removed session 35.
Oct 14 09:37:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58512 DF PROTO=TCP SPT=35694 DPT=9100 SEQ=1420128205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BD6410000000001030307) 
Oct 14 09:37:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58513 DF PROTO=TCP SPT=35694 DPT=9100 SEQ=1420128205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BE6010000000001030307) 
Oct 14 09:37:41 np0005486759.ooo.test sshd[230015]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:37:41 np0005486759.ooo.test sshd[230015]: Accepted publickey for zuul from 192.168.122.31 port 52918 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:37:41 np0005486759.ooo.test systemd-logind[759]: New session 36 of user zuul.
Oct 14 09:37:41 np0005486759.ooo.test systemd[1]: Started Session 36 of User zuul.
Oct 14 09:37:41 np0005486759.ooo.test sshd[230015]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:37:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38161 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=1894727101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BE9880000000001030307) 
Oct 14 09:37:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38162 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=1894727101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BED810000000001030307) 
Oct 14 09:37:43 np0005486759.ooo.test python3.9[230126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:37:44 np0005486759.ooo.test sudo[230238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmspbmncdyfwjtzomuzvjqmxyyuzetdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434664.3328273-34-52282378078871/AnsiballZ_file.py
Oct 14 09:37:44 np0005486759.ooo.test sudo[230238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:44 np0005486759.ooo.test python3.9[230240]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:37:45 np0005486759.ooo.test sudo[230238]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:45 np0005486759.ooo.test sudo[230348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksqagsemygramudjrjcuyvxydrpgsbkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434665.1298969-34-3892990206887/AnsiballZ_file.py
Oct 14 09:37:45 np0005486759.ooo.test sudo[230348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:45 np0005486759.ooo.test python3.9[230350]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:37:45 np0005486759.ooo.test sudo[230348]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:46 np0005486759.ooo.test sudo[230458]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyvoissirmzdelxebuthzatqysyoyevc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434665.765273-34-137026807688866/AnsiballZ_file.py
Oct 14 09:37:46 np0005486759.ooo.test sudo[230458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:46 np0005486759.ooo.test python3.9[230460]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:37:46 np0005486759.ooo.test sudo[230458]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:46 np0005486759.ooo.test sudo[230568]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jljhwdeylkekfmtkbyuxvwxdrdxatswn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434666.5892806-34-11400859233901/AnsiballZ_file.py
Oct 14 09:37:46 np0005486759.ooo.test sudo[230568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60619 DF PROTO=TCP SPT=37538 DPT=9105 SEQ=3538549224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8BFC410000000001030307) 
Oct 14 09:37:47 np0005486759.ooo.test python3.9[230570]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 09:37:47 np0005486759.ooo.test sudo[230568]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:47 np0005486759.ooo.test sudo[230678]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiesxtgsojganpekprbdbozgpqoqzwwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434667.25116-34-10877960331298/AnsiballZ_file.py
Oct 14 09:37:47 np0005486759.ooo.test sudo[230678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:47 np0005486759.ooo.test python3.9[230680]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:37:47 np0005486759.ooo.test sudo[230678]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:48 np0005486759.ooo.test sudo[230788]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xusmkpebjjuczaxxobkoedjmqtolvxhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434668.003614-70-50842476301984/AnsiballZ_stat.py
Oct 14 09:37:48 np0005486759.ooo.test sudo[230788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:48 np0005486759.ooo.test python3.9[230790]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:37:48 np0005486759.ooo.test sudo[230788]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38164 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=1894727101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C05410000000001030307) 
Oct 14 09:37:49 np0005486759.ooo.test sudo[230900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zoxoqqzdobiojfdpmlgmlemrbpeuwmcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434668.8628833-78-259703065273589/AnsiballZ_systemd.py
Oct 14 09:37:49 np0005486759.ooo.test sudo[230900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:49 np0005486759.ooo.test python3.9[230902]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:37:49 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:49 np0005486759.ooo.test systemd-rc-local-generator[230925]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:49 np0005486759.ooo.test systemd-sysv-generator[230931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:49 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:50 np0005486759.ooo.test sudo[230900]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:50 np0005486759.ooo.test sudo[231048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfgwgltdgfnhnesrxmvnrcohzjolldbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434670.2360344-86-226263806472113/AnsiballZ_service_facts.py
Oct 14 09:37:50 np0005486759.ooo.test sudo[231048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:50 np0005486759.ooo.test python3.9[231050]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:37:50 np0005486759.ooo.test network[231067]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:37:50 np0005486759.ooo.test network[231068]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:37:50 np0005486759.ooo.test network[231069]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:37:51 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=325 DF PROTO=TCP SPT=41250 DPT=9102 SEQ=2011901535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C11C10000000001030307) 
Oct 14 09:37:53 np0005486759.ooo.test sudo[231048]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:37:54.138 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:37:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:37:54.140 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:37:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:37:54.141 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:37:55 np0005486759.ooo.test sudo[231298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oviwpdleauzgfykojoqhijssbutztrhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434674.9353316-94-96641209621765/AnsiballZ_systemd.py
Oct 14 09:37:55 np0005486759.ooo.test sudo[231298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:37:55 np0005486759.ooo.test podman[231300]: 2025-10-14 09:37:55.356889119 +0000 UTC m=+0.097082157 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:37:55 np0005486759.ooo.test podman[231300]: 2025-10-14 09:37:55.415269342 +0000 UTC m=+0.155462350 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:37:55 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:37:55 np0005486759.ooo.test python3.9[231301]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:37:55 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:37:55 np0005486759.ooo.test systemd-rc-local-generator[231350]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:37:55 np0005486759.ooo.test systemd-sysv-generator[231358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:37:55 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:37:55 np0005486759.ooo.test sudo[231298]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:56 np0005486759.ooo.test python3.9[231472]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:37:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=326 DF PROTO=TCP SPT=41250 DPT=9102 SEQ=2011901535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C21810000000001030307) 
Oct 14 09:37:58 np0005486759.ooo.test python3.9[231582]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:37:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:37:58 np0005486759.ooo.test systemd[1]: tmp-crun.GGbH4J.mount: Deactivated successfully.
Oct 14 09:37:58 np0005486759.ooo.test podman[231640]: 2025-10-14 09:37:58.422919705 +0000 UTC m=+0.051908638 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:37:58 np0005486759.ooo.test podman[231640]: 2025-10-14 09:37:58.433207814 +0000 UTC m=+0.062196737 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:37:58 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:37:58 np0005486759.ooo.test sudo[231710]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drmylruiuzapsfzucwgjviacplretsmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434678.2207723-123-49536639293737/AnsiballZ_lineinfile.py
Oct 14 09:37:58 np0005486759.ooo.test sudo[231710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:58 np0005486759.ooo.test python3.9[231712]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:37:58 np0005486759.ooo.test sudo[231710]: pam_unix(sudo:session): session closed for user root
Oct 14 09:37:59 np0005486759.ooo.test sudo[231820]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozaztmgnvgmimnbkeyfkxxchbgunaawz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434679.1925738-132-31879429057884/AnsiballZ_file.py
Oct 14 09:37:59 np0005486759.ooo.test sudo[231820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:37:59 np0005486759.ooo.test python3.9[231822]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:37:59 np0005486759.ooo.test sudo[231820]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:00 np0005486759.ooo.test sudo[231930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rygfdkovcixbwfbiomgiuevxcnuxaxvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434679.8792737-140-20939446749924/AnsiballZ_stat.py
Oct 14 09:38:00 np0005486759.ooo.test sudo[231930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:00 np0005486759.ooo.test python3.9[231932]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:00 np0005486759.ooo.test sudo[231930]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:00 np0005486759.ooo.test sudo[231987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aklsognnhuqpjjdyemfncgrlwvugwcdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434679.8792737-140-20939446749924/AnsiballZ_file.py
Oct 14 09:38:00 np0005486759.ooo.test sudo[231987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:00 np0005486759.ooo.test python3.9[231989]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:00 np0005486759.ooo.test sudo[231987]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:01 np0005486759.ooo.test sudo[232097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykvgbeabflpkfftmgvvhfzmqfgltdypi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434680.8972206-140-280845155247436/AnsiballZ_stat.py
Oct 14 09:38:01 np0005486759.ooo.test sudo[232097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:01 np0005486759.ooo.test python3.9[232099]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:01 np0005486759.ooo.test sudo[232097]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:01 np0005486759.ooo.test sudo[232154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frrrxuexfalacewfhxycxnsnxednkqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434680.8972206-140-280845155247436/AnsiballZ_file.py
Oct 14 09:38:01 np0005486759.ooo.test sudo[232154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:01 np0005486759.ooo.test python3.9[232156]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:01 np0005486759.ooo.test sudo[232154]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:02 np0005486759.ooo.test sudo[232264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxemxdithwuwdmuisljilepjwqskrgwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434681.956413-163-101255709601237/AnsiballZ_file.py
Oct 14 09:38:02 np0005486759.ooo.test sudo[232264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:02 np0005486759.ooo.test python3.9[232266]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:02 np0005486759.ooo.test sudo[232264]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:02 np0005486759.ooo.test sudo[232374]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpeeslscwuyjtbifnjtdllwdoymhqyax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434682.622991-171-109838089263198/AnsiballZ_stat.py
Oct 14 09:38:02 np0005486759.ooo.test sudo[232374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:03 np0005486759.ooo.test python3.9[232376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:03 np0005486759.ooo.test sudo[232374]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:03 np0005486759.ooo.test sudo[232431]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdidluwhqsggzaxugmslqiywtecdfpgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434682.622991-171-109838089263198/AnsiballZ_file.py
Oct 14 09:38:03 np0005486759.ooo.test sudo[232431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:03 np0005486759.ooo.test python3.9[232433]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:03 np0005486759.ooo.test sudo[232431]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:04 np0005486759.ooo.test sudo[232541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkyrwbqrmjmnsqrubxwmevbimeamwgzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434683.844978-183-238116667388010/AnsiballZ_stat.py
Oct 14 09:38:04 np0005486759.ooo.test sudo[232541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10229 DF PROTO=TCP SPT=37412 DPT=9100 SEQ=778714345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C3F670000000001030307) 
Oct 14 09:38:04 np0005486759.ooo.test python3.9[232543]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:04 np0005486759.ooo.test sudo[232541]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:04 np0005486759.ooo.test sudo[232598]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzdheaysqysmkggamvgclitgdwsojnzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434683.844978-183-238116667388010/AnsiballZ_file.py
Oct 14 09:38:04 np0005486759.ooo.test sudo[232598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:04 np0005486759.ooo.test python3.9[232600]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:04 np0005486759.ooo.test sudo[232598]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10230 DF PROTO=TCP SPT=37412 DPT=9100 SEQ=778714345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C43820000000001030307) 
Oct 14 09:38:05 np0005486759.ooo.test sudo[232708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpfxjwneahgahdnnnfvdyeqsqiwxinmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434685.0730808-195-162035450909445/AnsiballZ_systemd.py
Oct 14 09:38:05 np0005486759.ooo.test sudo[232708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:05 np0005486759.ooo.test python3.9[232710]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:38:05 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:38:05 np0005486759.ooo.test systemd-rc-local-generator[232738]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:38:05 np0005486759.ooo.test systemd-sysv-generator[232742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:38:05 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:38:06 np0005486759.ooo.test sudo[232708]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:06 np0005486759.ooo.test sudo[232856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luyippkmuwcdrjaeciaxdnnkslwjcnsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434686.1814566-203-126099147724033/AnsiballZ_stat.py
Oct 14 09:38:06 np0005486759.ooo.test sudo[232856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:06 np0005486759.ooo.test python3.9[232858]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:06 np0005486759.ooo.test sudo[232856]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:06 np0005486759.ooo.test sudo[232913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvpdrnkwictbedyiphcnluinrotrkhxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434686.1814566-203-126099147724033/AnsiballZ_file.py
Oct 14 09:38:06 np0005486759.ooo.test sudo[232913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:07 np0005486759.ooo.test python3.9[232915]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:07 np0005486759.ooo.test sudo[232913]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10231 DF PROTO=TCP SPT=37412 DPT=9100 SEQ=778714345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C4B810000000001030307) 
Oct 14 09:38:07 np0005486759.ooo.test sudo[233023]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfhwrydyekyrqumgmycnzlubiwxvwpor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434687.3445406-215-146723485201338/AnsiballZ_stat.py
Oct 14 09:38:07 np0005486759.ooo.test sudo[233023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:07 np0005486759.ooo.test python3.9[233025]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:07 np0005486759.ooo.test sudo[233023]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:08 np0005486759.ooo.test sudo[233080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdlqmlrucmflslrlpmprpsxaepxagiyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434687.3445406-215-146723485201338/AnsiballZ_file.py
Oct 14 09:38:08 np0005486759.ooo.test sudo[233080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:08 np0005486759.ooo.test python3.9[233082]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:08 np0005486759.ooo.test sudo[233080]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:09 np0005486759.ooo.test sudo[233190]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bspeaqkipibeoqeuzimqcnwohoscwgej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434689.4250987-227-116491882501994/AnsiballZ_systemd.py
Oct 14 09:38:09 np0005486759.ooo.test sudo[233190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:10 np0005486759.ooo.test python3.9[233192]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:38:10 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:38:10 np0005486759.ooo.test systemd-rc-local-generator[233219]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:38:10 np0005486759.ooo.test systemd-sysv-generator[233223]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:38:10 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:38:10 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:38:10 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:38:10 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:38:10 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:38:10 np0005486759.ooo.test sudo[233190]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10232 DF PROTO=TCP SPT=37412 DPT=9100 SEQ=778714345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C5B420000000001030307) 
Oct 14 09:38:11 np0005486759.ooo.test sudo[233342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjxnsbifecjhcgococrserchyykucymr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434690.6875212-237-206871529175742/AnsiballZ_file.py
Oct 14 09:38:11 np0005486759.ooo.test sudo[233342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:11 np0005486759.ooo.test python3.9[233344]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:11 np0005486759.ooo.test sudo[233342]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:12 np0005486759.ooo.test sudo[233452]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzazqhipyokvckucayhijvzrpjdtmdfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434692.0008953-245-123837504928622/AnsiballZ_stat.py
Oct 14 09:38:12 np0005486759.ooo.test sudo[233452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2032 DF PROTO=TCP SPT=39330 DPT=9882 SEQ=3887525038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C5EB80000000001030307) 
Oct 14 09:38:12 np0005486759.ooo.test python3.9[233454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:12 np0005486759.ooo.test sudo[233452]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:13 np0005486759.ooo.test sudo[233540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfxxgmfehzjajhgrlejgazykyvjbuzdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434692.0008953-245-123837504928622/AnsiballZ_copy.py
Oct 14 09:38:13 np0005486759.ooo.test sudo[233540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:13 np0005486759.ooo.test python3.9[233542]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434692.0008953-245-123837504928622/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:13 np0005486759.ooo.test sudo[233540]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2033 DF PROTO=TCP SPT=39330 DPT=9882 SEQ=3887525038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C62C10000000001030307) 
Oct 14 09:38:13 np0005486759.ooo.test sudo[233650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfzrftyfqvwxhtxvkqdqphhaujbrkzjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434693.6754375-262-19214569107382/AnsiballZ_file.py
Oct 14 09:38:13 np0005486759.ooo.test sudo[233650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:14 np0005486759.ooo.test python3.9[233652]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:14 np0005486759.ooo.test sudo[233650]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:14 np0005486759.ooo.test sudo[233760]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvcquidmpvoyhvynvxggahrrdgegrqyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434694.2997801-270-3636405032461/AnsiballZ_stat.py
Oct 14 09:38:14 np0005486759.ooo.test sudo[233760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:14 np0005486759.ooo.test python3.9[233762]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:14 np0005486759.ooo.test sudo[233760]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:15 np0005486759.ooo.test sudo[233850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjkqgvfzaaopzyqyrgatvcofawxessti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434694.2997801-270-3636405032461/AnsiballZ_copy.py
Oct 14 09:38:15 np0005486759.ooo.test sudo[233850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:15 np0005486759.ooo.test python3.9[233852]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434694.2997801-270-3636405032461/.source.json _original_basename=.4_iozaic follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:15 np0005486759.ooo.test sudo[233850]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:15 np0005486759.ooo.test sudo[233960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-begvmancqiameqozfevjbsldjoohbkmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434695.527549-285-73208258188302/AnsiballZ_file.py
Oct 14 09:38:15 np0005486759.ooo.test sudo[233960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:16 np0005486759.ooo.test python3.9[233962]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:16 np0005486759.ooo.test sudo[233960]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:16 np0005486759.ooo.test sudo[234070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpupjjfgouglznjoxdfcqczsmmitqwei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434696.28274-293-99348146779657/AnsiballZ_stat.py
Oct 14 09:38:16 np0005486759.ooo.test sudo[234070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:16 np0005486759.ooo.test sudo[234070]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:17 np0005486759.ooo.test sudo[234158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqqunvfvpidevutjbvneuaurglwidzld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434696.28274-293-99348146779657/AnsiballZ_copy.py
Oct 14 09:38:17 np0005486759.ooo.test sudo[234158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26021 DF PROTO=TCP SPT=34534 DPT=9105 SEQ=2444446226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C71820000000001030307) 
Oct 14 09:38:17 np0005486759.ooo.test sudo[234158]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:18 np0005486759.ooo.test sudo[234268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sablvzvausbzqwemmbfvincdplimuhpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434697.7306468-310-222113342875518/AnsiballZ_container_config_data.py
Oct 14 09:38:18 np0005486759.ooo.test sudo[234268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:18 np0005486759.ooo.test python3.9[234270]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 14 09:38:18 np0005486759.ooo.test sudo[234268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:18 np0005486759.ooo.test sudo[234378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgxtzipkuouxyohszepowxqoymkfrgoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434698.558372-319-150823056736721/AnsiballZ_container_config_hash.py
Oct 14 09:38:18 np0005486759.ooo.test sudo[234378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:19 np0005486759.ooo.test python3.9[234380]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:38:19 np0005486759.ooo.test sudo[234378]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2035 DF PROTO=TCP SPT=39330 DPT=9882 SEQ=3887525038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C7A810000000001030307) 
Oct 14 09:38:19 np0005486759.ooo.test sudo[234488]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcgfiwnlkueqyzbbkqirlivcolquxhoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434699.4418497-328-77787737844775/AnsiballZ_podman_container_info.py
Oct 14 09:38:19 np0005486759.ooo.test sudo[234488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:20 np0005486759.ooo.test python3.9[234490]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:38:20 np0005486759.ooo.test sudo[234488]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36475 DF PROTO=TCP SPT=44496 DPT=9102 SEQ=3476400005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C87010000000001030307) 
Oct 14 09:38:23 np0005486759.ooo.test sudo[234625]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjunkptterrtzagfimlfeysldhfclpvj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434702.156709-341-183469524034045/AnsiballZ_edpm_container_manage.py
Oct 14 09:38:23 np0005486759.ooo.test sudo[234625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:23 np0005486759.ooo.test python3[234627]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:38:25 np0005486759.ooo.test podman[234641]: 2025-10-14 09:38:23.580084836 +0000 UTC m=+0.068444045 image pull  quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 14 09:38:25 np0005486759.ooo.test python3[234627]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "4f44a4f5e0315c0d3dbd533e21d0927bf0518cf452942382901ff1ff9d621cbd",
                                                                 "Digest": "sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-iscsid:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-iscsid@sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-14T06:14:08.154480843Z",
                                                                 "Config": {
                                                                      "User": "root",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 403858061,
                                                                 "VirtualSize": 403858061,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec/diff:/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",
                                                                           "sha256:f640179b0564dc7abbe22bd39fc8810d5bbb8e54094fe7ebc5b3c45b658c4983",
                                                                           "sha256:f004953af60f7a99c360488169b0781a154164be09dce508bd68d57932c60f8f"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "root",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969219151Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969253522Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969285133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969308103Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969342284Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969363945Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:55.340499198Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:32.389605838Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.587912811Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.976619634Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:36.392967414Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.005863592Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.29378883Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.651733508Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.077574384Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.492629447Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.841668394Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.241713606Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.624152332Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.968354993Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.280465471Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.616162553Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.039895541Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.340755181Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.002994823Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.284637314Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.582935524Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:47.185088535Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260120756Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260167227Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260179498Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260189038Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:50.485771038Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:11:48.328117095Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:12:30.499124675Z",
                                                                           "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:12:33.437399647Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:13:23.772230749Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:06.540870685Z",
                                                                           "created_by": "/bin/sh -c dnf -y install iscsi-initiator-utils python3-rtslib targetcli socat && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:06.903436438Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/iscsid/extend_start.sh /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:07.559847274Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:08.150720607Z",
                                                                           "created_by": "/bin/sh -c rm -f /etc/iscsi/initiatorname.iscsi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:11.370653418Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-iscsid:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 14 09:38:25 np0005486759.ooo.test podman[234706]: 2025-10-14 09:38:25.354555158 +0000 UTC m=+0.075393848 container remove 6936fd94ed33ea16ee393f985e677bb65f6faf223ade6b3b9832c544adc41301 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.9, container_name=iscsid, distribution-scope=public, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-07-21T13:27:15, config_id=tripleo_step3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-iscsid/images/17.1.9-1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.33.12, io.openshift.expose-services=, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1, vcs-ref=92ba14eeb90bb45ac0dcf02b7ce60e274a5ccbb2, io.openshift.tags=rhosp osp openstack osp-17.1, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20250721.1)
Oct 14 09:38:25 np0005486759.ooo.test python3[234627]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force iscsid
Oct 14 09:38:25 np0005486759.ooo.test podman[234720]: 
Oct 14 09:38:25 np0005486759.ooo.test podman[234720]: 2025-10-14 09:38:25.450939052 +0000 UTC m=+0.079436149 container create 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 14 09:38:25 np0005486759.ooo.test podman[234720]: 2025-10-14 09:38:25.415142369 +0000 UTC m=+0.043639526 image pull  quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 14 09:38:25 np0005486759.ooo.test python3[234627]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 14 09:38:25 np0005486759.ooo.test sudo[234625]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:26 np0005486759.ooo.test sudo[234865]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdaxawqbnplmvxesxmfcripkhgjsoexh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434705.7699955-349-222166478597415/AnsiballZ_stat.py
Oct 14 09:38:26 np0005486759.ooo.test sudo[234865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:38:26 np0005486759.ooo.test podman[234867]: 2025-10-14 09:38:26.163800354 +0000 UTC m=+0.073855219 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 14 09:38:26 np0005486759.ooo.test podman[234867]: 2025-10-14 09:38:26.195309688 +0000 UTC m=+0.105364593 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 09:38:26 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:38:26 np0005486759.ooo.test python3.9[234868]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:38:26 np0005486759.ooo.test sudo[234865]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36476 DF PROTO=TCP SPT=44496 DPT=9102 SEQ=3476400005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8C96C20000000001030307) 
Oct 14 09:38:26 np0005486759.ooo.test sudo[235002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hquwlvdesyfcodaxfxdqcwmuubcpohxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434706.572608-358-192035878851457/AnsiballZ_file.py
Oct 14 09:38:26 np0005486759.ooo.test sudo[235002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:27 np0005486759.ooo.test python3.9[235004]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:27 np0005486759.ooo.test sudo[235002]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:27 np0005486759.ooo.test sudo[235057]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyvhphetbyqozefcplekzdduuwwmjhwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434706.572608-358-192035878851457/AnsiballZ_stat.py
Oct 14 09:38:27 np0005486759.ooo.test sudo[235057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:27 np0005486759.ooo.test python3.9[235059]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:38:27 np0005486759.ooo.test sudo[235057]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:27 np0005486759.ooo.test sudo[235166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjloobmxlvcecenyfauuuaobgpectubl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434707.4902842-358-113556745492864/AnsiballZ_copy.py
Oct 14 09:38:27 np0005486759.ooo.test sudo[235166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:28 np0005486759.ooo.test python3.9[235168]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434707.4902842-358-113556745492864/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:28 np0005486759.ooo.test sudo[235166]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:28 np0005486759.ooo.test sudo[235221]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvksrsujiuvsgwksecugyjstpdcgfsid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434707.4902842-358-113556745492864/AnsiballZ_systemd.py
Oct 14 09:38:28 np0005486759.ooo.test sudo[235221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:28 np0005486759.ooo.test python3.9[235223]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:38:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:38:28 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:38:28 np0005486759.ooo.test podman[235225]: 2025-10-14 09:38:28.59026896 +0000 UTC m=+0.050048613 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct 14 09:38:28 np0005486759.ooo.test systemd-sysv-generator[235267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:38:28 np0005486759.ooo.test systemd-rc-local-generator[235264]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:38:28 np0005486759.ooo.test podman[235225]: 2025-10-14 09:38:28.623322814 +0000 UTC m=+0.083102457 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 09:38:28 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:38:28 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:38:28 np0005486759.ooo.test sudo[235221]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:29 np0005486759.ooo.test sudo[235329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfkjdxrbshyiaeyoovjpfxbbeouddldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434707.4902842-358-113556745492864/AnsiballZ_systemd.py
Oct 14 09:38:29 np0005486759.ooo.test sudo[235329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:29 np0005486759.ooo.test python3.9[235331]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:38:29 np0005486759.ooo.test systemd-rc-local-generator[235353]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:38:29 np0005486759.ooo.test systemd-sysv-generator[235360]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Starting iscsid container...
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:38:29 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ddc5fbe16b14e08d4db6edd68ab8354f4e24be6dd0d2901ee351991c0d47b0/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 14 09:38:29 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ddc5fbe16b14e08d4db6edd68ab8354f4e24be6dd0d2901ee351991c0d47b0/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:38:29 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ddc5fbe16b14e08d4db6edd68ab8354f4e24be6dd0d2901ee351991c0d47b0/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:38:29 np0005486759.ooo.test podman[235373]: 2025-10-14 09:38:29.82422281 +0000 UTC m=+0.111881983 container init 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009)
Oct 14 09:38:29 np0005486759.ooo.test iscsid[235388]: + sudo -E kolla_set_configs
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:38:29 np0005486759.ooo.test sudo[235395]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 09:38:29 np0005486759.ooo.test podman[235373]: 2025-10-14 09:38:29.856651124 +0000 UTC m=+0.144310277 container start 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:38:29 np0005486759.ooo.test podman[235373]: iscsid
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Started iscsid container.
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 0.
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 0...
Oct 14 09:38:29 np0005486759.ooo.test sudo[235329]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:29 np0005486759.ooo.test systemd[235406]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:38:29 np0005486759.ooo.test podman[235394]: 2025-10-14 09:38:29.941158165 +0000 UTC m=+0.080724090 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:38:29 np0005486759.ooo.test podman[235394]: 2025-10-14 09:38:29.954322259 +0000 UTC m=+0.093888154 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:38:29 np0005486759.ooo.test podman[235394]: unhealthy
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:38:29 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Failed with result 'exit-code'.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Queued start job for default target Main User Target.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Created slice User Application Slice.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Reached target Paths.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Reached target Timers.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Starting D-Bus User Message Bus Socket...
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Starting Create User's Volatile Files and Directories...
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Listening on D-Bus User Message Bus Socket.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Reached target Sockets.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Finished Create User's Volatile Files and Directories.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Reached target Basic System.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Reached target Main User Target.
Oct 14 09:38:30 np0005486759.ooo.test systemd[235406]: Startup finished in 135ms.
Oct 14 09:38:30 np0005486759.ooo.test systemd[1]: Started User Manager for UID 0.
Oct 14 09:38:30 np0005486759.ooo.test systemd[1]: Started Session c15 of User root.
Oct 14 09:38:30 np0005486759.ooo.test sudo[235395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: INFO:__main__:Validating config file
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: INFO:__main__:Writing out command to execute
Oct 14 09:38:30 np0005486759.ooo.test sudo[235395]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:30 np0005486759.ooo.test systemd[1]: session-c15.scope: Deactivated successfully.
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: ++ cat /run_command
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + CMD='/usr/sbin/iscsid -f'
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + ARGS=
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + sudo kolla_copy_cacerts
Oct 14 09:38:30 np0005486759.ooo.test sudo[235489]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 14 09:38:30 np0005486759.ooo.test systemd[1]: Started Session c16 of User root.
Oct 14 09:38:30 np0005486759.ooo.test sudo[235489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:38:30 np0005486759.ooo.test sudo[235489]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:30 np0005486759.ooo.test systemd[1]: session-c16.scope: Deactivated successfully.
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + [[ ! -n '' ]]
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + . kolla_extend_start
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: Running command: '/usr/sbin/iscsid -f'
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + umask 0022
Oct 14 09:38:30 np0005486759.ooo.test iscsid[235388]: + exec /usr/sbin/iscsid -f
Oct 14 09:38:30 np0005486759.ooo.test python3.9[235540]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:38:31 np0005486759.ooo.test sudo[235648]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hufqpafcyjikadczbcwqwxeaswbqkecn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434710.7744994-395-48547623843392/AnsiballZ_file.py
Oct 14 09:38:31 np0005486759.ooo.test sudo[235648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:31 np0005486759.ooo.test python3.9[235650]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:31 np0005486759.ooo.test sudo[235648]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:31 np0005486759.ooo.test sudo[235758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stqdjitgctkzgulsqgttdenmhovuhwka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434711.615154-406-203903886138480/AnsiballZ_service_facts.py
Oct 14 09:38:31 np0005486759.ooo.test sudo[235758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:32 np0005486759.ooo.test python3.9[235760]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:38:32 np0005486759.ooo.test network[235777]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:38:32 np0005486759.ooo.test network[235778]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:38:32 np0005486759.ooo.test network[235779]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:38:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32527 DF PROTO=TCP SPT=47644 DPT=9100 SEQ=3198378661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CB4980000000001030307) 
Oct 14 09:38:34 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:38:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32528 DF PROTO=TCP SPT=47644 DPT=9100 SEQ=3198378661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CB8810000000001030307) 
Oct 14 09:38:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32529 DF PROTO=TCP SPT=47644 DPT=9100 SEQ=3198378661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CC0820000000001030307) 
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 0...
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Activating special unit Exit the Session...
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Stopped target Main User Target.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Stopped target Basic System.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Stopped target Paths.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Stopped target Sockets.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Stopped target Timers.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Closed D-Bus User Message Bus Socket.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Stopped Create User's Volatile Files and Directories.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Removed slice User Application Slice.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Reached target Shutdown.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Finished Exit the Session.
Oct 14 09:38:40 np0005486759.ooo.test systemd[235406]: Reached target Exit the Session.
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: user@0.service: Deactivated successfully.
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 0.
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 09:38:40 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 0.
Oct 14 09:38:40 np0005486759.ooo.test sudo[235758]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:41 np0005486759.ooo.test sudo[236011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbhyuvgefbunnavxzndxbngfugfhaysz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434721.0443053-416-75662347621198/AnsiballZ_file.py
Oct 14 09:38:41 np0005486759.ooo.test sudo[236011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32530 DF PROTO=TCP SPT=47644 DPT=9100 SEQ=3198378661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CD0410000000001030307) 
Oct 14 09:38:41 np0005486759.ooo.test python3.9[236013]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 09:38:41 np0005486759.ooo.test sudo[236011]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:42 np0005486759.ooo.test sudo[236121]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrdvlmgoycbxbytthoroxmqvtgeaxonx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434721.6579912-424-81799687750971/AnsiballZ_modprobe.py
Oct 14 09:38:42 np0005486759.ooo.test sudo[236121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:42 np0005486759.ooo.test python3.9[236123]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 14 09:38:42 np0005486759.ooo.test sudo[236121]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54750 DF PROTO=TCP SPT=38536 DPT=9882 SEQ=2340689430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CD3E80000000001030307) 
Oct 14 09:38:42 np0005486759.ooo.test sudo[236235]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwzoyvoqnbdepqymbjxmhzbvbacuvuds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434722.3979962-432-53616072420182/AnsiballZ_stat.py
Oct 14 09:38:42 np0005486759.ooo.test sudo[236235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:42 np0005486759.ooo.test python3.9[236237]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:42 np0005486759.ooo.test sudo[236235]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:43 np0005486759.ooo.test sudo[236323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyypkagakobhmnyifluqxhxxoniwixzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434722.3979962-432-53616072420182/AnsiballZ_copy.py
Oct 14 09:38:43 np0005486759.ooo.test sudo[236323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54751 DF PROTO=TCP SPT=38536 DPT=9882 SEQ=2340689430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CD8010000000001030307) 
Oct 14 09:38:43 np0005486759.ooo.test python3.9[236325]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434722.3979962-432-53616072420182/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:43 np0005486759.ooo.test sudo[236323]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:43 np0005486759.ooo.test sudo[236433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aynckvgrqjdahhdqozhpkniuhfwnzsrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434723.62545-448-3525915287352/AnsiballZ_lineinfile.py
Oct 14 09:38:43 np0005486759.ooo.test sudo[236433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:44 np0005486759.ooo.test python3.9[236435]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:44 np0005486759.ooo.test sudo[236433]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:44 np0005486759.ooo.test sudo[236543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ginxhlsjofxilpwomfppxipngfwgdoyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434724.2659357-456-137921603995349/AnsiballZ_systemd.py
Oct 14 09:38:44 np0005486759.ooo.test sudo[236543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:44 np0005486759.ooo.test python3.9[236545]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:38:44 np0005486759.ooo.test systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 14 09:38:44 np0005486759.ooo.test systemd[1]: Stopped Load Kernel Modules.
Oct 14 09:38:44 np0005486759.ooo.test systemd[1]: Stopping Load Kernel Modules...
Oct 14 09:38:44 np0005486759.ooo.test systemd[1]: Starting Load Kernel Modules...
Oct 14 09:38:44 np0005486759.ooo.test systemd-modules-load[236549]: Module 'msr' is built in
Oct 14 09:38:44 np0005486759.ooo.test systemd[1]: Finished Load Kernel Modules.
Oct 14 09:38:44 np0005486759.ooo.test sudo[236543]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:45 np0005486759.ooo.test sudo[236657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrtjaoatxuzgdxtdwjvifvcacjruppuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434725.1233823-464-197248566565236/AnsiballZ_file.py
Oct 14 09:38:45 np0005486759.ooo.test sudo[236657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:45 np0005486759.ooo.test python3.9[236659]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:45 np0005486759.ooo.test sudo[236657]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:45 np0005486759.ooo.test sudo[236767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvyagzmzswhgpefaavkastrcnmjhvpkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434725.777579-473-114491386164087/AnsiballZ_stat.py
Oct 14 09:38:45 np0005486759.ooo.test sudo[236767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:46 np0005486759.ooo.test python3.9[236769]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:38:46 np0005486759.ooo.test sudo[236767]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:46 np0005486759.ooo.test sudo[236877]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nixvirdxnsndiyfeqalzyvzkploutqdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434726.3007681-482-179442642778276/AnsiballZ_stat.py
Oct 14 09:38:46 np0005486759.ooo.test sudo[236877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:46 np0005486759.ooo.test python3.9[236879]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:38:46 np0005486759.ooo.test sudo[236877]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:47 np0005486759.ooo.test sudo[236987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgezjjefgeyvctxzezpelikatpwawvxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434726.8239408-490-160493956184477/AnsiballZ_stat.py
Oct 14 09:38:47 np0005486759.ooo.test sudo[236987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63141 DF PROTO=TCP SPT=39708 DPT=9105 SEQ=2877358026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CE6C10000000001030307) 
Oct 14 09:38:47 np0005486759.ooo.test python3.9[236989]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:47 np0005486759.ooo.test sudo[236987]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:47 np0005486759.ooo.test sudo[237075]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtcfnvnmobpigajheblwmorzmyrwalpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434726.8239408-490-160493956184477/AnsiballZ_copy.py
Oct 14 09:38:47 np0005486759.ooo.test sudo[237075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:47 np0005486759.ooo.test python3.9[237077]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434726.8239408-490-160493956184477/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:47 np0005486759.ooo.test sudo[237075]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:48 np0005486759.ooo.test sudo[237185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhdfmbmwhqkyolazmecgruqutacxzuav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434727.898407-505-161713419788230/AnsiballZ_command.py
Oct 14 09:38:48 np0005486759.ooo.test sudo[237185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:48 np0005486759.ooo.test python3.9[237187]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:38:48 np0005486759.ooo.test sudo[237185]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:48 np0005486759.ooo.test sudo[237296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gekxwqcezfoxfuatbgntpoekllhugprw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434728.6240969-513-241112291245916/AnsiballZ_lineinfile.py
Oct 14 09:38:48 np0005486759.ooo.test sudo[237296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:49 np0005486759.ooo.test python3.9[237298]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:49 np0005486759.ooo.test sudo[237296]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54753 DF PROTO=TCP SPT=38536 DPT=9882 SEQ=2340689430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CEFC10000000001030307) 
Oct 14 09:38:49 np0005486759.ooo.test sudo[237406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayzdnzzlehbpbtujbtpmoenpveamgqbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434729.2027078-521-42995218071038/AnsiballZ_replace.py
Oct 14 09:38:49 np0005486759.ooo.test sudo[237406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:49 np0005486759.ooo.test python3.9[237408]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:49 np0005486759.ooo.test sudo[237406]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:50 np0005486759.ooo.test sudo[237516]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxvwvztwdmsmegsqzwvrzeahzklbgntt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434729.9931288-529-109291354963017/AnsiballZ_replace.py
Oct 14 09:38:50 np0005486759.ooo.test sudo[237516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:50 np0005486759.ooo.test python3.9[237518]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:50 np0005486759.ooo.test sudo[237516]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:50 np0005486759.ooo.test sudo[237626]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snirupwbtextzprypmawenxbgwlctake ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434730.6860118-538-6678907644001/AnsiballZ_lineinfile.py
Oct 14 09:38:50 np0005486759.ooo.test sudo[237626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:51 np0005486759.ooo.test python3.9[237628]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:51 np0005486759.ooo.test sudo[237626]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:51 np0005486759.ooo.test sudo[237736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tffccrlkcagrauffdhitobndvhfdkzvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434731.2560828-538-225429180555790/AnsiballZ_lineinfile.py
Oct 14 09:38:51 np0005486759.ooo.test sudo[237736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:51 np0005486759.ooo.test python3.9[237738]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:51 np0005486759.ooo.test sudo[237736]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:52 np0005486759.ooo.test sudo[237846]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chjyjaakpevcpgvqpzhqpyoujmcnvajt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434731.8115294-538-131862476429718/AnsiballZ_lineinfile.py
Oct 14 09:38:52 np0005486759.ooo.test sudo[237846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:52 np0005486759.ooo.test python3.9[237848]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:52 np0005486759.ooo.test sudo[237846]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19486 DF PROTO=TCP SPT=57834 DPT=9102 SEQ=2445971916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8CFC020000000001030307) 
Oct 14 09:38:52 np0005486759.ooo.test sudo[237956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roxfmbvruicxzorslryfscostoaaegkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434732.3681378-538-159159453110357/AnsiballZ_lineinfile.py
Oct 14 09:38:52 np0005486759.ooo.test sudo[237956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:52 np0005486759.ooo.test python3.9[237958]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:52 np0005486759.ooo.test sudo[237956]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:53 np0005486759.ooo.test sudo[238066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpuatpxxczqehjehehdopagyeognyhlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434733.0458574-567-235107896174560/AnsiballZ_stat.py
Oct 14 09:38:53 np0005486759.ooo.test sudo[238066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:53 np0005486759.ooo.test python3.9[238068]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:38:53 np0005486759.ooo.test sudo[238066]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:54 np0005486759.ooo.test sudo[238178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdxyxjnzuwcnkshpbxqupyloxmgjhjxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434733.7625587-575-170208711843633/AnsiballZ_file.py
Oct 14 09:38:54 np0005486759.ooo.test sudo[238178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:38:54.140 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:38:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:38:54.141 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:38:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:38:54.142 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:38:54 np0005486759.ooo.test python3.9[238180]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:54 np0005486759.ooo.test sudo[238178]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:54 np0005486759.ooo.test sudo[238288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxqcdoluxoechagtatwrfpgdshdzdvwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434734.5148225-584-134361154761720/AnsiballZ_file.py
Oct 14 09:38:54 np0005486759.ooo.test sudo[238288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:55 np0005486759.ooo.test python3.9[238290]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:55 np0005486759.ooo.test sudo[238288]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:55 np0005486759.ooo.test sudo[238398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqckcmcgxnkijuwbhdnqmrtlszanjtph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434735.2385373-592-34594832647149/AnsiballZ_stat.py
Oct 14 09:38:55 np0005486759.ooo.test sudo[238398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:55 np0005486759.ooo.test python3.9[238400]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:55 np0005486759.ooo.test sudo[238398]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:56 np0005486759.ooo.test sudo[238455]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtqdstljaiiudypowbbvyavygtnlpdmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434735.2385373-592-34594832647149/AnsiballZ_file.py
Oct 14 09:38:56 np0005486759.ooo.test sudo[238455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:56 np0005486759.ooo.test python3.9[238457]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:56 np0005486759.ooo.test sudo[238455]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:38:56 np0005486759.ooo.test podman[238476]: 2025-10-14 09:38:56.447031223 +0000 UTC m=+0.072041341 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Oct 14 09:38:56 np0005486759.ooo.test podman[238476]: 2025-10-14 09:38:56.522448282 +0000 UTC m=+0.147458429 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 09:38:56 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:38:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19487 DF PROTO=TCP SPT=57834 DPT=9102 SEQ=2445971916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D0BC10000000001030307) 
Oct 14 09:38:56 np0005486759.ooo.test sudo[238589]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njislwethpalujhrjpyxfeowbvuybezw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434736.3921008-592-77503157213243/AnsiballZ_stat.py
Oct 14 09:38:56 np0005486759.ooo.test sudo[238589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:56 np0005486759.ooo.test python3.9[238591]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:56 np0005486759.ooo.test sudo[238589]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:57 np0005486759.ooo.test sudo[238646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eatgugmzxropqeggbcrfrkagqbbcuiob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434736.3921008-592-77503157213243/AnsiballZ_file.py
Oct 14 09:38:57 np0005486759.ooo.test sudo[238646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:57 np0005486759.ooo.test python3.9[238648]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:38:57 np0005486759.ooo.test sudo[238646]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:57 np0005486759.ooo.test sudo[238756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aavcylkpaeaawwykssagomqdvksjbmdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434737.5927367-615-56530964575946/AnsiballZ_file.py
Oct 14 09:38:57 np0005486759.ooo.test sudo[238756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:58 np0005486759.ooo.test python3.9[238758]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:58 np0005486759.ooo.test sudo[238756]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:58 np0005486759.ooo.test sudo[238866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuhwfnmcigarryevuuhxdumsdwybrpxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434738.2430577-623-49505182682876/AnsiballZ_stat.py
Oct 14 09:38:58 np0005486759.ooo.test sudo[238866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:58 np0005486759.ooo.test python3.9[238868]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:58 np0005486759.ooo.test sudo[238866]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:58 np0005486759.ooo.test sudo[238923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwxgejhjizifzfgczwskpidxpmqanffy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434738.2430577-623-49505182682876/AnsiballZ_file.py
Oct 14 09:38:58 np0005486759.ooo.test sudo[238923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:38:59 np0005486759.ooo.test podman[238925]: 2025-10-14 09:38:59.056843693 +0000 UTC m=+0.084837562 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 14 09:38:59 np0005486759.ooo.test podman[238925]: 2025-10-14 09:38:59.090447965 +0000 UTC m=+0.118441844 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:38:59 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:38:59 np0005486759.ooo.test python3.9[238926]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:38:59 np0005486759.ooo.test sudo[238923]: pam_unix(sudo:session): session closed for user root
Oct 14 09:38:59 np0005486759.ooo.test sudo[239049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xufsjvchnuimyamhryngtiaupfhmbbvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434739.3211954-635-278144806878363/AnsiballZ_stat.py
Oct 14 09:38:59 np0005486759.ooo.test sudo[239049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:38:59 np0005486759.ooo.test python3.9[239051]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:38:59 np0005486759.ooo.test sudo[239049]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:00 np0005486759.ooo.test sudo[239106]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjyrenytmvwbaatnupwtbmzimodvunvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434739.3211954-635-278144806878363/AnsiballZ_file.py
Oct 14 09:39:00 np0005486759.ooo.test sudo[239106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:39:00 np0005486759.ooo.test systemd[1]: tmp-crun.XijKQp.mount: Deactivated successfully.
Oct 14 09:39:00 np0005486759.ooo.test podman[239109]: 2025-10-14 09:39:00.12361844 +0000 UTC m=+0.068581819 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 14 09:39:00 np0005486759.ooo.test podman[239109]: 2025-10-14 09:39:00.133031023 +0000 UTC m=+0.077994362 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Oct 14 09:39:00 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:39:00 np0005486759.ooo.test python3.9[239108]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:00 np0005486759.ooo.test sudo[239106]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:00 np0005486759.ooo.test sudo[239234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymxlflvnwtlbrhwupekhjeyezfjwaoue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434740.4084964-647-155442061915349/AnsiballZ_systemd.py
Oct 14 09:39:00 np0005486759.ooo.test sudo[239234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:01 np0005486759.ooo.test python3.9[239236]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:01 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:01 np0005486759.ooo.test systemd-rc-local-generator[239257]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:01 np0005486759.ooo.test systemd-sysv-generator[239261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:01 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:01 np0005486759.ooo.test sudo[239234]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:01 np0005486759.ooo.test sudo[239382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxxzikckeauexlvsxtakxafkqqbgetoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434741.5811303-655-66206814807206/AnsiballZ_stat.py
Oct 14 09:39:01 np0005486759.ooo.test sudo[239382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:02 np0005486759.ooo.test python3.9[239384]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:39:02 np0005486759.ooo.test sudo[239382]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:02 np0005486759.ooo.test sudo[239439]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juvpibsjvbglbiwunmdapbvtvyhiuvkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434741.5811303-655-66206814807206/AnsiballZ_file.py
Oct 14 09:39:02 np0005486759.ooo.test sudo[239439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:02 np0005486759.ooo.test python3.9[239441]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:02 np0005486759.ooo.test sudo[239439]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:03 np0005486759.ooo.test sudo[239549]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjusmlmiwppouvvhdlwajohianfgcafr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434742.7183638-667-46001725455755/AnsiballZ_stat.py
Oct 14 09:39:03 np0005486759.ooo.test sudo[239549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:03 np0005486759.ooo.test python3.9[239551]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:39:03 np0005486759.ooo.test sudo[239549]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:03 np0005486759.ooo.test sudo[239606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvxgxcznsihggoajvbhukfvheifxzebi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434742.7183638-667-46001725455755/AnsiballZ_file.py
Oct 14 09:39:03 np0005486759.ooo.test sudo[239606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:03 np0005486759.ooo.test python3.9[239608]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:03 np0005486759.ooo.test sudo[239606]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:04 np0005486759.ooo.test systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 14 09:39:04 np0005486759.ooo.test sudo[239717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvdxgtgdxthsniwdqwywqcqvufgcsznm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434743.8044345-679-52410467933683/AnsiballZ_systemd.py
Oct 14 09:39:04 np0005486759.ooo.test sudo[239717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17011 DF PROTO=TCP SPT=35842 DPT=9100 SEQ=1098994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D29C80000000001030307) 
Oct 14 09:39:04 np0005486759.ooo.test python3.9[239719]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:04 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:04 np0005486759.ooo.test systemd-rc-local-generator[239742]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:04 np0005486759.ooo.test systemd-sysv-generator[239745]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:04 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:05 np0005486759.ooo.test systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 14 09:39:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17012 DF PROTO=TCP SPT=35842 DPT=9100 SEQ=1098994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D2DC10000000001030307) 
Oct 14 09:39:05 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:39:05 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:39:05 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:39:05 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:39:05 np0005486759.ooo.test sudo[239717]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:06 np0005486759.ooo.test sudo[239869]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzrwfhzplnrilpuxutxyukeevpqacghf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434746.0765514-689-222237717779758/AnsiballZ_file.py
Oct 14 09:39:06 np0005486759.ooo.test sudo[239869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:06 np0005486759.ooo.test python3.9[239871]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:39:06 np0005486759.ooo.test sudo[239869]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:06 np0005486759.ooo.test sudo[239979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsupkqtqpbgcsgjopqjfzulpvhzvdzhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434746.7011747-697-278788064814466/AnsiballZ_stat.py
Oct 14 09:39:06 np0005486759.ooo.test sudo[239979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:07 np0005486759.ooo.test python3.9[239981]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:39:07 np0005486759.ooo.test sudo[239979]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17013 DF PROTO=TCP SPT=35842 DPT=9100 SEQ=1098994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D35C10000000001030307) 
Oct 14 09:39:07 np0005486759.ooo.test systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 14 09:39:07 np0005486759.ooo.test sudo[240068]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdrjppyhizpbszleqbreyxbqynarcyqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434746.7011747-697-278788064814466/AnsiballZ_copy.py
Oct 14 09:39:07 np0005486759.ooo.test sudo[240068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:07 np0005486759.ooo.test python3.9[240070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434746.7011747-697-278788064814466/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:39:07 np0005486759.ooo.test sudo[240068]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:08 np0005486759.ooo.test sudo[240178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pejykdpljcjsdfgdzabuabrkyxqxjvxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434748.303404-714-248046684605462/AnsiballZ_file.py
Oct 14 09:39:08 np0005486759.ooo.test sudo[240178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:08 np0005486759.ooo.test python3.9[240180]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:39:08 np0005486759.ooo.test sudo[240178]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:09 np0005486759.ooo.test sudo[240288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gydmkggamllkfhgtajrvjlqfgnikvksg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434749.7128103-722-21632299940801/AnsiballZ_stat.py
Oct 14 09:39:09 np0005486759.ooo.test sudo[240288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:10 np0005486759.ooo.test python3.9[240290]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:39:10 np0005486759.ooo.test sudo[240288]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:10 np0005486759.ooo.test sudo[240376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucwbuwtdwarhgdselrlgfjpwcutmqnsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434749.7128103-722-21632299940801/AnsiballZ_copy.py
Oct 14 09:39:10 np0005486759.ooo.test sudo[240376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:10 np0005486759.ooo.test python3.9[240378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434749.7128103-722-21632299940801/.source.json _original_basename=.0db1_hke follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:10 np0005486759.ooo.test sudo[240376]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:11 np0005486759.ooo.test sudo[240486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyjcygwoyrygamfnafpkuitgcuddsozr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434750.9031808-737-161404921802150/AnsiballZ_file.py
Oct 14 09:39:11 np0005486759.ooo.test sudo[240486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:11 np0005486759.ooo.test python3.9[240488]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:11 np0005486759.ooo.test sudo[240486]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17014 DF PROTO=TCP SPT=35842 DPT=9100 SEQ=1098994716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D45820000000001030307) 
Oct 14 09:39:11 np0005486759.ooo.test sudo[240596]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tptaxtnbmozbvlnufbgaoneybiuxpfvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434751.581088-745-208899380917132/AnsiballZ_stat.py
Oct 14 09:39:11 np0005486759.ooo.test sudo[240596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:12 np0005486759.ooo.test sudo[240596]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43393 DF PROTO=TCP SPT=37996 DPT=9882 SEQ=1127823235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D49180000000001030307) 
Oct 14 09:39:12 np0005486759.ooo.test sudo[240684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgwtjnhrhgzvvgfejcxlxrngsfficfes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434751.581088-745-208899380917132/AnsiballZ_copy.py
Oct 14 09:39:12 np0005486759.ooo.test sudo[240684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:12 np0005486759.ooo.test sudo[240684]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:13 np0005486759.ooo.test sudo[240794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfezkkvegzijspmvhfojkipqwwtdnamo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434752.864326-762-112370332931092/AnsiballZ_container_config_data.py
Oct 14 09:39:13 np0005486759.ooo.test sudo[240794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43394 DF PROTO=TCP SPT=37996 DPT=9882 SEQ=1127823235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D4D010000000001030307) 
Oct 14 09:39:13 np0005486759.ooo.test python3.9[240796]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 14 09:39:13 np0005486759.ooo.test sudo[240794]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:13 np0005486759.ooo.test sudo[240904]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twetocuiezsafojlcexmsaanfpodvntt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434753.5886347-771-257518733799304/AnsiballZ_container_config_hash.py
Oct 14 09:39:13 np0005486759.ooo.test sudo[240904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:14 np0005486759.ooo.test python3.9[240906]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:39:14 np0005486759.ooo.test sudo[240904]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:14 np0005486759.ooo.test sudo[241014]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grjrvevwvrackirvqwvhltqntxtsbcdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434754.3414707-780-125673389184305/AnsiballZ_podman_container_info.py
Oct 14 09:39:14 np0005486759.ooo.test sudo[241014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:14 np0005486759.ooo.test python3.9[241016]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:39:15 np0005486759.ooo.test sudo[241014]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:16 np0005486759.ooo.test sudo[241150]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxnpxjvwqczfovatyjpcuoqsigvahzsi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434756.5021276-793-248383427130991/AnsiballZ_edpm_container_manage.py
Oct 14 09:39:16 np0005486759.ooo.test sudo[241150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:17 np0005486759.ooo.test python3[241152]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:39:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40790 DF PROTO=TCP SPT=50296 DPT=9105 SEQ=1072672288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D5BC10000000001030307) 
Oct 14 09:39:18 np0005486759.ooo.test podman[241165]: 2025-10-14 09:39:17.184763573 +0000 UTC m=+0.047164300 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct 14 09:39:18 np0005486759.ooo.test podman[241213]: 
Oct 14 09:39:18 np0005486759.ooo.test podman[241213]: 2025-10-14 09:39:18.940344438 +0000 UTC m=+0.051227481 container create b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:39:18 np0005486759.ooo.test podman[241213]: 2025-10-14 09:39:18.916591663 +0000 UTC m=+0.027474726 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct 14 09:39:18 np0005486759.ooo.test python3[241152]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct 14 09:39:19 np0005486759.ooo.test sudo[241150]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43396 DF PROTO=TCP SPT=37996 DPT=9882 SEQ=1127823235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D64C10000000001030307) 
Oct 14 09:39:19 np0005486759.ooo.test sudo[241357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfublwpckaaknizbezwsdjkttwwbuclm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434759.2913659-801-274685040591456/AnsiballZ_stat.py
Oct 14 09:39:19 np0005486759.ooo.test sudo[241357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:19 np0005486759.ooo.test python3.9[241359]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:39:19 np0005486759.ooo.test sudo[241357]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:20 np0005486759.ooo.test sudo[241469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqgdicefvttejrxtswrhtytoiqkstrdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434760.653726-810-17585179978255/AnsiballZ_file.py
Oct 14 09:39:20 np0005486759.ooo.test sudo[241469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:21 np0005486759.ooo.test python3.9[241471]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:21 np0005486759.ooo.test sudo[241469]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:21 np0005486759.ooo.test sudo[241524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqpqvuargxuvmtxmpbkzxakzcsujnzae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434760.653726-810-17585179978255/AnsiballZ_stat.py
Oct 14 09:39:21 np0005486759.ooo.test sudo[241524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:21 np0005486759.ooo.test python3.9[241526]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:39:21 np0005486759.ooo.test sudo[241524]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:22 np0005486759.ooo.test sudo[241633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkpjphmzwlbtblknjzwbsjoiqserpjjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434761.6284595-810-219582630885342/AnsiballZ_copy.py
Oct 14 09:39:22 np0005486759.ooo.test sudo[241633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:22 np0005486759.ooo.test python3.9[241635]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434761.6284595-810-219582630885342/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:22 np0005486759.ooo.test sudo[241633]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25799 DF PROTO=TCP SPT=59192 DPT=9102 SEQ=1018630753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D71410000000001030307) 
Oct 14 09:39:23 np0005486759.ooo.test sudo[241688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxfbtjudwitoyjlzeixcqojancsybizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434761.6284595-810-219582630885342/AnsiballZ_systemd.py
Oct 14 09:39:23 np0005486759.ooo.test sudo[241688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:23 np0005486759.ooo.test python3.9[241690]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:39:23 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:23 np0005486759.ooo.test systemd-rc-local-generator[241713]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:23 np0005486759.ooo.test systemd-sysv-generator[241717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:23 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:23 np0005486759.ooo.test sudo[241688]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:24 np0005486759.ooo.test sudo[241779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hogsusifxkvasaugikyfrjrkqxbkzvta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434761.6284595-810-219582630885342/AnsiballZ_systemd.py
Oct 14 09:39:24 np0005486759.ooo.test sudo[241779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:24 np0005486759.ooo.test python3.9[241781]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:25 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:25 np0005486759.ooo.test systemd-rc-local-generator[241805]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:25 np0005486759.ooo.test systemd-sysv-generator[241811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:25 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:25 np0005486759.ooo.test systemd[1]: Starting multipathd container...
Oct 14 09:39:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:39:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67a96d5e3f59cf32cb356dfc6c3c3d9c3d6cd68ab4693bba1dcca66753f9e5fb/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 14 09:39:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67a96d5e3f59cf32cb356dfc6c3c3d9c3d6cd68ab4693bba1dcca66753f9e5fb/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:39:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:39:26 np0005486759.ooo.test podman[241821]: 2025-10-14 09:39:26.020305192 +0000 UTC m=+0.139793704 container init b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + sudo -E kolla_set_configs
Oct 14 09:39:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:39:26 np0005486759.ooo.test sudo[241842]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 09:39:26 np0005486759.ooo.test sudo[241842]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:39:26 np0005486759.ooo.test sudo[241842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:39:26 np0005486759.ooo.test podman[241821]: 2025-10-14 09:39:26.05707149 +0000 UTC m=+0.176559922 container start b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true)
Oct 14 09:39:26 np0005486759.ooo.test podman[241821]: multipathd
Oct 14 09:39:26 np0005486759.ooo.test systemd[1]: Started multipathd container.
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: INFO:__main__:Validating config file
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: INFO:__main__:Writing out command to execute
Oct 14 09:39:26 np0005486759.ooo.test sudo[241779]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:26 np0005486759.ooo.test sudo[241842]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: ++ cat /run_command
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + CMD='/usr/sbin/multipathd -d'
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + ARGS=
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + sudo kolla_copy_cacerts
Oct 14 09:39:26 np0005486759.ooo.test sudo[241859]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 14 09:39:26 np0005486759.ooo.test sudo[241859]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:39:26 np0005486759.ooo.test sudo[241859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:39:26 np0005486759.ooo.test podman[241843]: 2025-10-14 09:39:26.137909443 +0000 UTC m=+0.081678601 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 09:39:26 np0005486759.ooo.test sudo[241859]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + [[ ! -n '' ]]
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + . kolla_extend_start
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: Running command: '/usr/sbin/multipathd -d'
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + umask 0022
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: + exec /usr/sbin/multipathd -d
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: 10786.348998 | --------start up--------
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: 10786.349017 | read /etc/multipath.conf
Oct 14 09:39:26 np0005486759.ooo.test multipathd[241836]: 10786.352722 | path checkers start up
Oct 14 09:39:26 np0005486759.ooo.test podman[241843]: 2025-10-14 09:39:26.183036492 +0000 UTC m=+0.126805590 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:39:26 np0005486759.ooo.test podman[241843]: unhealthy
Oct 14 09:39:26 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:39:26 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Failed with result 'exit-code'.
Oct 14 09:39:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25800 DF PROTO=TCP SPT=59192 DPT=9102 SEQ=1018630753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D81010000000001030307) 
Oct 14 09:39:26 np0005486759.ooo.test python3.9[241982]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:39:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:39:26 np0005486759.ooo.test podman[241991]: 2025-10-14 09:39:26.945853067 +0000 UTC m=+0.061638888 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:39:26 np0005486759.ooo.test podman[241991]: 2025-10-14 09:39:26.983230175 +0000 UTC m=+0.099015956 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 09:39:26 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:39:27 np0005486759.ooo.test sudo[242118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swnycqljyzjzhoigbhqjxmahfscmohyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434767.012934-846-101366863931478/AnsiballZ_command.py
Oct 14 09:39:27 np0005486759.ooo.test sudo[242118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:27 np0005486759.ooo.test python3.9[242120]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:39:27 np0005486759.ooo.test sudo[242118]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:28 np0005486759.ooo.test sudo[242241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twhnxdujzbjziundozspjiovhpashswl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434767.8122084-854-166919048028113/AnsiballZ_systemd.py
Oct 14 09:39:28 np0005486759.ooo.test sudo[242241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:28 np0005486759.ooo.test python3.9[242243]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Stopping multipathd container...
Oct 14 09:39:28 np0005486759.ooo.test multipathd[241836]: 10788.763940 | exit (signal)
Oct 14 09:39:28 np0005486759.ooo.test multipathd[241836]: 10788.764962 | --------shut down-------
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: libpod-b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.scope: Deactivated successfully.
Oct 14 09:39:28 np0005486759.ooo.test podman[242247]: 2025-10-14 09:39:28.601100779 +0000 UTC m=+0.100718831 container died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.timer: Deactivated successfully.
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319-userdata-shm.mount: Deactivated successfully.
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-67a96d5e3f59cf32cb356dfc6c3c3d9c3d6cd68ab4693bba1dcca66753f9e5fb-merged.mount: Deactivated successfully.
Oct 14 09:39:28 np0005486759.ooo.test podman[242247]: 2025-10-14 09:39:28.723109484 +0000 UTC m=+0.222727516 container cleanup b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:39:28 np0005486759.ooo.test podman[242247]: multipathd
Oct 14 09:39:28 np0005486759.ooo.test podman[242276]: 2025-10-14 09:39:28.819586835 +0000 UTC m=+0.069920058 container cleanup b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:39:28 np0005486759.ooo.test podman[242276]: multipathd
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Stopped multipathd container.
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Starting multipathd container...
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:39:28 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67a96d5e3f59cf32cb356dfc6c3c3d9c3d6cd68ab4693bba1dcca66753f9e5fb/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 14 09:39:28 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67a96d5e3f59cf32cb356dfc6c3c3d9c3d6cd68ab4693bba1dcca66753f9e5fb/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:39:28 np0005486759.ooo.test podman[242288]: 2025-10-14 09:39:28.949179866 +0000 UTC m=+0.108116972 container init b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 09:39:28 np0005486759.ooo.test multipathd[242304]: + sudo -E kolla_set_configs
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:39:28 np0005486759.ooo.test sudo[242310]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 09:39:28 np0005486759.ooo.test podman[242288]: 2025-10-14 09:39:28.992133406 +0000 UTC m=+0.151070512 container start b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible)
Oct 14 09:39:28 np0005486759.ooo.test podman[242288]: multipathd
Oct 14 09:39:28 np0005486759.ooo.test sudo[242310]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:39:28 np0005486759.ooo.test sudo[242310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:39:28 np0005486759.ooo.test systemd[1]: Started multipathd container.
Oct 14 09:39:29 np0005486759.ooo.test sudo[242241]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: INFO:__main__:Validating config file
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: INFO:__main__:Writing out command to execute
Oct 14 09:39:29 np0005486759.ooo.test sudo[242310]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: ++ cat /run_command
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + CMD='/usr/sbin/multipathd -d'
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + ARGS=
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + sudo kolla_copy_cacerts
Oct 14 09:39:29 np0005486759.ooo.test podman[242311]: 2025-10-14 09:39:29.075582263 +0000 UTC m=+0.077361920 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 09:39:29 np0005486759.ooo.test sudo[242332]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 14 09:39:29 np0005486759.ooo.test sudo[242332]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:39:29 np0005486759.ooo.test podman[242311]: 2025-10-14 09:39:29.081512727 +0000 UTC m=+0.083292414 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 09:39:29 np0005486759.ooo.test sudo[242332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:39:29 np0005486759.ooo.test sudo[242332]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + [[ ! -n '' ]]
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + . kolla_extend_start
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: Running command: '/usr/sbin/multipathd -d'
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + umask 0022
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: + exec /usr/sbin/multipathd -d
Oct 14 09:39:29 np0005486759.ooo.test podman[242311]: unhealthy
Oct 14 09:39:29 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:39:29 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Failed with result 'exit-code'.
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: 10789.291817 | --------start up--------
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: 10789.291834 | read /etc/multipath.conf
Oct 14 09:39:29 np0005486759.ooo.test multipathd[242304]: 10789.295406 | path checkers start up
Oct 14 09:39:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:39:29 np0005486759.ooo.test podman[242413]: 2025-10-14 09:39:29.45709677 +0000 UTC m=+0.082896052 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 14 09:39:29 np0005486759.ooo.test podman[242413]: 2025-10-14 09:39:29.491394507 +0000 UTC m=+0.117193749 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 14 09:39:29 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:39:29 np0005486759.ooo.test sudo[242468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efkmykbnsokycocutmydmwhtwhlvcldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434769.22247-862-238498302150215/AnsiballZ_file.py
Oct 14 09:39:29 np0005486759.ooo.test sudo[242468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:29 np0005486759.ooo.test python3.9[242470]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:29 np0005486759.ooo.test sudo[242468]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:39:30 np0005486759.ooo.test podman[242542]: 2025-10-14 09:39:30.451348423 +0000 UTC m=+0.078878581 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:39:30 np0005486759.ooo.test podman[242542]: 2025-10-14 09:39:30.465313878 +0000 UTC m=+0.092844006 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:39:30 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:39:30 np0005486759.ooo.test sudo[242595]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abnmdpfjmipogxvnhbqtyznzfophlyrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434770.1815162-874-19572970604090/AnsiballZ_file.py
Oct 14 09:39:30 np0005486759.ooo.test sudo[242595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:30 np0005486759.ooo.test python3.9[242600]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 09:39:30 np0005486759.ooo.test sudo[242595]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:31 np0005486759.ooo.test sudo[242708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmjujdlavhkqojfcvliliyifnsqishfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434770.9055254-882-175537169025022/AnsiballZ_modprobe.py
Oct 14 09:39:31 np0005486759.ooo.test sudo[242708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:31 np0005486759.ooo.test python3.9[242710]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 14 09:39:31 np0005486759.ooo.test sudo[242708]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:31 np0005486759.ooo.test sudo[242826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgthnqjydlopxlshiaeznfhkvizmzgzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434771.6548548-890-279544906323544/AnsiballZ_stat.py
Oct 14 09:39:31 np0005486759.ooo.test sudo[242826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:32 np0005486759.ooo.test python3.9[242828]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:39:32 np0005486759.ooo.test sudo[242826]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:32 np0005486759.ooo.test sudo[242914]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhpyxwsvdjfcarzpelcocrxdglbfictv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434771.6548548-890-279544906323544/AnsiballZ_copy.py
Oct 14 09:39:32 np0005486759.ooo.test sudo[242914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:32 np0005486759.ooo.test python3.9[242916]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434771.6548548-890-279544906323544/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:32 np0005486759.ooo.test sudo[242914]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:33 np0005486759.ooo.test sudo[243024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvlkxtkxeyjveoqqbxdfopgudwdmvmdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434772.9175684-906-74355498413380/AnsiballZ_lineinfile.py
Oct 14 09:39:33 np0005486759.ooo.test sudo[243024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:33 np0005486759.ooo.test python3.9[243026]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:33 np0005486759.ooo.test sudo[243024]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:33 np0005486759.ooo.test sudo[243134]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohiyaeyhpewkfbguzhovccnwelbmiwht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434773.5588639-914-119564189608585/AnsiballZ_systemd.py
Oct 14 09:39:33 np0005486759.ooo.test sudo[243134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:34 np0005486759.ooo.test python3.9[243136]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:39:34 np0005486759.ooo.test systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 14 09:39:34 np0005486759.ooo.test systemd[1]: Stopped Load Kernel Modules.
Oct 14 09:39:34 np0005486759.ooo.test systemd[1]: Stopping Load Kernel Modules...
Oct 14 09:39:34 np0005486759.ooo.test systemd[1]: Starting Load Kernel Modules...
Oct 14 09:39:34 np0005486759.ooo.test systemd-modules-load[243140]: Module 'msr' is built in
Oct 14 09:39:34 np0005486759.ooo.test systemd[1]: Finished Load Kernel Modules.
Oct 14 09:39:34 np0005486759.ooo.test sudo[243134]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32294 DF PROTO=TCP SPT=59364 DPT=9100 SEQ=311210807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8D9EF70000000001030307) 
Oct 14 09:39:34 np0005486759.ooo.test sudo[243248]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blzlychympkarrevzxmnpaocogqfopjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434774.4375486-922-156166713885689/AnsiballZ_setup.py
Oct 14 09:39:34 np0005486759.ooo.test sudo[243248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:35 np0005486759.ooo.test python3.9[243250]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:39:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32295 DF PROTO=TCP SPT=59364 DPT=9100 SEQ=311210807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DA3010000000001030307) 
Oct 14 09:39:35 np0005486759.ooo.test sudo[243248]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:35 np0005486759.ooo.test sudo[243311]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utokthhmxkmjzshhozkccgaffiolcrgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434774.4375486-922-156166713885689/AnsiballZ_dnf.py
Oct 14 09:39:35 np0005486759.ooo.test sudo[243311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:36 np0005486759.ooo.test python3.9[243313]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:39:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32296 DF PROTO=TCP SPT=59364 DPT=9100 SEQ=311210807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DAB020000000001030307) 
Oct 14 09:39:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32297 DF PROTO=TCP SPT=59364 DPT=9100 SEQ=311210807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DBAC10000000001030307) 
Oct 14 09:39:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46230 DF PROTO=TCP SPT=46250 DPT=9882 SEQ=3985463284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DBE4C0000000001030307) 
Oct 14 09:39:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46231 DF PROTO=TCP SPT=46250 DPT=9882 SEQ=3985463284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DC2420000000001030307) 
Oct 14 09:39:43 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:44 np0005486759.ooo.test systemd-sysv-generator[243353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:44 np0005486759.ooo.test systemd-rc-local-generator[243349]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:44 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:44 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:44 np0005486759.ooo.test systemd-rc-local-generator[243386]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:44 np0005486759.ooo.test systemd-sysv-generator[243391]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:44 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:44 np0005486759.ooo.test systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 14 09:39:44 np0005486759.ooo.test systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 14 09:39:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 14 09:39:44 np0005486759.ooo.test systemd[1]: Starting man-db-cache-update.service...
Oct 14 09:39:44 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:44 np0005486759.ooo.test systemd-rc-local-generator[243480]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:44 np0005486759.ooo.test systemd-sysv-generator[243484]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:44 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:45 np0005486759.ooo.test systemd[1]: Queuing reload/restart jobs for marked units…
Oct 14 09:39:45 np0005486759.ooo.test systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 14 09:39:45 np0005486759.ooo.test systemd[1]: Finished man-db-cache-update.service.
Oct 14 09:39:45 np0005486759.ooo.test systemd[1]: run-r5f3fe498384b4dbbb564eb71bd3b9708.service: Deactivated successfully.
Oct 14 09:39:46 np0005486759.ooo.test sudo[243311]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14672 DF PROTO=TCP SPT=54522 DPT=9105 SEQ=3900032822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DD1010000000001030307) 
Oct 14 09:39:47 np0005486759.ooo.test sudo[244726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptjsynubjlsdqmsyegzlzaerdvmrpzmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434787.0725648-934-196044890028022/AnsiballZ_file.py
Oct 14 09:39:47 np0005486759.ooo.test sudo[244726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:47 np0005486759.ooo.test python3.9[244728]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:47 np0005486759.ooo.test sudo[244726]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:48 np0005486759.ooo.test python3.9[244836]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:39:49 np0005486759.ooo.test sudo[244948]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwfxwlyulyppqkykrntjgqhkngddorlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434788.7919765-952-252771410997700/AnsiballZ_file.py
Oct 14 09:39:49 np0005486759.ooo.test sudo[244948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:49 np0005486759.ooo.test python3.9[244950]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:39:49 np0005486759.ooo.test sudo[244948]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46233 DF PROTO=TCP SPT=46250 DPT=9882 SEQ=3985463284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DDA010000000001030307) 
Oct 14 09:39:50 np0005486759.ooo.test sudo[245058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syqiyapqsezmhauujgzdrtdotvuopllk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434789.6791606-963-48425530521171/AnsiballZ_systemd_service.py
Oct 14 09:39:50 np0005486759.ooo.test sudo[245058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:50 np0005486759.ooo.test python3.9[245060]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:39:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:39:50 np0005486759.ooo.test systemd-rc-local-generator[245082]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:39:50 np0005486759.ooo.test systemd-sysv-generator[245088]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:39:50 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:51 np0005486759.ooo.test sudo[245058]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:51 np0005486759.ooo.test python3.9[245204]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:39:51 np0005486759.ooo.test network[245221]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:39:51 np0005486759.ooo.test network[245222]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:39:51 np0005486759.ooo.test network[245223]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:39:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41097 DF PROTO=TCP SPT=47196 DPT=9102 SEQ=1504655506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DE6810000000001030307) 
Oct 14 09:39:52 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:39:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:39:54.141 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:39:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:39:54.142 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:39:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:39:54.143 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:39:55 np0005486759.ooo.test sudo[245456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lunyjqgqjkejvmgcjskwprpjmfggrpyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434795.4859893-982-261189416569996/AnsiballZ_systemd_service.py
Oct 14 09:39:55 np0005486759.ooo.test sudo[245456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:56 np0005486759.ooo.test python3.9[245458]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:56 np0005486759.ooo.test sudo[245456]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:56 np0005486759.ooo.test sudo[245567]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lujtmuvwdlrwuabkevdmgwtjzdzaaprv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434796.2366793-982-69358439913669/AnsiballZ_systemd_service.py
Oct 14 09:39:56 np0005486759.ooo.test sudo[245567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41098 DF PROTO=TCP SPT=47196 DPT=9102 SEQ=1504655506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8DF6410000000001030307) 
Oct 14 09:39:56 np0005486759.ooo.test python3.9[245569]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:56 np0005486759.ooo.test sudo[245567]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:57 np0005486759.ooo.test sudo[245678]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccesxmbnkunvkeyalitfireepvqofful ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434796.9442072-982-162887766994823/AnsiballZ_systemd_service.py
Oct 14 09:39:57 np0005486759.ooo.test sudo[245678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:39:57 np0005486759.ooo.test podman[245680]: 2025-10-14 09:39:57.266373356 +0000 UTC m=+0.056636956 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 14 09:39:57 np0005486759.ooo.test podman[245680]: 2025-10-14 09:39:57.319262038 +0000 UTC m=+0.109525628 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:39:57 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:39:57 np0005486759.ooo.test python3.9[245681]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:57 np0005486759.ooo.test sudo[245678]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:57 np0005486759.ooo.test sudo[245814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdswrsiqmwiztibztndyrowbkajdalbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434797.6004498-982-228943540374948/AnsiballZ_systemd_service.py
Oct 14 09:39:57 np0005486759.ooo.test sudo[245814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:58 np0005486759.ooo.test python3.9[245816]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:58 np0005486759.ooo.test sudo[245814]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:58 np0005486759.ooo.test sudo[245925]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnxfqhaiwplzqwpuereuwwrmujmufseh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434798.3075185-982-147593360360/AnsiballZ_systemd_service.py
Oct 14 09:39:58 np0005486759.ooo.test sudo[245925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:39:58 np0005486759.ooo.test python3.9[245927]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:39:58 np0005486759.ooo.test sudo[245925]: pam_unix(sudo:session): session closed for user root
Oct 14 09:39:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:39:59 np0005486759.ooo.test systemd[1]: tmp-crun.ymMz3F.mount: Deactivated successfully.
Oct 14 09:39:59 np0005486759.ooo.test podman[245929]: 2025-10-14 09:39:59.446471032 +0000 UTC m=+0.074266790 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd)
Oct 14 09:39:59 np0005486759.ooo.test podman[245929]: 2025-10-14 09:39:59.462227335 +0000 UTC m=+0.090023113 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:39:59 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:40:00 np0005486759.ooo.test sudo[246056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnumewiwurfnhqrztnnimuztgzembiqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434799.7160387-982-165819544457586/AnsiballZ_systemd_service.py
Oct 14 09:40:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:40:00 np0005486759.ooo.test sudo[246056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:00 np0005486759.ooo.test podman[246058]: 2025-10-14 09:40:00.149873862 +0000 UTC m=+0.091311865 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:40:00 np0005486759.ooo.test podman[246058]: 2025-10-14 09:40:00.185526554 +0000 UTC m=+0.126964537 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2)
Oct 14 09:40:00 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:40:00 np0005486759.ooo.test python3.9[246059]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:40:00 np0005486759.ooo.test sudo[246056]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:00 np0005486759.ooo.test sudo[246183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esbawhxkkaznckgrayxyvwezyajzuxti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434800.5391781-982-25572472032087/AnsiballZ_systemd_service.py
Oct 14 09:40:00 np0005486759.ooo.test sudo[246183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:40:00 np0005486759.ooo.test podman[246185]: 2025-10-14 09:40:00.963467881 +0000 UTC m=+0.067449948 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 09:40:00 np0005486759.ooo.test podman[246185]: 2025-10-14 09:40:00.972575868 +0000 UTC m=+0.076557925 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid)
Oct 14 09:40:00 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:40:01 np0005486759.ooo.test python3.9[246186]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:40:01 np0005486759.ooo.test sudo[246183]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:02 np0005486759.ooo.test sudo[246313]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbheorndogayfolfblmvczjuwqmbqrqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434801.3126287-982-19098390878582/AnsiballZ_systemd_service.py
Oct 14 09:40:02 np0005486759.ooo.test sudo[246313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:02 np0005486759.ooo.test python3.9[246315]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:40:03 np0005486759.ooo.test sudo[246313]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:04 np0005486759.ooo.test sudo[246424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aflmwweubrngfrndtpwsksymkcyazosv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434803.9265187-1041-192367960932883/AnsiballZ_file.py
Oct 14 09:40:04 np0005486759.ooo.test sudo[246424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51424 DF PROTO=TCP SPT=43494 DPT=9100 SEQ=2387789513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E14270000000001030307) 
Oct 14 09:40:04 np0005486759.ooo.test python3.9[246426]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:04 np0005486759.ooo.test sudo[246424]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:04 np0005486759.ooo.test sudo[246534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmjefxlhsioggmfawqnbygbkcwxdhwgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434804.572685-1041-12481612301031/AnsiballZ_file.py
Oct 14 09:40:04 np0005486759.ooo.test sudo[246534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:05 np0005486759.ooo.test python3.9[246536]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:05 np0005486759.ooo.test sudo[246534]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51425 DF PROTO=TCP SPT=43494 DPT=9100 SEQ=2387789513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E18410000000001030307) 
Oct 14 09:40:05 np0005486759.ooo.test sudo[246644]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqhgpgqyhaldpjzjlpcuixnpswirvfkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434805.1507857-1041-14054262698179/AnsiballZ_file.py
Oct 14 09:40:05 np0005486759.ooo.test sudo[246644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:05 np0005486759.ooo.test python3.9[246646]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:05 np0005486759.ooo.test sudo[246644]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:05 np0005486759.ooo.test sudo[246754]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzwudenasxtxpdkfkbisbasweeeanban ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434805.759737-1041-11237380546588/AnsiballZ_file.py
Oct 14 09:40:05 np0005486759.ooo.test sudo[246754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:06 np0005486759.ooo.test python3.9[246756]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:06 np0005486759.ooo.test sudo[246754]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:06 np0005486759.ooo.test sudo[246864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pidchgoiowfjdvghcjexeagwpphhdrvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434806.2934756-1041-135916315505816/AnsiballZ_file.py
Oct 14 09:40:06 np0005486759.ooo.test sudo[246864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:06 np0005486759.ooo.test python3.9[246866]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:06 np0005486759.ooo.test sudo[246864]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:07 np0005486759.ooo.test sudo[246974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlvjbvhkuojygyycewnijditxksgkpvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434806.9330328-1041-189028856555135/AnsiballZ_file.py
Oct 14 09:40:07 np0005486759.ooo.test sudo[246974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51426 DF PROTO=TCP SPT=43494 DPT=9100 SEQ=2387789513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E20420000000001030307) 
Oct 14 09:40:07 np0005486759.ooo.test python3.9[246976]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:07 np0005486759.ooo.test sudo[246974]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:07 np0005486759.ooo.test sudo[247084]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsybkizpiktsigyfxjxgbznjjqgfwkdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434807.533138-1041-21770043878642/AnsiballZ_file.py
Oct 14 09:40:07 np0005486759.ooo.test sudo[247084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:07 np0005486759.ooo.test python3.9[247086]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:07 np0005486759.ooo.test sudo[247084]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:08 np0005486759.ooo.test sudo[247194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgvkoblcczuduzflcizwhsuivabiculv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434808.1099277-1041-84863218308930/AnsiballZ_file.py
Oct 14 09:40:08 np0005486759.ooo.test sudo[247194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:08 np0005486759.ooo.test python3.9[247196]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:08 np0005486759.ooo.test sudo[247194]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:09 np0005486759.ooo.test sudo[247304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxqvmcdjadgkhfhlsxtgyhxvsqzlwxcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434808.7796104-1098-209394336511086/AnsiballZ_file.py
Oct 14 09:40:09 np0005486759.ooo.test sudo[247304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:09 np0005486759.ooo.test python3.9[247306]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:09 np0005486759.ooo.test sudo[247304]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:09 np0005486759.ooo.test sudo[247414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwzlwhanmoaqdsvayvvmfrwzdmbxrzhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434809.460088-1098-237681302973387/AnsiballZ_file.py
Oct 14 09:40:09 np0005486759.ooo.test sudo[247414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:09 np0005486759.ooo.test python3.9[247416]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:09 np0005486759.ooo.test sudo[247414]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:10 np0005486759.ooo.test sudo[247524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnowbmzjcxzrmuojgnqtrhgvpmjfilxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434810.0523472-1098-128904652421901/AnsiballZ_file.py
Oct 14 09:40:10 np0005486759.ooo.test sudo[247524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:10 np0005486759.ooo.test python3.9[247526]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:10 np0005486759.ooo.test sudo[247524]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:10 np0005486759.ooo.test sudo[247634]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsvqxhnnqqylqogjwyfbsibgmnuwcvad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434810.6470659-1098-11231025853186/AnsiballZ_file.py
Oct 14 09:40:10 np0005486759.ooo.test sudo[247634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:11 np0005486759.ooo.test python3.9[247636]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:11 np0005486759.ooo.test sudo[247634]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51427 DF PROTO=TCP SPT=43494 DPT=9100 SEQ=2387789513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E30020000000001030307) 
Oct 14 09:40:11 np0005486759.ooo.test sudo[247744]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvkfzfwsxrjqmqnmacebfafdsuihsocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434811.2258244-1098-100166791744993/AnsiballZ_file.py
Oct 14 09:40:11 np0005486759.ooo.test sudo[247744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:11 np0005486759.ooo.test python3.9[247746]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:11 np0005486759.ooo.test sudo[247744]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:12 np0005486759.ooo.test sudo[247854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyprjtsyunuahuvlzvvmoymqcjpqvkvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434811.798413-1098-107562918139580/AnsiballZ_file.py
Oct 14 09:40:12 np0005486759.ooo.test sudo[247854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:12 np0005486759.ooo.test python3.9[247856]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:12 np0005486759.ooo.test sudo[247854]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34348 DF PROTO=TCP SPT=46742 DPT=9882 SEQ=2463332302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E33780000000001030307) 
Oct 14 09:40:12 np0005486759.ooo.test sudo[247964]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-algtrlyqdmvaeujiombqmpygvdzulceo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434812.3942654-1098-94823716893014/AnsiballZ_file.py
Oct 14 09:40:12 np0005486759.ooo.test sudo[247964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:12 np0005486759.ooo.test python3.9[247966]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:12 np0005486759.ooo.test sudo[247964]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:13 np0005486759.ooo.test sudo[248074]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjazjuwqzkqcsyjafzadanrattwhtyav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434813.0220523-1098-146513602256314/AnsiballZ_file.py
Oct 14 09:40:13 np0005486759.ooo.test sudo[248074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34349 DF PROTO=TCP SPT=46742 DPT=9882 SEQ=2463332302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E37820000000001030307) 
Oct 14 09:40:13 np0005486759.ooo.test python3.9[248076]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:13 np0005486759.ooo.test sudo[248074]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:13 np0005486759.ooo.test sudo[248184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpkrhyyigxowzqfmouimsolmxxbitalr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434813.7270372-1156-144417178742071/AnsiballZ_command.py
Oct 14 09:40:14 np0005486759.ooo.test sudo[248184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:14 np0005486759.ooo.test python3.9[248186]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                           systemctl disable --now certmonger.service
                                                           test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                         fi
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:14 np0005486759.ooo.test sudo[248184]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:14 np0005486759.ooo.test python3.9[248296]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:40:15 np0005486759.ooo.test sudo[248404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sklhaxewmfjpnmlonlezpenyvwbojnqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434815.2268112-1174-200677348440443/AnsiballZ_systemd_service.py
Oct 14 09:40:15 np0005486759.ooo.test sudo[248404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:15 np0005486759.ooo.test python3.9[248406]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:40:15 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:40:15 np0005486759.ooo.test systemd-rc-local-generator[248431]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:40:15 np0005486759.ooo.test systemd-sysv-generator[248435]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:40:15 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:40:16 np0005486759.ooo.test sudo[248404]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:16 np0005486759.ooo.test sudo[248550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiwohzzpcinswdwnxccaeejgiyulrwzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434816.2498667-1182-54921694557014/AnsiballZ_command.py
Oct 14 09:40:16 np0005486759.ooo.test sudo[248550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:16 np0005486759.ooo.test python3.9[248552]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:16 np0005486759.ooo.test sudo[248550]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:17 np0005486759.ooo.test sudo[248661]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cndpbgwqgnvvjkqztssebrmwutsgelhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434816.8858757-1182-218773321605164/AnsiballZ_command.py
Oct 14 09:40:17 np0005486759.ooo.test sudo[248661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41850 DF PROTO=TCP SPT=48282 DPT=9105 SEQ=3144938100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E46410000000001030307) 
Oct 14 09:40:17 np0005486759.ooo.test python3.9[248663]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:17 np0005486759.ooo.test sudo[248661]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:17 np0005486759.ooo.test sudo[248772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flvnlamjcsvagarnbrvqdxotoatyiumg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434817.429562-1182-263085372206013/AnsiballZ_command.py
Oct 14 09:40:17 np0005486759.ooo.test sudo[248772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:17 np0005486759.ooo.test python3.9[248774]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:17 np0005486759.ooo.test sudo[248772]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:18 np0005486759.ooo.test sudo[248883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okginycnadnkwmcwejfaaibnqyeoyhcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434817.9795291-1182-182864594870550/AnsiballZ_command.py
Oct 14 09:40:18 np0005486759.ooo.test sudo[248883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:18 np0005486759.ooo.test python3.9[248885]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:18 np0005486759.ooo.test sudo[248883]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:18 np0005486759.ooo.test sudo[248994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdjsbgreadhirprkeluqsdgjaivnfchv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434818.6345565-1182-248509910103726/AnsiballZ_command.py
Oct 14 09:40:18 np0005486759.ooo.test sudo[248994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:19 np0005486759.ooo.test python3.9[248996]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:19 np0005486759.ooo.test sudo[248994]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34351 DF PROTO=TCP SPT=46742 DPT=9882 SEQ=2463332302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E4F410000000001030307) 
Oct 14 09:40:19 np0005486759.ooo.test sudo[249105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcyhttmrbyvvrjtdtqrccarzrftofdyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434819.2200387-1182-40131803239247/AnsiballZ_command.py
Oct 14 09:40:19 np0005486759.ooo.test sudo[249105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:19 np0005486759.ooo.test python3.9[249107]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:19 np0005486759.ooo.test sudo[249105]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:20 np0005486759.ooo.test sudo[249216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wljijslgizondkfgkrfqcthibzttnsxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434819.8620472-1182-67163430172251/AnsiballZ_command.py
Oct 14 09:40:20 np0005486759.ooo.test sudo[249216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:20 np0005486759.ooo.test python3.9[249218]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:20 np0005486759.ooo.test sudo[249216]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:20 np0005486759.ooo.test sudo[249327]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwbzlgnfzddadeusewanrwkettcswjqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434820.4233418-1182-58898781754791/AnsiballZ_command.py
Oct 14 09:40:20 np0005486759.ooo.test sudo[249327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:20 np0005486759.ooo.test python3.9[249329]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:40:20 np0005486759.ooo.test sudo[249327]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:22 np0005486759.ooo.test sudo[249438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfpblfxnouknmnybbwvprmrwemfuzxpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434821.8561013-1261-192957660613271/AnsiballZ_file.py
Oct 14 09:40:22 np0005486759.ooo.test sudo[249438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:22 np0005486759.ooo.test python3.9[249440]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:22 np0005486759.ooo.test sudo[249438]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58102 DF PROTO=TCP SPT=37014 DPT=9102 SEQ=2005490946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E5BC10000000001030307) 
Oct 14 09:40:22 np0005486759.ooo.test sudo[249548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqobneqdotmaitgniowbzcolitkmmczp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434822.4576862-1261-91743503097922/AnsiballZ_file.py
Oct 14 09:40:22 np0005486759.ooo.test sudo[249548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:22 np0005486759.ooo.test python3.9[249550]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:22 np0005486759.ooo.test sudo[249548]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:23 np0005486759.ooo.test sudo[249658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edpqljjegnmgbqqvvmzlvgrvtlddebbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434823.007217-1261-41122089850527/AnsiballZ_file.py
Oct 14 09:40:23 np0005486759.ooo.test sudo[249658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:24 np0005486759.ooo.test python3.9[249660]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:24 np0005486759.ooo.test sudo[249658]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:24 np0005486759.ooo.test sudo[249768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggdkgnakizubvrzakwddtdvdwsfptdmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434824.3577702-1283-68188942056507/AnsiballZ_file.py
Oct 14 09:40:24 np0005486759.ooo.test sudo[249768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:24 np0005486759.ooo.test python3.9[249770]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:24 np0005486759.ooo.test sudo[249768]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:25 np0005486759.ooo.test sudo[249878]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iloycfvytnbpijakuprvtbthxokytyor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434824.945538-1283-161560632020731/AnsiballZ_file.py
Oct 14 09:40:25 np0005486759.ooo.test sudo[249878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:25 np0005486759.ooo.test python3.9[249880]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:25 np0005486759.ooo.test sudo[249878]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58103 DF PROTO=TCP SPT=37014 DPT=9102 SEQ=2005490946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E6B820000000001030307) 
Oct 14 09:40:26 np0005486759.ooo.test sudo[249988]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piznmwoidgfyuuqfmdjnihjyoztwacav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434826.321607-1283-175104792617864/AnsiballZ_file.py
Oct 14 09:40:26 np0005486759.ooo.test sudo[249988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:26 np0005486759.ooo.test python3.9[249990]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:26 np0005486759.ooo.test sudo[249988]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:27 np0005486759.ooo.test sudo[250098]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lplfzvfyvdyaxpvlxanqxbwiwgtbneqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434826.9956458-1283-196241671386145/AnsiballZ_file.py
Oct 14 09:40:27 np0005486759.ooo.test sudo[250098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:40:27 np0005486759.ooo.test podman[250101]: 2025-10-14 09:40:27.427790583 +0000 UTC m=+0.059895262 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:40:27 np0005486759.ooo.test podman[250101]: 2025-10-14 09:40:27.457071517 +0000 UTC m=+0.089176186 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 14 09:40:27 np0005486759.ooo.test python3.9[250100]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:27 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:40:27 np0005486759.ooo.test sudo[250098]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:27 np0005486759.ooo.test sudo[250234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnubdrkzlntwwqfbqyltvqcuvtvswjen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434827.603123-1283-10628684771310/AnsiballZ_file.py
Oct 14 09:40:27 np0005486759.ooo.test sudo[250234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:28 np0005486759.ooo.test python3.9[250236]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:28 np0005486759.ooo.test sudo[250234]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:28 np0005486759.ooo.test sudo[250344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtqlmzmjxepuriyefnqxskzbwwodekqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434828.2521741-1283-276589748730768/AnsiballZ_file.py
Oct 14 09:40:28 np0005486759.ooo.test sudo[250344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:28 np0005486759.ooo.test python3.9[250346]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:28 np0005486759.ooo.test sudo[250344]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:29 np0005486759.ooo.test sudo[250454]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdteitqturiongynjaylmgfehxyiukny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434828.8017902-1283-143678929568529/AnsiballZ_file.py
Oct 14 09:40:29 np0005486759.ooo.test sudo[250454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:29 np0005486759.ooo.test python3.9[250456]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:29 np0005486759.ooo.test sudo[250454]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:29 np0005486759.ooo.test sudo[250564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yazzuqcmqxhzdxbuvexhiumqcwtxuivu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434829.4865992-1283-155670301486899/AnsiballZ_file.py
Oct 14 09:40:29 np0005486759.ooo.test sudo[250564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:40:29 np0005486759.ooo.test podman[250566]: 2025-10-14 09:40:29.845402795 +0000 UTC m=+0.052200311 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 09:40:29 np0005486759.ooo.test podman[250566]: 2025-10-14 09:40:29.855730482 +0000 UTC m=+0.062528058 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Oct 14 09:40:29 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:40:29 np0005486759.ooo.test python3.9[250567]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:30 np0005486759.ooo.test sudo[250564]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:40:30 np0005486759.ooo.test podman[250657]: 2025-10-14 09:40:30.419140393 +0000 UTC m=+0.053202611 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:40:30 np0005486759.ooo.test podman[250657]: 2025-10-14 09:40:30.448454362 +0000 UTC m=+0.082516600 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 09:40:30 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:40:30 np0005486759.ooo.test sudo[250710]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uczipwyazqsdywudusqankspxsxvouwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434830.2490702-1283-73609701552255/AnsiballZ_file.py
Oct 14 09:40:30 np0005486759.ooo.test sudo[250710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:30 np0005486759.ooo.test python3.9[250712]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:30 np0005486759.ooo.test sudo[250710]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:40:31 np0005486759.ooo.test podman[250730]: 2025-10-14 09:40:31.452231242 +0000 UTC m=+0.077424134 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 09:40:31 np0005486759.ooo.test podman[250730]: 2025-10-14 09:40:31.484228288 +0000 UTC m=+0.109421190 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 09:40:31 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:40:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64884 DF PROTO=TCP SPT=60584 DPT=9100 SEQ=3040231619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E89570000000001030307) 
Oct 14 09:40:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64885 DF PROTO=TCP SPT=60584 DPT=9100 SEQ=3040231619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E8D410000000001030307) 
Oct 14 09:40:35 np0005486759.ooo.test sudo[250839]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vatozhpakbxdimeswjytsqqnhnaomraw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434835.17646-1466-172945796917384/AnsiballZ_getent.py
Oct 14 09:40:35 np0005486759.ooo.test sudo[250839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:35 np0005486759.ooo.test python3.9[250841]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 14 09:40:35 np0005486759.ooo.test sudo[250839]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:36 np0005486759.ooo.test sudo[250950]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvkuyejkaixetvolleogemrnsaqfkxlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434836.1059453-1474-108649753380561/AnsiballZ_group.py
Oct 14 09:40:36 np0005486759.ooo.test sudo[250950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:36 np0005486759.ooo.test python3.9[250952]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 14 09:40:36 np0005486759.ooo.test groupadd[250953]: group added to /etc/group: name=nova, GID=42436
Oct 14 09:40:36 np0005486759.ooo.test groupadd[250953]: group added to /etc/gshadow: name=nova
Oct 14 09:40:36 np0005486759.ooo.test groupadd[250953]: new group: name=nova, GID=42436
Oct 14 09:40:36 np0005486759.ooo.test sudo[250950]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64886 DF PROTO=TCP SPT=60584 DPT=9100 SEQ=3040231619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8E95410000000001030307) 
Oct 14 09:40:38 np0005486759.ooo.test sudo[251066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oypsyporksmvnwilgxhdeiauiqfkqzvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434837.5786917-1482-120486126985164/AnsiballZ_user.py
Oct 14 09:40:38 np0005486759.ooo.test sudo[251066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:38 np0005486759.ooo.test python3.9[251068]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486759.ooo.test update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 14 09:40:38 np0005486759.ooo.test useradd[251070]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Oct 14 09:40:38 np0005486759.ooo.test useradd[251070]: add 'nova' to group 'libvirt'
Oct 14 09:40:38 np0005486759.ooo.test useradd[251070]: add 'nova' to shadow group 'libvirt'
Oct 14 09:40:38 np0005486759.ooo.test sudo[251066]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:39 np0005486759.ooo.test sshd[251094]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:40:39 np0005486759.ooo.test sshd[251094]: Accepted publickey for zuul from 192.168.122.31 port 60182 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:40:39 np0005486759.ooo.test systemd-logind[759]: New session 38 of user zuul.
Oct 14 09:40:39 np0005486759.ooo.test systemd[1]: Started Session 38 of User zuul.
Oct 14 09:40:40 np0005486759.ooo.test sshd[251094]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:40:40 np0005486759.ooo.test sshd[251097]: Received disconnect from 192.168.122.31 port 60182:11: disconnected by user
Oct 14 09:40:40 np0005486759.ooo.test sshd[251097]: Disconnected from user zuul 192.168.122.31 port 60182
Oct 14 09:40:40 np0005486759.ooo.test sshd[251094]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:40:40 np0005486759.ooo.test systemd[1]: session-38.scope: Deactivated successfully.
Oct 14 09:40:40 np0005486759.ooo.test systemd-logind[759]: Session 38 logged out. Waiting for processes to exit.
Oct 14 09:40:40 np0005486759.ooo.test systemd-logind[759]: Removed session 38.
Oct 14 09:40:40 np0005486759.ooo.test python3.9[251205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:40:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64887 DF PROTO=TCP SPT=60584 DPT=9100 SEQ=3040231619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8EA5020000000001030307) 
Oct 14 09:40:41 np0005486759.ooo.test python3.9[251291]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434840.3329523-1507-267887054907728/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:42 np0005486759.ooo.test python3.9[251399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:40:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41718 DF PROTO=TCP SPT=48288 DPT=9882 SEQ=3668396512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8EA8A80000000001030307) 
Oct 14 09:40:42 np0005486759.ooo.test python3.9[251454]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41719 DF PROTO=TCP SPT=48288 DPT=9882 SEQ=3668396512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8EACC10000000001030307) 
Oct 14 09:40:43 np0005486759.ooo.test python3.9[251562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:40:43 np0005486759.ooo.test python3.9[251648]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434842.8856485-1507-57751192245829/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:44 np0005486759.ooo.test python3.9[251756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:40:45 np0005486759.ooo.test python3.9[251842]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434844.0881994-1507-41185300932680/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=158960547ad6af7ca0183dbf7d845472651d1682 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:45 np0005486759.ooo.test python3.9[251950]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:40:46 np0005486759.ooo.test python3.9[252036]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434845.185127-1507-150110590587523/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:46 np0005486759.ooo.test sudo[252144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpchwaawbmzyyewnurrdzyrqgdsfcnbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434846.3799675-1576-260043894538830/AnsiballZ_file.py
Oct 14 09:40:46 np0005486759.ooo.test sudo[252144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:46 np0005486759.ooo.test python3.9[252146]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:46 np0005486759.ooo.test sudo[252144]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48072 DF PROTO=TCP SPT=46690 DPT=9105 SEQ=272830691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8EBB810000000001030307) 
Oct 14 09:40:47 np0005486759.ooo.test sudo[252254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icepavzwsytrslqibhehmccaxqclrqxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434847.057396-1584-92958332486474/AnsiballZ_copy.py
Oct 14 09:40:47 np0005486759.ooo.test sudo[252254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:47 np0005486759.ooo.test python3.9[252256]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:47 np0005486759.ooo.test sudo[252254]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:47 np0005486759.ooo.test sudo[252364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpsgijzyohotklblciuzpzbwwaiczkrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434847.6853094-1592-229961777231880/AnsiballZ_stat.py
Oct 14 09:40:47 np0005486759.ooo.test sudo[252364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:48 np0005486759.ooo.test python3.9[252366]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:40:48 np0005486759.ooo.test sudo[252364]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:48 np0005486759.ooo.test sudo[252476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbkkdlywekovmeulkvgkrgjgbzihzzqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434848.3149683-1601-69413127676998/AnsiballZ_file.py
Oct 14 09:40:48 np0005486759.ooo.test sudo[252476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:48 np0005486759.ooo.test python3.9[252478]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:40:48 np0005486759.ooo.test sudo[252476]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41721 DF PROTO=TCP SPT=48288 DPT=9882 SEQ=3668396512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8EC4810000000001030307) 
Oct 14 09:40:49 np0005486759.ooo.test python3.9[252586]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:40:50 np0005486759.ooo.test python3.9[252696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:40:50 np0005486759.ooo.test python3.9[252782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434849.6759048-1618-34390241092355/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:51 np0005486759.ooo.test python3.9[252890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:40:51 np0005486759.ooo.test python3.9[252976]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434850.928556-1633-78187112426898/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:40:52 np0005486759.ooo.test sudo[253084]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waeffkdlydwtgfrmqkhxwmhfciweubrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434852.2491941-1650-108420211248417/AnsiballZ_container_config_data.py
Oct 14 09:40:52 np0005486759.ooo.test sudo[253084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34702 DF PROTO=TCP SPT=37272 DPT=9102 SEQ=2332309894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8ED0C10000000001030307) 
Oct 14 09:40:52 np0005486759.ooo.test python3.9[253086]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 14 09:40:52 np0005486759.ooo.test sudo[253084]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:53 np0005486759.ooo.test sudo[253194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeaogoooqzfuhwpihhflwihaagvbryyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434852.985641-1659-70907030969775/AnsiballZ_container_config_hash.py
Oct 14 09:40:53 np0005486759.ooo.test sudo[253194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:53 np0005486759.ooo.test python3.9[253196]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:40:53 np0005486759.ooo.test sudo[253194]: pam_unix(sudo:session): session closed for user root
Oct 14 09:40:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:40:54.141 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:40:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:40:54.143 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:40:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:40:54.145 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:40:54 np0005486759.ooo.test sudo[253304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmiaqeouwbztrhnnbitcspsfuhnkyoyv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434853.8543997-1669-82099675569298/AnsiballZ_edpm_container_manage.py
Oct 14 09:40:54 np0005486759.ooo.test sudo[253304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:40:54 np0005486759.ooo.test python3[253306]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:40:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34703 DF PROTO=TCP SPT=37272 DPT=9102 SEQ=2332309894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8EE0810000000001030307) 
Oct 14 09:40:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:41:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:41:00 np0005486759.ooo.test podman[253343]: 2025-10-14 09:41:00.100215178 +0000 UTC m=+2.363661912 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 09:41:00 np0005486759.ooo.test podman[253343]: 2025-10-14 09:41:00.144334576 +0000 UTC m=+2.407781320 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 09:41:00 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:41:00 np0005486759.ooo.test systemd[1]: tmp-crun.SQ3YB3.mount: Deactivated successfully.
Oct 14 09:41:00 np0005486759.ooo.test podman[253363]: 2025-10-14 09:41:00.226927787 +0000 UTC m=+0.125018750 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 14 09:41:00 np0005486759.ooo.test podman[253363]: 2025-10-14 09:41:00.237228896 +0000 UTC m=+0.135319849 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:41:00 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:41:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:41:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:41:03 np0005486759.ooo.test podman[253412]: 2025-10-14 09:41:03.970300815 +0000 UTC m=+1.598915995 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 09:41:04 np0005486759.ooo.test podman[253401]: 2025-10-14 09:41:03.951151231 +0000 UTC m=+2.583372825 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct 14 09:41:04 np0005486759.ooo.test podman[253412]: 2025-10-14 09:41:04.005266977 +0000 UTC m=+1.633882147 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 09:41:04 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:41:04 np0005486759.ooo.test podman[253320]: 2025-10-14 09:40:54.529266425 +0000 UTC m=+0.032837331 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 14 09:41:04 np0005486759.ooo.test podman[253401]: 2025-10-14 09:41:04.029216601 +0000 UTC m=+2.661438185 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:41:04 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:41:04 np0005486759.ooo.test podman[253460]: 
Oct 14 09:41:04 np0005486759.ooo.test podman[253460]: 2025-10-14 09:41:04.23118004 +0000 UTC m=+0.077134495 container create 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:41:04 np0005486759.ooo.test podman[253460]: 2025-10-14 09:41:04.197561056 +0000 UTC m=+0.043515501 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 14 09:41:04 np0005486759.ooo.test python3[253306]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 14 09:41:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59849 DF PROTO=TCP SPT=40286 DPT=9100 SEQ=669663648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8EFE880000000001030307) 
Oct 14 09:41:04 np0005486759.ooo.test sudo[253304]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:04 np0005486759.ooo.test sudo[253603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qahcmocxgvdshgkeajuocwvdtkvrnbsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434864.5456326-1677-265029331203895/AnsiballZ_stat.py
Oct 14 09:41:04 np0005486759.ooo.test sudo[253603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:05 np0005486759.ooo.test python3.9[253605]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:05 np0005486759.ooo.test sudo[253603]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59850 DF PROTO=TCP SPT=40286 DPT=9100 SEQ=669663648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F02810000000001030307) 
Oct 14 09:41:05 np0005486759.ooo.test sudo[253715]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzqtyxarkufiwnsmitzzfsgkopwixctx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434865.3537803-1689-246240270174712/AnsiballZ_container_config_data.py
Oct 14 09:41:05 np0005486759.ooo.test sudo[253715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:05 np0005486759.ooo.test python3.9[253717]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 14 09:41:05 np0005486759.ooo.test sudo[253715]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:06 np0005486759.ooo.test sudo[253825]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibvxrrhoksgwbrlapqokcfizjufpsnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434866.033889-1698-264558531325805/AnsiballZ_container_config_hash.py
Oct 14 09:41:06 np0005486759.ooo.test sudo[253825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:06 np0005486759.ooo.test python3.9[253827]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:41:06 np0005486759.ooo.test sudo[253825]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:06 np0005486759.ooo.test sudo[253935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xowlaadwrnwcckiytblicozscoyxkxjq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434866.7449777-1708-230669824140553/AnsiballZ_edpm_container_manage.py
Oct 14 09:41:06 np0005486759.ooo.test sudo[253935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:07 np0005486759.ooo.test python3[253937]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:41:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59851 DF PROTO=TCP SPT=40286 DPT=9100 SEQ=669663648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F0A810000000001030307) 
Oct 14 09:41:07 np0005486759.ooo.test python3[253937]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "b5b57d3572ac74b7c41332c066527d5039dbd47e134e43d7cb5d76b7732d99f5",
                                                                 "Digest": "sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-13T12:50:19.385564198Z",
                                                                 "Config": {
                                                                      "User": "nova",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 1207014273,
                                                                 "VirtualSize": 1207014273,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",
                                                                           "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",
                                                                           "sha256:e0ba9b00dd1340fa4eba9e9cd5f316c11381d47a31460e5b834a6ca56f60033f",
                                                                           "sha256:731e9354c974a424a2f6724faa85f84baef270eb006be0de18bbdc87ff420f97"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "nova",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843286399Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843354051Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843394192Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843417133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843442193Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843461914Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:43.236856724Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:17.539596691Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.007092512Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.334560883Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.713915587Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.426474494Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.742526819Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.072068096Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.376327744Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.639696917Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.946940986Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.329166855Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.709072452Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.066214819Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.407947122Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.744473297Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.044338828Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.376253048Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:29.890793292Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.186632274Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.418527973Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:31.913162322Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817436155Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817485046Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817496507Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817505987Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:34.821748777Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:00.340362183Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:40.80916313Z",
                                                                           "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:43.984050021Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:20.872493025Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:21.523603796Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:21.810108901Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:34.864836738Z",
                                                                           "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:43.551617349Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:59.074531506Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:18.664061292Z",
                                                                           "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.027951629Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.382890946Z",
                                                                           "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.382951197Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:25.718273507Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 14 09:41:07 np0005486759.ooo.test podman[253987]: 2025-10-14 09:41:07.543389601 +0000 UTC m=+0.061331178 container remove aebc296d57e4cd1448c2a7085b162224d7de7a58a4d9f7e5cb4e8ffcb00095cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=5fbf038504b4f996506e416c0a4ec212fba00b4d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-nova-compute/images/17.1.9-1, managed_by=tripleo_ansible, release=1, build-date=2025-07-21T14:48:37, version=17.1.9, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '61017e001fc358991ba0100081a72ad5-b29b30662a12a8864f5ea0f40846b2cc'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12)
Oct 14 09:41:07 np0005486759.ooo.test python3[253937]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Oct 14 09:41:07 np0005486759.ooo.test podman[254001]: 
Oct 14 09:41:07 np0005486759.ooo.test podman[254001]: 2025-10-14 09:41:07.600139044 +0000 UTC m=+0.046163418 container create f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, config_id=edpm, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 09:41:07 np0005486759.ooo.test podman[254001]: 2025-10-14 09:41:07.57857712 +0000 UTC m=+0.024601514 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 14 09:41:07 np0005486759.ooo.test python3[253937]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Oct 14 09:41:07 np0005486759.ooo.test sudo[253935]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:08 np0005486759.ooo.test sudo[254145]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svaczczpqogqavisnweupjbswlbhcjzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434867.8713741-1716-96568993618119/AnsiballZ_stat.py
Oct 14 09:41:08 np0005486759.ooo.test sudo[254145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:08 np0005486759.ooo.test python3.9[254147]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:08 np0005486759.ooo.test sudo[254145]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:08 np0005486759.ooo.test sudo[254257]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkfvpiymdeijtymafiqeobhkzhyianwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434868.6285346-1725-273550700825877/AnsiballZ_file.py
Oct 14 09:41:08 np0005486759.ooo.test sudo[254257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:09 np0005486759.ooo.test python3.9[254259]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:09 np0005486759.ooo.test sudo[254257]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:09 np0005486759.ooo.test sudo[254366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihvqottvlnshwyppkegdnfsjykuybwuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434869.143896-1725-50433115767171/AnsiballZ_copy.py
Oct 14 09:41:09 np0005486759.ooo.test sudo[254366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:09 np0005486759.ooo.test python3.9[254368]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434869.143896-1725-50433115767171/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:09 np0005486759.ooo.test sudo[254366]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:10 np0005486759.ooo.test sudo[254421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-galvbzwjlvqplgxqklnubxrfofksoitt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434869.143896-1725-50433115767171/AnsiballZ_systemd.py
Oct 14 09:41:10 np0005486759.ooo.test sudo[254421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:10 np0005486759.ooo.test python3.9[254423]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:41:10 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:41:10 np0005486759.ooo.test systemd-rc-local-generator[254446]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:41:10 np0005486759.ooo.test systemd-sysv-generator[254451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:41:10 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:41:10 np0005486759.ooo.test sudo[254421]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:10 np0005486759.ooo.test sudo[254511]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdkqrtvqwmcdtgcgipmcpkmwxoobgsgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434869.143896-1725-50433115767171/AnsiballZ_systemd.py
Oct 14 09:41:10 np0005486759.ooo.test sudo[254511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:11 np0005486759.ooo.test python3.9[254513]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:41:11 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:41:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59852 DF PROTO=TCP SPT=40286 DPT=9100 SEQ=669663648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F1A410000000001030307) 
Oct 14 09:41:11 np0005486759.ooo.test systemd-sysv-generator[254541]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:41:11 np0005486759.ooo.test systemd-rc-local-generator[254537]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:41:11 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:41:11 np0005486759.ooo.test systemd[1]: Starting nova_compute container...
Oct 14 09:41:11 np0005486759.ooo.test systemd[1]: tmp-crun.LeHt3w.mount: Deactivated successfully.
Oct 14 09:41:11 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:41:11 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:11 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:11 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:11 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:11 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:11 np0005486759.ooo.test podman[254554]: 2025-10-14 09:41:11.797568551 +0000 UTC m=+0.132182950 container init f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Oct 14 09:41:11 np0005486759.ooo.test podman[254554]: 2025-10-14 09:41:11.806805268 +0000 UTC m=+0.141419667 container start f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=edpm, org.label-schema.build-date=20251009, tcib_managed=true, managed_by=edpm_ansible)
Oct 14 09:41:11 np0005486759.ooo.test podman[254554]: nova_compute
Oct 14 09:41:11 np0005486759.ooo.test systemd[1]: Started nova_compute container.
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + sudo -E kolla_set_configs
Oct 14 09:41:11 np0005486759.ooo.test sudo[254511]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Validating config file
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying service configuration files
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Deleting /etc/ceph
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Creating directory /etc/ceph
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /etc/ceph
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Writing out command to execute
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: ++ cat /run_command
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + CMD=nova-compute
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + ARGS=
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + sudo kolla_copy_cacerts
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + [[ ! -n '' ]]
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + . kolla_extend_start
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: Running command: 'nova-compute'
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + echo 'Running command: '\''nova-compute'\'''
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + umask 0022
Oct 14 09:41:11 np0005486759.ooo.test nova_compute[254566]: + exec nova-compute
Oct 14 09:41:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44512 DF PROTO=TCP SPT=51996 DPT=9882 SEQ=3106936231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F1DD80000000001030307) 
Oct 14 09:41:12 np0005486759.ooo.test python3.9[254685]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44513 DF PROTO=TCP SPT=51996 DPT=9882 SEQ=3106936231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F21C20000000001030307) 
Oct 14 09:41:13 np0005486759.ooo.test python3.9[254794]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:13 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:13.611 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:41:13 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:13.612 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:41:13 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:13.612 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:41:13 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:13.612 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 14 09:41:13 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:13.730 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:13 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:13.739 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:41:14 np0005486759.ooo.test python3.9[254906]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.270 2 INFO nova.virt.driver [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.372 2 INFO nova.compute.provider_config [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.390 2 WARNING nova.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.390 2 DEBUG oslo_concurrency.lockutils [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.390 2 DEBUG oslo_concurrency.lockutils [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.391 2 DEBUG oslo_concurrency.lockutils [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.391 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.391 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.391 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.391 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.391 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.392 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.392 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.392 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.392 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.392 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.392 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.392 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.393 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.393 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.393 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.393 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.393 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.393 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.393 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.394 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] console_host                   = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.394 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.394 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.394 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.394 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.394 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.394 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.395 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.395 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.395 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.395 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.395 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.395 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.395 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.396 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.396 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.396 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.396 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.396 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.396 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.396 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.397 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.397 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.397 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.397 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.397 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.397 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.398 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.398 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.398 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.398 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.398 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.398 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.398 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.399 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.399 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.399 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.399 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.399 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.399 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.399 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.400 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.400 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.400 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.400 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.400 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.400 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.400 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.401 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.401 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.401 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.401 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.401 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.401 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.401 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.402 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.402 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.402 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.402 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.402 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.402 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.403 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.404 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.404 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.404 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.404 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.404 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.404 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.404 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.405 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.405 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.405 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.405 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.405 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.405 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.405 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.406 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.407 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.407 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.407 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.407 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.407 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.407 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.407 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.408 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.408 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.408 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.408 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.408 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.408 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.408 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.409 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.410 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.410 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.410 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.410 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.410 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.410 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.410 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.411 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.411 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.411 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.411 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.411 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.411 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.411 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.412 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.412 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.412 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.412 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.412 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.412 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.412 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.413 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.413 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.413 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.413 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.413 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.413 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.413 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.414 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.414 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.414 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.414 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.414 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.414 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.415 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.415 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.415 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.415 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.415 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.415 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.415 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.416 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.416 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.416 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.416 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.416 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.416 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.416 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.417 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.417 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.417 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.417 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.417 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.417 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.417 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.418 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.419 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.419 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.419 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.419 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.419 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.419 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.419 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.420 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.421 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.421 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.421 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.421 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.421 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.421 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.422 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.423 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.423 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.423 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.423 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.423 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.423 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.423 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.424 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.424 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.424 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.424 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.424 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.424 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.425 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.425 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.425 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.425 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.425 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.425 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.425 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.426 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.426 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.426 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.426 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.426 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.426 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.426 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.427 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.427 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.427 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.427 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.427 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.427 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.428 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.428 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.428 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.428 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.428 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.428 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.428 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.429 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.429 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.429 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.429 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.429 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.429 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.429 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.430 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.430 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.430 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.430 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.430 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.430 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.430 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.431 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.431 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.431 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.431 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.431 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.431 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.431 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.432 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.432 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.432 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.432 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.432 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.432 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.432 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.433 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.434 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.434 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.434 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.434 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.434 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.434 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.434 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.435 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.435 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.435 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.435 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.435 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.435 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.435 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.436 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.436 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.436 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.436 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.436 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.436 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.436 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.437 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.437 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.437 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.437 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.437 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.437 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.437 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.438 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.439 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.439 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.439 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.439 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.439 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.439 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.439 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.440 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.440 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.440 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.440 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.440 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.440 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.440 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.441 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.441 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.441 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.441 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.441 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.441 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.442 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.442 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.442 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.442 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.442 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.442 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.443 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.444 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.444 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.444 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.444 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.444 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.444 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.445 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.445 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.445 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.445 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.445 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.445 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.445 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.446 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.446 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.446 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.446 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.446 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.446 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.446 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.447 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.447 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.447 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.447 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.447 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.447 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.447 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.448 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.448 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.448 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.448 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.448 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.448 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.448 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.449 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.449 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.449 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.449 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.449 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.449 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.450 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.450 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.450 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.450 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.450 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.450 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.451 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.451 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.451 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.451 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.451 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.451 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.452 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.452 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.452 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.452 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.452 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.452 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.452 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.453 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.453 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.453 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.453 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.453 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.453 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.453 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.454 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.454 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.454 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.454 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.454 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.454 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.454 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.455 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.455 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.455 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.455 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.455 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.455 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.455 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.456 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.456 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.456 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.456 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.456 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.456 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.456 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.457 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.457 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.457 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.457 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.457 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.457 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.458 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.458 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.458 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.458 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.458 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.458 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.458 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.459 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.459 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.459 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.459 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.459 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.459 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.459 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.460 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.460 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.460 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.460 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.460 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.460 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.460 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.461 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.461 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.461 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.461 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.461 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.461 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.461 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.462 2 WARNING oslo_config.cfg [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: and ``live_migration_inbound_addr`` respectively.
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: ).  Its value may be silently ignored in the future.
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.462 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.462 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.462 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.462 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.462 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.463 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.463 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.463 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.463 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.463 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.463 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.464 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.464 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.464 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.464 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.464 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.464 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.465 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.465 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.465 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.465 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.465 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.465 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.465 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.466 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.466 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.466 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.466 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.466 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.466 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.466 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.467 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.467 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.467 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.467 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.467 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.467 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.468 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.468 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.468 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.468 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.468 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.468 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.469 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.469 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.469 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.469 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.469 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.469 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.469 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.470 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.470 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.470 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.470 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.470 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.470 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.471 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.471 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.471 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.471 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.471 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.471 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.471 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.472 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.473 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.473 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.473 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.473 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.473 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.473 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.474 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.474 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.474 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.474 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.474 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.474 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.475 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.475 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.475 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.475 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.475 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.475 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.475 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.476 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.476 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.476 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.476 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.476 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.476 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.476 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.477 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.477 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.477 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.477 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.477 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.477 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.477 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.478 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.478 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.478 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.478 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.478 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.478 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.478 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.479 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.479 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.479 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.479 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.479 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.479 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.480 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.480 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.480 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.480 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.480 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.480 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.480 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.481 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.482 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.482 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.482 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.482 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.482 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.482 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.482 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.483 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.483 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.483 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.483 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.483 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.483 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.484 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.484 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.484 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.484 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.484 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.484 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.485 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.486 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.486 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.486 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.486 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.486 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.486 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.487 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.487 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.487 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.487 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.487 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.487 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.487 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.488 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.488 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.488 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.488 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.488 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.488 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.488 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.489 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.489 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.489 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.489 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.489 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.490 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.490 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.490 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.490 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.490 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.490 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.491 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.491 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.491 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.491 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.491 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.491 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.491 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.492 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.492 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.492 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.492 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.492 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.492 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.492 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.493 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.493 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.493 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.493 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.493 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.493 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.493 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.494 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.494 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.494 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.494 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.494 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.494 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.494 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.495 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.495 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.495 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.495 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.495 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.495 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.495 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.496 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.497 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.497 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.497 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.497 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.497 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.497 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.498 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.498 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.498 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.498 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.498 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.499 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.499 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.499 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.499 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.499 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.500 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.500 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.500 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.500 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.501 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.501 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.501 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.501 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.501 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.501 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.502 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.502 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.502 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.502 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.502 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.503 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.503 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.503 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.503 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.503 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.504 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.504 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.504 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.504 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.504 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.505 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.505 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.505 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.505 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.505 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.506 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.506 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.506 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.506 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.506 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.507 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.507 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.507 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.507 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.508 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.508 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.508 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.508 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.508 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.509 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.509 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.509 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.509 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.510 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.510 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.510 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.510 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.511 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.511 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.511 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.511 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.511 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.512 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.512 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.512 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.512 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.513 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.513 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.513 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.513 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.513 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.514 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.514 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.514 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.514 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.515 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.515 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.515 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.515 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.515 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.516 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.516 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.516 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.516 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.517 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.517 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.517 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.517 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.517 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.518 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.518 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.518 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.518 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.519 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.519 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.519 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.519 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.520 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.520 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.520 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.520 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.520 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.521 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.521 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.521 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.521 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.522 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.522 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.522 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.522 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.522 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.523 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.523 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.523 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.523 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.523 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.524 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.524 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.524 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.524 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.524 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.525 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.525 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.525 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.525 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.525 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.525 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.525 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.526 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.526 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.526 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.526 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.526 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.526 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.526 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.527 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.527 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.527 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.527 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.527 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.527 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.527 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.528 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.528 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.528 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.528 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.528 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.528 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.528 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.529 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.529 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.529 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.529 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.529 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.529 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.529 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.530 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.530 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.530 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.530 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.530 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.530 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.530 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.531 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.531 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.531 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.531 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.531 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.531 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.531 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.532 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.533 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.533 2 DEBUG oslo_service.service [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.534 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.552 2 INFO nova.virt.node [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Determined node identity 2da4b4c2-8401-4cdb-85a2-115635137a6d from /var/lib/nova/compute_id
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.553 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.553 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.554 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.554 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.562 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f1f75846eb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.564 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f1f75846eb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.564 2 INFO nova.virt.libvirt.driver [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Connection event '1' reason 'None'
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.578 2 INFO nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Libvirt host capabilities <capabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <host>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <uuid>03f5bc8b-edfd-405c-8f42-0ac9afa0b79f</uuid>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <arch>x86_64</arch>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model>EPYC-Rome-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <vendor>AMD</vendor>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <microcode version='16777317'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <signature family='23' model='49' stepping='0'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='x2apic'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='tsc-deadline'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='osxsave'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='hypervisor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='tsc_adjust'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='spec-ctrl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='stibp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='arch-capabilities'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='cmp_legacy'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='topoext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='virt-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='lbrv'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='tsc-scale'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='vmcb-clean'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='pause-filter'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='pfthreshold'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='svme-addr-chk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='rdctl-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='skip-l1dfl-vmentry'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='mds-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature name='pschange-mc-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <pages unit='KiB' size='4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <pages unit='KiB' size='2048'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <pages unit='KiB' size='1048576'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <power_management>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <suspend_mem/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <suspend_disk/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <suspend_hybrid/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </power_management>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <iommu support='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <migration_features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <live/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <uri_transports>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <uri_transport>tcp</uri_transport>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <uri_transport>rdma</uri_transport>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </uri_transports>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </migration_features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <topology>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <cells num='1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <cell id='0'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           <memory unit='KiB'>16116612</memory>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           <pages unit='KiB' size='4'>4029153</pages>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           <pages unit='KiB' size='2048'>0</pages>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           <distances>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <sibling id='0' value='10'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           </distances>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           <cpus num='8'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:           </cpus>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         </cell>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </cells>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </topology>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <cache>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </cache>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <secmodel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model>selinux</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <doi>0</doi>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </secmodel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <secmodel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model>dac</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <doi>0</doi>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </secmodel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </host>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <guest>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <os_type>hvm</os_type>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <arch name='i686'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <wordsize>32</wordsize>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <domain type='qemu'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <domain type='kvm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </arch>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <pae/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <nonpae/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <acpi default='on' toggle='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <apic default='on' toggle='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <cpuselection/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <deviceboot/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <disksnapshot default='on' toggle='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <externalSnapshot/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </guest>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <guest>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <os_type>hvm</os_type>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <arch name='x86_64'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <wordsize>64</wordsize>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <domain type='qemu'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <domain type='kvm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </arch>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <acpi default='on' toggle='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <apic default='on' toggle='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <cpuselection/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <deviceboot/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <disksnapshot default='on' toggle='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <externalSnapshot/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </guest>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: </capabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.580 2 DEBUG nova.virt.libvirt.volume.mount [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.583 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.594 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: <domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <domain>kvm</domain>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <arch>i686</arch>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <vcpu max='1024'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <iothreads supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <os supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='firmware'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <loader supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>rom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pflash</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='readonly'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>yes</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='secure'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </loader>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </os>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='maximumMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <vendor>AMD</vendor>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='succor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='custom' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-128'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-256'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-512'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <memoryBacking supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='sourceType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>file</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>anonymous</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>memfd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </memoryBacking>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <disk supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='diskDevice'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>disk</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cdrom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>floppy</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>lun</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>fdc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>sata</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </disk>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <graphics supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vnc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egl-headless</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>dbus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </graphics>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <video supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='modelType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vga</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cirrus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>none</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>bochs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ramfb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </video>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hostdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='mode'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>subsystem</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='startupPolicy'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>mandatory</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>requisite</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>optional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='subsysType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pci</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='capsType'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='pciBackend'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hostdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <rng supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>random</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </rng>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <filesystem supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='driverType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>path</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>handle</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtiofs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </filesystem>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <tpm supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-tis</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-crb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emulator</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>external</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendVersion'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>2.0</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </tpm>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <redirdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </redirdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <channel supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pty</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>unix</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </channel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <crypto supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>qemu</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </crypto>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <interface supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>passt</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </interface>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <panic supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>isa</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>hyperv</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </panic>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <gic supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <genid supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backup supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <async-teardown supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <ps2 supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sev supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sgx supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hyperv supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='features'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>relaxed</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vapic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>spinlocks</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vpindex</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>runtime</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>synic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>stimer</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reset</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vendor_id</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>frequencies</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reenlightenment</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tlbflush</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ipi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>avic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emsr_bitmap</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>xmm_input</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hyperv>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <launchSecurity supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: </domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.598 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: <domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <domain>kvm</domain>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <arch>i686</arch>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <vcpu max='240'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <iothreads supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <os supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='firmware'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <loader supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>rom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pflash</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='readonly'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>yes</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='secure'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </loader>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </os>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='maximumMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <vendor>AMD</vendor>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='succor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='custom' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-128'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-256'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-512'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <memoryBacking supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='sourceType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>file</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>anonymous</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>memfd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </memoryBacking>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <disk supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='diskDevice'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>disk</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cdrom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>floppy</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>lun</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ide</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>fdc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>sata</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </disk>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <graphics supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vnc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egl-headless</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>dbus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </graphics>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <video supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='modelType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vga</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cirrus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>none</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>bochs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ramfb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </video>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hostdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='mode'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>subsystem</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='startupPolicy'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>mandatory</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>requisite</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>optional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='subsysType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pci</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='capsType'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='pciBackend'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hostdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <rng supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>random</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </rng>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <filesystem supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='driverType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>path</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>handle</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtiofs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </filesystem>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <tpm supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-tis</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-crb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emulator</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>external</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendVersion'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>2.0</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </tpm>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <redirdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </redirdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <channel supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pty</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>unix</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </channel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <crypto supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>qemu</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </crypto>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <interface supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>passt</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </interface>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <panic supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>isa</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>hyperv</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </panic>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <gic supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <genid supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backup supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <async-teardown supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <ps2 supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sev supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sgx supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hyperv supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='features'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>relaxed</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vapic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>spinlocks</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vpindex</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>runtime</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>synic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>stimer</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reset</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vendor_id</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>frequencies</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reenlightenment</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tlbflush</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ipi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>avic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emsr_bitmap</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>xmm_input</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hyperv>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <launchSecurity supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: </domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.622 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.627 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: <domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <domain>kvm</domain>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <arch>x86_64</arch>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <vcpu max='1024'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <iothreads supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <os supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='firmware'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>efi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <loader supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>rom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pflash</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='readonly'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>yes</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='secure'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>yes</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </loader>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </os>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='maximumMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <vendor>AMD</vendor>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='succor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='custom' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-128'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-256'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-512'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <memoryBacking supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='sourceType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>file</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>anonymous</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>memfd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </memoryBacking>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <disk supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='diskDevice'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>disk</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cdrom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>floppy</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>lun</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>fdc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>sata</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </disk>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <graphics supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vnc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egl-headless</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>dbus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </graphics>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <video supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='modelType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vga</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cirrus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>none</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>bochs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ramfb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </video>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hostdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='mode'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>subsystem</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='startupPolicy'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>mandatory</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>requisite</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>optional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='subsysType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pci</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='capsType'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='pciBackend'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hostdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <rng supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>random</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </rng>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <filesystem supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='driverType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>path</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>handle</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtiofs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </filesystem>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <tpm supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-tis</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-crb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emulator</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>external</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendVersion'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>2.0</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </tpm>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <redirdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </redirdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <channel supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pty</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>unix</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </channel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <crypto supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>qemu</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </crypto>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <interface supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>passt</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </interface>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <panic supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>isa</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>hyperv</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </panic>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <gic supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <genid supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backup supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <async-teardown supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <ps2 supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sev supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sgx supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hyperv supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='features'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>relaxed</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vapic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>spinlocks</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vpindex</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>runtime</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>synic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>stimer</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reset</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vendor_id</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>frequencies</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reenlightenment</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tlbflush</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ipi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>avic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emsr_bitmap</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>xmm_input</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hyperv>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <launchSecurity supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: </domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.680 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: <domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <domain>kvm</domain>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <arch>x86_64</arch>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <vcpu max='240'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <iothreads supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <os supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='firmware'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <loader supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>rom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pflash</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='readonly'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>yes</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='secure'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>no</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </loader>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </os>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='maximumMigratable'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>on</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>off</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <vendor>AMD</vendor>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='succor'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <mode name='custom' supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Denverton-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='auto-ibrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amd-psfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='stibp-always-on'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='EPYC-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-128'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-256'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx10-512'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='prefetchiti'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Haswell-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512er'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512pf'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fma4'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tbm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xop'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='amx-tile'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-bf16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-fp16'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bitalg'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrc'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fzrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='la57'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='taa-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xfd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ifma'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cmpccxadd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fbsdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='fsrs'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ibrs-all'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mcdt-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pbrsb-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='psdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='serialize'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vaes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='hle'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='rtm'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512bw'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512cd'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512dq'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512f'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='avx512vl'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='invpcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pcid'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='pku'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='mpx'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='core-capability'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='split-lock-detect'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='cldemote'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='erms'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='gfni'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdir64b'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='movdiri'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='xsaves'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='athlon-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='core2duo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='coreduo-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='n270-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='ss'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <blockers model='phenom-v1'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnow'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <feature name='3dnowext'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </blockers>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </mode>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </cpu>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <memoryBacking supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <enum name='sourceType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>file</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>anonymous</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <value>memfd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </memoryBacking>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <disk supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='diskDevice'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>disk</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cdrom</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>floppy</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>lun</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ide</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>fdc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>sata</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </disk>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <graphics supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vnc</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egl-headless</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>dbus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </graphics>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <video supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='modelType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vga</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>cirrus</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>none</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>bochs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ramfb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </video>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hostdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='mode'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>subsystem</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='startupPolicy'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>mandatory</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>requisite</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>optional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='subsysType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pci</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>scsi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='capsType'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='pciBackend'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hostdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <rng supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtio-non-transitional</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>random</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>egd</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </rng>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <filesystem supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='driverType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>path</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>handle</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>virtiofs</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </filesystem>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <tpm supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-tis</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tpm-crb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emulator</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>external</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendVersion'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>2.0</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </tpm>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <redirdev supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='bus'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>usb</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </redirdev>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <channel supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>pty</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>unix</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </channel>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <crypto supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='type'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>qemu</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendModel'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>builtin</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </crypto>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <interface supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='backendType'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>default</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>passt</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </interface>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <panic supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='model'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>isa</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>hyperv</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </panic>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </devices>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   <features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <gic supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <genid supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <backup supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <async-teardown supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <ps2 supported='yes'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sev supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <sgx supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <hyperv supported='yes'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       <enum name='features'>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>relaxed</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vapic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>spinlocks</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vpindex</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>runtime</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>synic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>stimer</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reset</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>vendor_id</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>frequencies</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>reenlightenment</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>tlbflush</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>ipi</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>avic</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>emsr_bitmap</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:         <value>xmm_input</value>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:       </enum>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     </hyperv>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:     <launchSecurity supported='no'/>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:   </features>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: </domainCapabilities>
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.729 2 DEBUG nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.730 2 INFO nova.virt.libvirt.host [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Secure Boot support detected
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.731 2 INFO nova.virt.libvirt.driver [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.732 2 INFO nova.virt.libvirt.driver [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.745 2 DEBUG nova.virt.libvirt.driver [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.784 2 INFO nova.virt.node [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Determined node identity 2da4b4c2-8401-4cdb-85a2-115635137a6d from /var/lib/nova/compute_id
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.802 2 DEBUG nova.compute.manager [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Verified node 2da4b4c2-8401-4cdb-85a2-115635137a6d matches my host np0005486759.ooo.test _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.857 2 DEBUG nova.compute.manager [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.861 2 DEBUG nova.virt.libvirt.vif [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005486759.ooo.test',hostname='test',id=1,image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-14T08:45:21Z,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005486759.ooo.test',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8bf64e81a4214f9490d231a2e79ab3d8',ramdisk_id='',reservation_id='r-8vq1axpu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-14T08:45:21Z,user_data=None,user_id='2aff2e6f927a42b1b822d05cd9349762',uuid=4408214d-dae5-4452-92e9-eb4abd6589d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.862 2 DEBUG nova.network.os_vif_util [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Converting VIF {"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.863 2 DEBUG nova.network.os_vif_util [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.864 2 DEBUG os_vif [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.913 2 DEBUG ovsdbapp.backend.ovs_idl [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.913 2 DEBUG ovsdbapp.backend.ovs_idl [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.913 2 DEBUG ovsdbapp.backend.ovs_idl [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.932 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.933 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:41:14 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:14.934 2 INFO oslo.privsep.daemon [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp3s2u8480/privsep.sock']
Oct 14 09:41:15 np0005486759.ooo.test sudo[255043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdwyylaoulhdrqvmaidxoyqnejajkmnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434874.9260466-1785-224307424607516/AnsiballZ_podman_container.py
Oct 14 09:41:15 np0005486759.ooo.test sudo[255043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.490 2 INFO oslo.privsep.daemon [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Spawned new privsep daemon via rootwrap
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.393 40 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.397 40 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.399 40 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.400 40 INFO oslo.privsep.daemon [-] privsep daemon running as pid 40
Oct 14 09:41:15 np0005486759.ooo.test python3.9[255045]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 14 09:41:15 np0005486759.ooo.test sudo[255043]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:15 np0005486759.ooo.test systemd-journald[35787]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 120.1 (400 of 333 items), suggesting rotation.
Oct 14 09:41:15 np0005486759.ooo.test systemd-journald[35787]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 09:41:15 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee08de8-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeee08de8-f9, col_values=(('external_ids', {'iface-id': 'eee08de8-f983-4ebe-a654-f67f48659e50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:cf:16', 'vm-uuid': '4408214d-dae5-4452-92e9-eb4abd6589d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.777 2 INFO os_vif [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9')
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.778 2 DEBUG nova.compute.manager [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.781 2 DEBUG nova.compute.manager [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Oct 14 09:41:15 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:15.782 2 INFO nova.compute.manager [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 14 09:41:15 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:41:16 np0005486759.ooo.test sudo[255181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoxlyofkgtfcrenvlcrveepspoqozfdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434875.9691308-1793-253772163146033/AnsiballZ_systemd.py
Oct 14 09:41:16 np0005486759.ooo.test sudo[255181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.538 2 INFO nova.service [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Updating service version for nova-compute on np0005486759.ooo.test from 57 to 66
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.570 2 DEBUG oslo_concurrency.lockutils [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.570 2 DEBUG oslo_concurrency.lockutils [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.570 2 DEBUG oslo_concurrency.lockutils [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.571 2 DEBUG nova.compute.resource_tracker [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:41:16 np0005486759.ooo.test python3.9[255183]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.634 2 DEBUG oslo_concurrency.processutils [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:16 np0005486759.ooo.test systemd[1]: Stopping nova_compute container...
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.727 2 DEBUG oslo_concurrency.processutils [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.727 2 DEBUG oslo_concurrency.processutils [None req-264d4e6d-f2c0-4c95-af3b-669c9cba2e9a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:16 np0005486759.ooo.test systemd[1]: tmp-crun.mPG6uu.mount: Deactivated successfully.
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.743 2 DEBUG oslo_concurrency.lockutils [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.743 2 DEBUG oslo_concurrency.lockutils [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:41:16 np0005486759.ooo.test nova_compute[254566]: 2025-10-14 09:41:16.744 2 DEBUG oslo_concurrency.lockutils [None req-de0265cf-750c-40d3-9e09-f569bc802ad2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:41:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33034 DF PROTO=TCP SPT=59774 DPT=9105 SEQ=2408086409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F30810000000001030307) 
Oct 14 09:41:17 np0005486759.ooo.test virtqemud[225922]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 14 09:41:17 np0005486759.ooo.test virtqemud[225922]: hostname: np0005486759.ooo.test
Oct 14 09:41:17 np0005486759.ooo.test virtqemud[225922]: End of file while reading data: Input/output error
Oct 14 09:41:17 np0005486759.ooo.test systemd[1]: libpod-f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e.scope: Deactivated successfully.
Oct 14 09:41:17 np0005486759.ooo.test systemd[1]: libpod-f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e.scope: Consumed 3.830s CPU time.
Oct 14 09:41:17 np0005486759.ooo.test podman[255188]: 2025-10-14 09:41:17.173533257 +0000 UTC m=+0.495506121 container died f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm)
Oct 14 09:41:17 np0005486759.ooo.test systemd[1]: tmp-crun.P1Rhx8.mount: Deactivated successfully.
Oct 14 09:41:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e-userdata-shm.mount: Deactivated successfully.
Oct 14 09:41:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16-merged.mount: Deactivated successfully.
Oct 14 09:41:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44515 DF PROTO=TCP SPT=51996 DPT=9882 SEQ=3106936231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F39810000000001030307) 
Oct 14 09:41:21 np0005486759.ooo.test podman[255188]: 2025-10-14 09:41:21.288383331 +0000 UTC m=+4.610356225 container cleanup f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible)
Oct 14 09:41:21 np0005486759.ooo.test podman[255188]: nova_compute
Oct 14 09:41:21 np0005486759.ooo.test podman[255206]: 2025-10-14 09:41:21.293551371 +0000 UTC m=+4.116527094 container cleanup f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 09:41:21 np0005486759.ooo.test podman[255475]: 2025-10-14 09:41:21.372309852 +0000 UTC m=+0.046531038 container cleanup f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:41:21 np0005486759.ooo.test podman[255475]: nova_compute
Oct 14 09:41:21 np0005486759.ooo.test systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 14 09:41:21 np0005486759.ooo.test systemd[1]: Stopped nova_compute container.
Oct 14 09:41:21 np0005486759.ooo.test systemd[1]: Starting nova_compute container...
Oct 14 09:41:21 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:41:21 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:21 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:21 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:21 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:21 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:21 np0005486759.ooo.test podman[255488]: 2025-10-14 09:41:21.510755132 +0000 UTC m=+0.110541843 container init f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 09:41:21 np0005486759.ooo.test podman[255488]: 2025-10-14 09:41:21.519393052 +0000 UTC m=+0.119179793 container start f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:41:21 np0005486759.ooo.test podman[255488]: nova_compute
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + sudo -E kolla_set_configs
Oct 14 09:41:21 np0005486759.ooo.test systemd[1]: Started nova_compute container.
Oct 14 09:41:21 np0005486759.ooo.test sudo[255181]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Validating config file
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying service configuration files
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /etc/ceph
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Creating directory /etc/ceph
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /etc/ceph
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Writing out command to execute
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: ++ cat /run_command
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + CMD=nova-compute
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + ARGS=
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + sudo kolla_copy_cacerts
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + [[ ! -n '' ]]
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + . kolla_extend_start
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: Running command: 'nova-compute'
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + echo 'Running command: '\''nova-compute'\'''
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + umask 0022
Oct 14 09:41:21 np0005486759.ooo.test nova_compute[255504]: + exec nova-compute
Oct 14 09:41:22 np0005486759.ooo.test sudo[255623]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpxscxfnoeavknekyvowpptjqrqgwazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434881.7652183-1802-173697554823414/AnsiballZ_podman_container.py
Oct 14 09:41:22 np0005486759.ooo.test sudo[255623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:22 np0005486759.ooo.test python3.9[255625]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 14 09:41:22 np0005486759.ooo.test systemd[1]: Started libpod-conmon-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227.scope.
Oct 14 09:41:22 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:41:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63053 DF PROTO=TCP SPT=45346 DPT=9102 SEQ=1946715869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F46010000000001030307) 
Oct 14 09:41:22 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:22 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:22 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 09:41:22 np0005486759.ooo.test podman[255651]: 2025-10-14 09:41:22.580576324 +0000 UTC m=+0.134913278 container init 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 14 09:41:22 np0005486759.ooo.test podman[255651]: 2025-10-14 09:41:22.589610375 +0000 UTC m=+0.143947269 container start 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 09:41:22 np0005486759.ooo.test python3.9[255625]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Applying nova statedir ownership
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4 already 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4 to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.info
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/console.log
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/d4dee7ea20c47bbf691f78ae3efd9dd29eccd913
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-d4dee7ea20c47bbf691f78ae3efd9dd29eccd913
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-storage-registry-lock
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/compute_nodes
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/7dbe5bae7bc27ef07490c629ec1f09edaa9e8c135ff89c3f08f1e44f39cf5928
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/9469aff02825a9e3dcdb3ceeb358f8d540dc07c8b6e9cd975f170399051d29c3
Oct 14 09:41:22 np0005486759.ooo.test nova_compute_init[255672]: INFO:nova_statedir:Nova statedir ownership complete
Oct 14 09:41:22 np0005486759.ooo.test systemd[1]: libpod-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227.scope: Deactivated successfully.
Oct 14 09:41:22 np0005486759.ooo.test podman[255673]: 2025-10-14 09:41:22.684106092 +0000 UTC m=+0.076473296 container died 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=nova_compute_init, org.label-schema.license=GPLv2)
Oct 14 09:41:22 np0005486759.ooo.test podman[255685]: 2025-10-14 09:41:22.733914314 +0000 UTC m=+0.077024682 container cleanup 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Oct 14 09:41:22 np0005486759.ooo.test systemd[1]: libpod-conmon-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227.scope: Deactivated successfully.
Oct 14 09:41:22 np0005486759.ooo.test sudo[255623]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:23 np0005486759.ooo.test sshd[230015]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:41:23 np0005486759.ooo.test systemd[1]: session-36.scope: Deactivated successfully.
Oct 14 09:41:23 np0005486759.ooo.test systemd[1]: session-36.scope: Consumed 2min 19.635s CPU time.
Oct 14 09:41:23 np0005486759.ooo.test systemd-logind[759]: Session 36 logged out. Waiting for processes to exit.
Oct 14 09:41:23 np0005486759.ooo.test systemd-logind[759]: Removed session 36.
Oct 14 09:41:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:23.348 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:41:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:23.348 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:41:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:23.348 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:41:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:23.349 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 14 09:41:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:23.466 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:23.488 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:41:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7-merged.mount: Deactivated successfully.
Oct 14 09:41:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227-userdata-shm.mount: Deactivated successfully.
Oct 14 09:41:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:23.975 2 INFO nova.virt.driver [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.093 2 INFO nova.compute.provider_config [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.104 2 WARNING nova.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.104 2 DEBUG oslo_concurrency.lockutils [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.104 2 DEBUG oslo_concurrency.lockutils [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.104 2 DEBUG oslo_concurrency.lockutils [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.105 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.105 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.105 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.105 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.105 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.105 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.106 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.106 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.106 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.106 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.106 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.106 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.106 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.107 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.107 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.107 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.107 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.107 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.107 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.107 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] console_host                   = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.108 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.108 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.108 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.108 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.108 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.108 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.108 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.109 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.109 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.109 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.109 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.109 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.109 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.110 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.110 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.110 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.110 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.110 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.110 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.110 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.111 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.111 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.111 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.111 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.111 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.111 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.111 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.112 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.112 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.112 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.112 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.112 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.112 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.113 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.114 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.114 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.114 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.114 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.114 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.114 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.114 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.115 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.116 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.116 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.116 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.116 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.116 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.116 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.116 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.117 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.117 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.117 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.117 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.117 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.117 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.118 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.119 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.119 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.119 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.119 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.119 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.119 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.119 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.120 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.120 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.120 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.120 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.120 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.120 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.120 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.121 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.122 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.122 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.122 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.122 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.122 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.122 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.122 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.123 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.124 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.124 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.124 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.124 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.124 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.124 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.124 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.125 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.125 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.125 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.125 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.125 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.125 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.125 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.126 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.126 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.126 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.126 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.126 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.126 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.126 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.127 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.127 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.127 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.127 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.127 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.127 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.128 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.128 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.128 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.128 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.128 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.128 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.128 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.129 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.129 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.129 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.129 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.129 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.129 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.129 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.130 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.130 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.130 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.130 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.130 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.130 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.131 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.131 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.131 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.131 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.131 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.131 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.131 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.132 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.132 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.132 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.132 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.132 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.132 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.132 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.133 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.133 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.133 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.133 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.133 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.133 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.133 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.134 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.135 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.135 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.135 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.135 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.135 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.135 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.135 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.136 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.136 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.136 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.136 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.136 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.136 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.136 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.137 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.138 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.138 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.138 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.138 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.138 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.138 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.138 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.139 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.139 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.139 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.139 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.139 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.139 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.140 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.140 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.140 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.140 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.140 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.140 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.140 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.141 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.142 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.142 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.142 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.142 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.142 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.142 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.143 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.143 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.143 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.143 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.143 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.143 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.144 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.144 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.144 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.144 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.144 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.144 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.144 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.145 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.145 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.145 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.145 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.145 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.145 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.145 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.146 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.146 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.146 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.146 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.146 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.146 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.147 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.147 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.147 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.147 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.147 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.147 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.147 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.148 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.148 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.148 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.148 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.148 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.149 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.150 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.150 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.150 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.150 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.150 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.150 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.150 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.151 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.151 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.151 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.151 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.151 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.151 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.152 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.152 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.152 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.152 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.152 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.153 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.153 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.153 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.153 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.153 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.153 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.154 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.154 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.154 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.154 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.154 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.154 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.155 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.155 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.155 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.155 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.155 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.155 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.155 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.156 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.156 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.156 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.156 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.156 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.156 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.157 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.157 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.157 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.157 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.157 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.157 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.157 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.158 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.158 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.158 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.158 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.158 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.158 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.158 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.159 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.160 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.160 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.160 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.160 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.160 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.160 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.160 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.161 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.161 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.161 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.161 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.161 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.161 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.162 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.162 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.162 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.162 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.162 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.162 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.162 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.163 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.163 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.163 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.163 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.163 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.163 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.164 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.164 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.164 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.164 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.164 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.164 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.165 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.166 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.166 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.166 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.166 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.166 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.166 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.166 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.167 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.167 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.167 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.167 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.167 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.167 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.167 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.168 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.168 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.168 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.168 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.168 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.168 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.168 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.169 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.170 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.170 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.170 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.170 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.170 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.170 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.170 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.171 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.171 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.171 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.171 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.171 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.171 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.171 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.172 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.172 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.172 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.172 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.172 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.172 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.172 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.173 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.173 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.173 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.173 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.173 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.173 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.173 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.174 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.174 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.174 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.174 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.174 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.174 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.174 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.175 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.175 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.175 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.175 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.175 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.175 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.175 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.176 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.176 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.176 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.176 2 WARNING oslo_config.cfg [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: and ``live_migration_inbound_addr`` respectively.
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: ).  Its value may be silently ignored in the future.
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.176 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.176 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.177 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.177 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.177 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.177 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.177 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.177 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.178 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.178 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.178 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.178 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.178 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.178 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.178 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.179 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.179 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.179 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.179 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.179 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.179 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.179 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.180 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.180 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.180 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.180 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.180 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.180 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.180 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.181 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.181 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.182 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.182 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.182 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.182 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.182 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.183 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.183 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.183 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.183 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.183 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.183 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.183 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.184 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.184 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.184 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.184 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.184 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.185 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.185 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.185 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.185 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.185 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.185 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.185 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.186 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.186 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.186 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.186 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.186 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.186 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.186 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.187 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.187 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.187 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.187 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.187 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.187 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.187 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.188 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.188 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.188 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.188 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.188 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.188 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.188 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.189 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.190 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.190 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.190 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.190 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.190 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.190 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.191 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.191 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.191 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.191 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.191 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.191 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.191 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.192 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.192 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.192 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.192 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.192 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.192 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.193 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.194 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.194 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.194 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.194 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.194 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.194 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.195 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.196 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.196 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.196 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.196 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.196 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.196 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.197 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.197 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.197 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.197 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.197 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.197 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.198 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.198 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.198 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.198 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.198 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.198 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.199 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.199 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.199 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.199 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.199 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.199 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.199 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.200 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.200 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.200 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.200 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.200 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.200 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.200 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.201 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.201 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.201 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.201 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.201 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.201 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.201 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.202 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.202 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.202 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.202 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.202 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.202 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.202 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.203 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.203 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.203 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.203 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.203 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.203 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.203 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.204 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.204 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.204 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.204 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.204 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.204 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.205 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.205 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.205 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.205 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.205 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.205 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.205 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.206 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.206 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.206 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.206 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.206 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.206 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.206 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.207 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.207 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.207 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.207 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.207 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.207 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.208 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.209 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.209 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.209 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.209 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.209 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.209 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.209 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.210 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.211 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.211 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.211 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.211 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.211 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.211 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.211 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.212 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.212 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.212 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.212 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.212 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.212 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.213 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.213 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.213 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.213 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.213 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.213 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.214 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.214 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.214 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.214 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.214 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.214 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.215 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.215 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.215 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.215 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.215 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.215 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.215 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.216 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.216 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.216 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.216 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.216 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.216 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.216 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.217 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.217 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.217 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.217 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.217 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.217 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.218 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.218 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.218 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.218 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.218 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.218 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.218 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.219 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.220 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.220 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.220 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.220 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.220 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.220 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.221 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.221 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.221 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.221 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.221 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.221 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.222 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.222 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.222 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.222 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.222 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.222 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.222 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.223 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.223 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.223 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.223 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.223 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.223 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.223 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.224 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.225 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.225 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.225 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.225 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.225 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.225 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.225 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.226 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.226 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.226 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.226 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.226 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.226 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.226 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.227 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.227 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.227 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.227 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.227 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.227 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.227 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.228 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.228 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.228 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.228 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.228 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.228 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.229 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.230 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.230 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.230 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.230 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.230 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.230 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.230 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.231 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.231 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.231 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.231 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.231 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.231 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.231 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.232 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.232 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.232 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.232 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.232 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.232 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.233 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.233 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.233 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.233 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.233 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.233 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.233 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.234 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.234 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.234 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.234 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.234 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.234 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.234 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.235 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.235 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.235 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.235 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.235 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.235 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.236 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.236 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.236 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.236 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.236 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.236 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.236 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.237 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.238 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.238 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.238 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.238 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.238 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.239 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.239 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.239 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.239 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.239 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.239 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.239 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.240 2 DEBUG oslo_service.service [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.240 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.281 2 INFO nova.virt.node [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Determined node identity 2da4b4c2-8401-4cdb-85a2-115635137a6d from /var/lib/nova/compute_id
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.282 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.283 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.283 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.283 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.293 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc027702520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.295 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc027702520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.296 2 INFO nova.virt.libvirt.driver [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Connection event '1' reason 'None'
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.303 2 INFO nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Libvirt host capabilities <capabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <host>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <uuid>03f5bc8b-edfd-405c-8f42-0ac9afa0b79f</uuid>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <arch>x86_64</arch>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model>EPYC-Rome-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <vendor>AMD</vendor>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <microcode version='16777317'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <signature family='23' model='49' stepping='0'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='x2apic'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='tsc-deadline'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='osxsave'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='hypervisor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='tsc_adjust'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='spec-ctrl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='stibp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='arch-capabilities'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='cmp_legacy'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='topoext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='virt-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='lbrv'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='tsc-scale'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='vmcb-clean'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='pause-filter'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='pfthreshold'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='svme-addr-chk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='rdctl-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='skip-l1dfl-vmentry'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='mds-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature name='pschange-mc-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <pages unit='KiB' size='4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <pages unit='KiB' size='2048'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <pages unit='KiB' size='1048576'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <power_management>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <suspend_mem/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <suspend_disk/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <suspend_hybrid/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </power_management>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <iommu support='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <migration_features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <live/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <uri_transports>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <uri_transport>tcp</uri_transport>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <uri_transport>rdma</uri_transport>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </uri_transports>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </migration_features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <topology>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <cells num='1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <cell id='0'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           <memory unit='KiB'>16116612</memory>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           <pages unit='KiB' size='4'>4029153</pages>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           <pages unit='KiB' size='2048'>0</pages>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           <distances>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <sibling id='0' value='10'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           </distances>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           <cpus num='8'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:           </cpus>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         </cell>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </cells>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </topology>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <cache>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </cache>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <secmodel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model>selinux</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <doi>0</doi>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </secmodel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <secmodel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model>dac</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <doi>0</doi>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </secmodel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </host>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <guest>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <os_type>hvm</os_type>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <arch name='i686'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <wordsize>32</wordsize>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <domain type='qemu'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <domain type='kvm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </arch>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <pae/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <nonpae/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <acpi default='on' toggle='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <apic default='on' toggle='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <cpuselection/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <deviceboot/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <disksnapshot default='on' toggle='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <externalSnapshot/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </guest>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <guest>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <os_type>hvm</os_type>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <arch name='x86_64'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <wordsize>64</wordsize>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <domain type='qemu'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <domain type='kvm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </arch>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <acpi default='on' toggle='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <apic default='on' toggle='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <cpuselection/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <deviceboot/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <disksnapshot default='on' toggle='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <externalSnapshot/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </guest>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: </capabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.309 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.313 2 DEBUG nova.virt.libvirt.volume.mount [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.314 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: <domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <domain>kvm</domain>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <arch>i686</arch>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <vcpu max='1024'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <iothreads supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <os supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='firmware'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <loader supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>rom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pflash</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='readonly'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>yes</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='secure'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </loader>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </os>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='maximumMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <vendor>AMD</vendor>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='succor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='custom' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-128'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-256'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-512'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <memoryBacking supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='sourceType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>file</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>anonymous</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>memfd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </memoryBacking>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <disk supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='diskDevice'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>disk</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cdrom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>floppy</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>lun</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>fdc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>sata</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </disk>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <graphics supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vnc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egl-headless</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>dbus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </graphics>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <video supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='modelType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vga</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cirrus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>none</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>bochs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ramfb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </video>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hostdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='mode'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>subsystem</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='startupPolicy'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>mandatory</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>requisite</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>optional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='subsysType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pci</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='capsType'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='pciBackend'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hostdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <rng supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>random</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </rng>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <filesystem supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='driverType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>path</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>handle</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtiofs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </filesystem>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <tpm supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-tis</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-crb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emulator</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>external</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendVersion'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>2.0</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </tpm>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <redirdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </redirdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <channel supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pty</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>unix</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </channel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <crypto supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>qemu</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </crypto>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <interface supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>passt</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </interface>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <panic supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>isa</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>hyperv</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </panic>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <gic supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <genid supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backup supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <async-teardown supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <ps2 supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sev supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sgx supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hyperv supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='features'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>relaxed</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vapic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>spinlocks</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vpindex</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>runtime</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>synic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>stimer</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reset</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vendor_id</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>frequencies</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reenlightenment</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tlbflush</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ipi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>avic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emsr_bitmap</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>xmm_input</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hyperv>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <launchSecurity supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: </domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.321 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: <domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <domain>kvm</domain>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <arch>i686</arch>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <vcpu max='240'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <iothreads supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <os supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='firmware'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <loader supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>rom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pflash</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='readonly'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>yes</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='secure'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </loader>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </os>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='maximumMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <vendor>AMD</vendor>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='succor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='custom' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-128'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-256'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-512'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <memoryBacking supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='sourceType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>file</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>anonymous</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>memfd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </memoryBacking>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <disk supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='diskDevice'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>disk</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cdrom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>floppy</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>lun</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ide</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>fdc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>sata</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </disk>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <graphics supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vnc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egl-headless</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>dbus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </graphics>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <video supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='modelType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vga</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cirrus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>none</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>bochs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ramfb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </video>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hostdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='mode'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>subsystem</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='startupPolicy'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>mandatory</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>requisite</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>optional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='subsysType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pci</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='capsType'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='pciBackend'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hostdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <rng supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>random</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </rng>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <filesystem supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='driverType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>path</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>handle</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtiofs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </filesystem>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <tpm supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-tis</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-crb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emulator</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>external</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendVersion'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>2.0</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </tpm>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <redirdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </redirdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <channel supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pty</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>unix</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </channel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <crypto supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>qemu</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </crypto>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <interface supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>passt</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </interface>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <panic supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>isa</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>hyperv</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </panic>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <gic supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <genid supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backup supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <async-teardown supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <ps2 supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sev supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sgx supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hyperv supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='features'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>relaxed</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vapic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>spinlocks</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vpindex</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>runtime</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>synic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>stimer</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reset</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vendor_id</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>frequencies</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reenlightenment</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tlbflush</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ipi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>avic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emsr_bitmap</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>xmm_input</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hyperv>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <launchSecurity supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: </domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.343 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.347 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: <domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <domain>kvm</domain>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <arch>x86_64</arch>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <vcpu max='1024'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <iothreads supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <os supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='firmware'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>efi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <loader supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>rom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pflash</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='readonly'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>yes</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='secure'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>yes</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </loader>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </os>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='maximumMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <vendor>AMD</vendor>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='succor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='custom' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-128'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-256'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-512'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <memoryBacking supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='sourceType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>file</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>anonymous</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>memfd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </memoryBacking>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <disk supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='diskDevice'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>disk</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cdrom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>floppy</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>lun</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>fdc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>sata</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </disk>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <graphics supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vnc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egl-headless</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>dbus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </graphics>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <video supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='modelType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vga</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cirrus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>none</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>bochs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ramfb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </video>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hostdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='mode'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>subsystem</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='startupPolicy'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>mandatory</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>requisite</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>optional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='subsysType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pci</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='capsType'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='pciBackend'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hostdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <rng supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>random</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </rng>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <filesystem supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='driverType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>path</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>handle</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtiofs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </filesystem>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <tpm supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-tis</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-crb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emulator</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>external</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendVersion'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>2.0</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </tpm>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <redirdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </redirdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <channel supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pty</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>unix</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </channel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <crypto supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>qemu</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </crypto>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <interface supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>passt</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </interface>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <panic supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>isa</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>hyperv</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </panic>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <gic supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <genid supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backup supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <async-teardown supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <ps2 supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sev supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sgx supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hyperv supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='features'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>relaxed</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vapic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>spinlocks</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vpindex</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>runtime</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>synic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>stimer</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reset</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vendor_id</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>frequencies</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reenlightenment</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tlbflush</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ipi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>avic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emsr_bitmap</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>xmm_input</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hyperv>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <launchSecurity supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: </domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.396 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: <domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <domain>kvm</domain>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <arch>x86_64</arch>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <vcpu max='240'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <iothreads supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <os supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='firmware'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <loader supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>rom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pflash</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='readonly'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>yes</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='secure'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>no</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </loader>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </os>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='maximum' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='maximumMigratable'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>on</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>off</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='host-model' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <vendor>AMD</vendor>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='x2apic'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='stibp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='succor'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lbrv'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='mds-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <mode name='custom' supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Broadwell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Cooperlake-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Denverton-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Dhyana-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='auto-ibrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amd-psfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='no-nested-data-bp'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='null-sel-clr-base'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='stibp-always-on'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='EPYC-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-128'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-256'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx10-512'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='prefetchiti'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Haswell-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='IvyBridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='KnightsMill-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4fmaps'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-4vnniw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512er'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512pf'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fma4'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tbm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xop'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='amx-tile'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-bf16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-fp16'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bitalg'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vbmi2'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrc'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fzrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='la57'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='taa-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='tsx-ldtrk'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xfd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='SierraForest-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ifma'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-ne-convert'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx-vnni-int8'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='bus-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cmpccxadd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fbsdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='fsrs'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ibrs-all'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mcdt-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pbrsb-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='psdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='serialize'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vaes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='vpclmulqdq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='hle'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='rtm'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512bw'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512cd'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512dq'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512f'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='avx512vl'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='invpcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pcid'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='pku'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='mpx'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v2'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v3'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='core-capability'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='split-lock-detect'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='Snowridge-v4'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='cldemote'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='erms'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='gfni'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdir64b'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='movdiri'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='xsaves'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='athlon-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='core2duo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='coreduo-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='n270-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='ss'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <blockers model='phenom-v1'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnow'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <feature name='3dnowext'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </blockers>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </mode>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </cpu>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <memoryBacking supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <enum name='sourceType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>file</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>anonymous</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <value>memfd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </memoryBacking>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <disk supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='diskDevice'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>disk</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cdrom</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>floppy</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>lun</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ide</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>fdc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>sata</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </disk>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <graphics supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vnc</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egl-headless</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>dbus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </graphics>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <video supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='modelType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vga</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>cirrus</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>none</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>bochs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ramfb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </video>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hostdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='mode'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>subsystem</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='startupPolicy'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>mandatory</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>requisite</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>optional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='subsysType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pci</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>scsi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='capsType'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='pciBackend'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hostdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <rng supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtio-non-transitional</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>random</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>egd</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </rng>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <filesystem supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='driverType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>path</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>handle</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>virtiofs</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </filesystem>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <tpm supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-tis</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tpm-crb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emulator</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>external</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendVersion'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>2.0</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </tpm>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <redirdev supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='bus'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>usb</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </redirdev>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <channel supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>pty</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>unix</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </channel>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <crypto supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='type'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>qemu</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendModel'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>builtin</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </crypto>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <interface supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='backendType'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>default</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>passt</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </interface>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <panic supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='model'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>isa</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>hyperv</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </panic>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </devices>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   <features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <gic supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <vmcoreinfo supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <genid supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backingStoreInput supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <backup supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <async-teardown supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <ps2 supported='yes'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sev supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <sgx supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <hyperv supported='yes'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       <enum name='features'>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>relaxed</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vapic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>spinlocks</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vpindex</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>runtime</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>synic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>stimer</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reset</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>vendor_id</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>frequencies</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>reenlightenment</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>tlbflush</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>ipi</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>avic</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>emsr_bitmap</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:         <value>xmm_input</value>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:       </enum>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     </hyperv>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:     <launchSecurity supported='no'/>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:   </features>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: </domainCapabilities>
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.444 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.445 2 INFO nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Secure Boot support detected
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.446 2 INFO nova.virt.libvirt.driver [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.446 2 INFO nova.virt.libvirt.driver [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.456 2 DEBUG nova.virt.libvirt.driver [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.478 2 INFO nova.virt.node [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Determined node identity 2da4b4c2-8401-4cdb-85a2-115635137a6d from /var/lib/nova/compute_id
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.497 2 DEBUG nova.compute.manager [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Verified node 2da4b4c2-8401-4cdb-85a2-115635137a6d matches my host np0005486759.ooo.test _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.536 2 DEBUG nova.compute.manager [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.540 2 DEBUG nova.virt.libvirt.vif [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005486759.ooo.test',hostname='test',id=1,image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-14T08:45:21Z,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005486759.ooo.test',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8bf64e81a4214f9490d231a2e79ab3d8',ramdisk_id='',reservation_id='r-8vq1axpu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-14T08:45:21Z,user_data=None,user_id='2aff2e6f927a42b1b822d05cd9349762',uuid=4408214d-dae5-4452-92e9-eb4abd6589d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.540 2 DEBUG nova.network.os_vif_util [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Converting VIF {"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.541 2 DEBUG nova.network.os_vif_util [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.542 2 DEBUG os_vif [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.577 2 DEBUG ovsdbapp.backend.ovs_idl [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.577 2 DEBUG ovsdbapp.backend.ovs_idl [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.577 2 DEBUG ovsdbapp.backend.ovs_idl [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.600 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.600 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:41:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:24.601 2 INFO oslo.privsep.daemon [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpej761wwo/privsep.sock']
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.181 2 INFO oslo.privsep.daemon [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Spawned new privsep daemon via rootwrap
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.095 40 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.100 40 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.104 40 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.104 40 INFO oslo.privsep.daemon [-] privsep daemon running as pid 40
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.455 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee08de8-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.456 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeee08de8-f9, col_values=(('external_ids', {'iface-id': 'eee08de8-f983-4ebe-a654-f67f48659e50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:cf:16', 'vm-uuid': '4408214d-dae5-4452-92e9-eb4abd6589d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.457 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.457 2 INFO os_vif [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9')
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.458 2 DEBUG nova.compute.manager [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.462 2 DEBUG nova.compute.manager [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Oct 14 09:41:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:25.462 2 INFO nova.compute.manager [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.330 2 DEBUG oslo_concurrency.lockutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.331 2 DEBUG oslo_concurrency.lockutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.331 2 DEBUG oslo_concurrency.lockutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.331 2 DEBUG nova.compute.resource_tracker [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.428 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.502 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.503 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.577 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.579 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63054 DF PROTO=TCP SPT=45346 DPT=9102 SEQ=1946715869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F55C10000000001030307) 
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.657 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.658 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:41:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:26.702 2 DEBUG oslo_concurrency.processutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:41:26 np0005486759.ooo.test systemd[1]: Starting libvirt nodedev daemon...
Oct 14 09:41:26 np0005486759.ooo.test systemd[1]: Started libvirt nodedev daemon.
Oct 14 09:41:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:27.019 2 WARNING nova.virt.libvirt.driver [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:41:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:27.021 2 DEBUG nova.compute.resource_tracker [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=13242MB free_disk=386.8920555114746GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:41:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:27.021 2 DEBUG oslo_concurrency.lockutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:41:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:27.021 2 DEBUG oslo_concurrency.lockutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:41:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:27.503 2 DEBUG nova.compute.resource_tracker [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:41:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:27.504 2 DEBUG nova.compute.resource_tracker [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:41:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:27.504 2 DEBUG nova.compute.resource_tracker [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.148 2 DEBUG nova.scheduler.client.report [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Refreshing inventories for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.167 2 DEBUG nova.scheduler.client.report [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Updating ProviderTree inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 0, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.167 2 DEBUG nova.compute.provider_tree [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 0, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.182 2 DEBUG nova.scheduler.client.report [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Refreshing aggregate associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.200 2 DEBUG nova.scheduler.client.report [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Refreshing trait associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, traits: HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.242 2 DEBUG nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.242 2 INFO nova.virt.libvirt.host [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] kernel doesn't support AMD SEV
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.244 2 DEBUG nova.compute.provider_tree [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 399, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.244 2 DEBUG nova.virt.libvirt.driver [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.320 2 DEBUG nova.scheduler.client.report [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Updated inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 399, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.321 2 DEBUG nova.compute.provider_tree [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Updating resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.321 2 DEBUG nova.compute.provider_tree [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.419 2 DEBUG nova.compute.provider_tree [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Updating resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.458 2 DEBUG nova.compute.resource_tracker [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.458 2 DEBUG oslo_concurrency.lockutils [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.458 2 DEBUG nova.service [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.569 2 DEBUG nova.service [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Oct 14 09:41:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:28.570 2 DEBUG nova.servicegroup.drivers.db [None req-2a18881d-772c-4b5e-840f-7a87bb95c4d6 - - - - - -] DB_Driver: join new ServiceGroup member np0005486759.ooo.test to the compute group, service = <Service: host=np0005486759.ooo.test, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Oct 14 09:41:29 np0005486759.ooo.test sshd[255797]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:41:29 np0005486759.ooo.test sshd[255797]: Accepted publickey for zuul from 192.168.122.31 port 55682 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:41:29 np0005486759.ooo.test systemd-logind[759]: New session 39 of user zuul.
Oct 14 09:41:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:29.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:29 np0005486759.ooo.test systemd[1]: Started Session 39 of User zuul.
Oct 14 09:41:29 np0005486759.ooo.test sshd[255797]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:41:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:29.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:30 np0005486759.ooo.test python3.9[255908]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:41:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:41:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:41:30 np0005486759.ooo.test podman[255913]: 2025-10-14 09:41:30.440538434 +0000 UTC m=+0.064596031 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:41:30 np0005486759.ooo.test systemd[1]: tmp-crun.spiIc6.mount: Deactivated successfully.
Oct 14 09:41:30 np0005486759.ooo.test podman[255914]: 2025-10-14 09:41:30.502776197 +0000 UTC m=+0.124671322 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Oct 14 09:41:30 np0005486759.ooo.test podman[255913]: 2025-10-14 09:41:30.52740319 +0000 UTC m=+0.151460807 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:41:30 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:41:30 np0005486759.ooo.test podman[255914]: 2025-10-14 09:41:30.588260092 +0000 UTC m=+0.210155197 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Oct 14 09:41:30 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:41:32 np0005486759.ooo.test sudo[256064]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnetumllfyhgdlsrjszsepigofdvyydn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434891.005041-36-164190576838860/AnsiballZ_systemd_service.py
Oct 14 09:41:32 np0005486759.ooo.test sudo[256064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:32 np0005486759.ooo.test python3.9[256066]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:41:32 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:41:32 np0005486759.ooo.test systemd-rc-local-generator[256091]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:41:32 np0005486759.ooo.test systemd-sysv-generator[256097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:41:32 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:41:32 np0005486759.ooo.test sudo[256064]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:33 np0005486759.ooo.test python3.9[256210]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:41:33 np0005486759.ooo.test network[256227]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:41:33 np0005486759.ooo.test network[256228]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:41:33 np0005486759.ooo.test network[256229]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:41:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:41:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:41:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57712 DF PROTO=TCP SPT=54530 DPT=9100 SEQ=3795893293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F73B80000000001030307) 
Oct 14 09:41:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:34.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:34 np0005486759.ooo.test podman[256236]: 2025-10-14 09:41:34.368917309 +0000 UTC m=+0.082035497 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:41:34 np0005486759.ooo.test podman[256236]: 2025-10-14 09:41:34.374610754 +0000 UTC m=+0.087728962 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:41:34 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:41:34 np0005486759.ooo.test podman[256238]: 2025-10-14 09:41:34.353241965 +0000 UTC m=+0.067914328 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 09:41:34 np0005486759.ooo.test podman[256238]: 2025-10-14 09:41:34.435234709 +0000 UTC m=+0.149907032 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 14 09:41:34 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:41:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:34.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:34 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:41:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57713 DF PROTO=TCP SPT=54530 DPT=9100 SEQ=3795893293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F77C20000000001030307) 
Oct 14 09:41:36 np0005486759.ooo.test sudo[256500]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rplnrjdvmcqiiysfzzuugjjqbvhcvolc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434896.5386007-55-4913273509015/AnsiballZ_systemd_service.py
Oct 14 09:41:36 np0005486759.ooo.test sudo[256500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:37 np0005486759.ooo.test python3.9[256502]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:41:37 np0005486759.ooo.test sudo[256500]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57714 DF PROTO=TCP SPT=54530 DPT=9100 SEQ=3795893293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F7FC20000000001030307) 
Oct 14 09:41:37 np0005486759.ooo.test sudo[256611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kugtcjplzirwmjrdmnnjbidawdeaptvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434897.521473-65-101243845228313/AnsiballZ_file.py
Oct 14 09:41:37 np0005486759.ooo.test sudo[256611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:38 np0005486759.ooo.test python3.9[256613]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:38 np0005486759.ooo.test sudo[256611]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:38 np0005486759.ooo.test systemd-journald[35787]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Oct 14 09:41:38 np0005486759.ooo.test systemd-journald[35787]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 09:41:38 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:41:38 np0005486759.ooo.test sudo[256722]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkvibpzqooicyeibuxhnsgwmxedtpihq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434898.3581426-73-233884325892961/AnsiballZ_file.py
Oct 14 09:41:38 np0005486759.ooo.test sudo[256722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:38 np0005486759.ooo.test python3.9[256724]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:38 np0005486759.ooo.test sudo[256722]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:39.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:39 np0005486759.ooo.test sudo[256832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orbvwsxtldinwqkbyxglvzqctcdagwlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434899.102058-82-208542023325672/AnsiballZ_command.py
Oct 14 09:41:39 np0005486759.ooo.test sudo[256832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:39.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:39 np0005486759.ooo.test python3.9[256834]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                           systemctl disable --now certmonger.service
                                                           test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                         fi
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:41:39 np0005486759.ooo.test sudo[256832]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:40 np0005486759.ooo.test python3.9[256944]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:41:41 np0005486759.ooo.test sudo[257052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onozviqlputzkaebjgcoccijmgspjdsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434900.8351867-100-131473782653943/AnsiballZ_systemd_service.py
Oct 14 09:41:41 np0005486759.ooo.test sudo[257052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57715 DF PROTO=TCP SPT=54530 DPT=9100 SEQ=3795893293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F8F820000000001030307) 
Oct 14 09:41:41 np0005486759.ooo.test python3.9[257054]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:41:41 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:41:41 np0005486759.ooo.test systemd-rc-local-generator[257071]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:41:41 np0005486759.ooo.test systemd-sysv-generator[257077]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:41:41 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:41:41 np0005486759.ooo.test sudo[257052]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:42 np0005486759.ooo.test sudo[257198]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fopapmjqisixzjqtymcwhmuuvwedaekv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434901.9627492-108-93862670389027/AnsiballZ_command.py
Oct 14 09:41:42 np0005486759.ooo.test sudo[257198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53415 DF PROTO=TCP SPT=59426 DPT=9882 SEQ=1623269003 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F93080000000001030307) 
Oct 14 09:41:42 np0005486759.ooo.test python3.9[257200]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:41:42 np0005486759.ooo.test sudo[257198]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:42 np0005486759.ooo.test sudo[257309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuqusmcfouicxqpnkfcjhxolibpwdett ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434902.695161-117-157341013099336/AnsiballZ_file.py
Oct 14 09:41:42 np0005486759.ooo.test sudo[257309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:43 np0005486759.ooo.test python3.9[257311]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:41:43 np0005486759.ooo.test sudo[257309]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53416 DF PROTO=TCP SPT=59426 DPT=9882 SEQ=1623269003 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8F97010000000001030307) 
Oct 14 09:41:44 np0005486759.ooo.test python3.9[257419]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:44.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:44.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:44 np0005486759.ooo.test python3.9[257529]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:41:45 np0005486759.ooo.test python3.9[257615]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434904.22467-133-203394756719715/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=f52210d3ae12943a6ad5e528ec7b471a6006e70e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:41:46 np0005486759.ooo.test sudo[257723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpzfehcchfantdgpxrkvvrhtacbrytin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434905.60301-148-252889035638748/AnsiballZ_group.py
Oct 14 09:41:46 np0005486759.ooo.test sudo[257723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:46 np0005486759.ooo.test python3.9[257725]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct 14 09:41:46 np0005486759.ooo.test sudo[257723]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:46 np0005486759.ooo.test sudo[257833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqlcfiastyazoroisgicgshtpnbyujyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434906.5006964-159-152032678569755/AnsiballZ_getent.py
Oct 14 09:41:46 np0005486759.ooo.test sudo[257833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19790 DF PROTO=TCP SPT=59706 DPT=9105 SEQ=4008768849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8FA5C10000000001030307) 
Oct 14 09:41:47 np0005486759.ooo.test python3.9[257835]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct 14 09:41:47 np0005486759.ooo.test sudo[257833]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:47 np0005486759.ooo.test sudo[257944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcledgcfraigzavtnffmwosuaofsihel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434907.384846-167-80116243225866/AnsiballZ_group.py
Oct 14 09:41:47 np0005486759.ooo.test sudo[257944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:47 np0005486759.ooo.test python3.9[257946]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 14 09:41:47 np0005486759.ooo.test groupadd[257947]: group added to /etc/group: name=ceilometer, GID=42405
Oct 14 09:41:47 np0005486759.ooo.test groupadd[257947]: group added to /etc/gshadow: name=ceilometer
Oct 14 09:41:47 np0005486759.ooo.test groupadd[257947]: new group: name=ceilometer, GID=42405
Oct 14 09:41:47 np0005486759.ooo.test sudo[257944]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:48 np0005486759.ooo.test sudo[258060]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibftcdloclclikoodzmxandzknbuekxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434908.0833805-175-70439409766595/AnsiballZ_user.py
Oct 14 09:41:48 np0005486759.ooo.test sudo[258060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:41:48 np0005486759.ooo.test python3.9[258062]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005486759.ooo.test update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 14 09:41:48 np0005486759.ooo.test useradd[258064]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Oct 14 09:41:48 np0005486759.ooo.test useradd[258064]: add 'ceilometer' to group 'libvirt'
Oct 14 09:41:48 np0005486759.ooo.test useradd[258064]: add 'ceilometer' to shadow group 'libvirt'
Oct 14 09:41:48 np0005486759.ooo.test sudo[258060]: pam_unix(sudo:session): session closed for user root
Oct 14 09:41:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53418 DF PROTO=TCP SPT=59426 DPT=9882 SEQ=1623269003 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8FAEC10000000001030307) 
Oct 14 09:41:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:49.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:49.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:50 np0005486759.ooo.test python3.9[258178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:41:50 np0005486759.ooo.test python3.9[258264]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1760434909.6182477-201-272512964970302/.source.conf _original_basename=ceilometer.conf follow=False checksum=26405e17bcef8c2f66aea4d736047d660bc05833 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:51 np0005486759.ooo.test python3.9[258372]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:41:51 np0005486759.ooo.test python3.9[258458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1760434910.785849-201-269967762811423/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:52 np0005486759.ooo.test python3.9[258566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:41:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23252 DF PROTO=TCP SPT=43838 DPT=9102 SEQ=3563354439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8FBB410000000001030307) 
Oct 14 09:41:52 np0005486759.ooo.test python3.9[258652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1760434911.8947358-201-111928675565503/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:53 np0005486759.ooo.test python3.9[258760]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:41:54.142 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:41:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:41:54.142 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:41:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:41:54.144 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:41:54 np0005486759.ooo.test python3.9[258868]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:41:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:54.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:54.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:56 np0005486759.ooo.test python3.9[258976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:41:56 np0005486759.ooo.test python3.9[259062]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434915.62676-260-213215109346526/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23253 DF PROTO=TCP SPT=43838 DPT=9102 SEQ=3563354439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8FCB020000000001030307) 
Oct 14 09:41:57 np0005486759.ooo.test python3.9[259170]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:41:58 np0005486759.ooo.test python3.9[259225]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:58 np0005486759.ooo.test python3.9[259333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:41:59 np0005486759.ooo.test python3.9[259419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434918.3021312-260-3434340670904/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:41:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:59.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:41:59.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:41:59 np0005486759.ooo.test python3.9[259527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:00 np0005486759.ooo.test python3.9[259613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434919.4250562-260-276762654075602/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:00 np0005486759.ooo.test python3.9[259721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:42:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:42:01 np0005486759.ooo.test python3.9[259807]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434920.5011835-260-186130016728638/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:01 np0005486759.ooo.test systemd[1]: tmp-crun.jkeh30.mount: Deactivated successfully.
Oct 14 09:42:01 np0005486759.ooo.test podman[259809]: 2025-10-14 09:42:01.434113725 +0000 UTC m=+0.058982758 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:42:01 np0005486759.ooo.test systemd[1]: tmp-crun.dBEVw9.mount: Deactivated successfully.
Oct 14 09:42:01 np0005486759.ooo.test podman[259808]: 2025-10-14 09:42:01.443142892 +0000 UTC m=+0.067496529 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Oct 14 09:42:01 np0005486759.ooo.test podman[259809]: 2025-10-14 09:42:01.450218297 +0000 UTC m=+0.075087360 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:42:01 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:42:01 np0005486759.ooo.test podman[259808]: 2025-10-14 09:42:01.477425203 +0000 UTC m=+0.101778880 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:42:01 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:42:02 np0005486759.ooo.test python3.9[259958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:02 np0005486759.ooo.test python3.9[260044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434921.573308-260-77330014697315/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:03 np0005486759.ooo.test python3.9[260152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:03 np0005486759.ooo.test python3.9[260238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434922.768112-260-32648922296584/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27004 DF PROTO=TCP SPT=48294 DPT=9100 SEQ=894299638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8FE8E70000000001030307) 
Oct 14 09:42:04 np0005486759.ooo.test python3.9[260346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:04 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:04.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:04 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:04.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:04 np0005486759.ooo.test python3.9[260432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434923.881327-260-242478678342617/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27005 DF PROTO=TCP SPT=48294 DPT=9100 SEQ=894299638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8FED010000000001030307) 
Oct 14 09:42:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:42:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:42:05 np0005486759.ooo.test systemd[1]: tmp-crun.Sf0Tbi.mount: Deactivated successfully.
Oct 14 09:42:05 np0005486759.ooo.test podman[260542]: 2025-10-14 09:42:05.455596271 +0000 UTC m=+0.082328611 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:42:05 np0005486759.ooo.test podman[260542]: 2025-10-14 09:42:05.463316785 +0000 UTC m=+0.090049115 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:42:05 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:42:05 np0005486759.ooo.test podman[260541]: 2025-10-14 09:42:05.427996262 +0000 UTC m=+0.061142456 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 09:42:05 np0005486759.ooo.test python3.9[260540]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:05 np0005486759.ooo.test podman[260541]: 2025-10-14 09:42:05.507353597 +0000 UTC m=+0.140499811 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 09:42:05 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:42:05 np0005486759.ooo.test python3.9[260662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434925.0540328-260-196202780077198/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:06 np0005486759.ooo.test python3.9[260770]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:07 np0005486759.ooo.test python3.9[260856]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434926.1172628-260-197319282480218/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27006 DF PROTO=TCP SPT=48294 DPT=9100 SEQ=894299638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F8FF5010000000001030307) 
Oct 14 09:42:07 np0005486759.ooo.test python3.9[260964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:08 np0005486759.ooo.test python3.9[261050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760434927.253511-260-20981446428820/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:09 np0005486759.ooo.test sudo[261158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnbsreecrllgetvktsesukldcegodixb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434928.5829053-415-265019332900751/AnsiballZ_file.py
Oct 14 09:42:09 np0005486759.ooo.test sudo[261158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:09.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:09 np0005486759.ooo.test python3.9[261160]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:42:09 np0005486759.ooo.test sudo[261158]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:09.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:10 np0005486759.ooo.test sudo[261268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkyadkikibfmlgzrbadiynlrozuykegt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434929.7849808-423-30296812693592/AnsiballZ_systemd_service.py
Oct 14 09:42:10 np0005486759.ooo.test sudo[261268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:10 np0005486759.ooo.test python3.9[261270]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:42:10 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:42:10 np0005486759.ooo.test systemd-rc-local-generator[261299]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:42:10 np0005486759.ooo.test systemd-sysv-generator[261302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:42:10 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:42:10 np0005486759.ooo.test systemd[1]: Listening on Podman API Socket.
Oct 14 09:42:10 np0005486759.ooo.test sudo[261268]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27007 DF PROTO=TCP SPT=48294 DPT=9100 SEQ=894299638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9004C10000000001030307) 
Oct 14 09:42:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17953 DF PROTO=TCP SPT=54882 DPT=9882 SEQ=2161059377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9008380000000001030307) 
Oct 14 09:42:12 np0005486759.ooo.test sudo[261418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjwgowdmjyyvghwfwsnlpqvkfjegpomz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434931.0613053-432-23216067218619/AnsiballZ_stat.py
Oct 14 09:42:12 np0005486759.ooo.test sudo[261418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:12 np0005486759.ooo.test python3.9[261420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:12 np0005486759.ooo.test sudo[261418]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:12 np0005486759.ooo.test sudo[261506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-govauzmbnvvpvkodihlqiinvuytqwquo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434931.0613053-432-23216067218619/AnsiballZ_copy.py
Oct 14 09:42:12 np0005486759.ooo.test sudo[261506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:13 np0005486759.ooo.test python3.9[261508]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434931.0613053-432-23216067218619/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:42:13 np0005486759.ooo.test sudo[261506]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17954 DF PROTO=TCP SPT=54882 DPT=9882 SEQ=2161059377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F900C420000000001030307) 
Oct 14 09:42:13 np0005486759.ooo.test sudo[261561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrwlrtwkngkawwposriwdvceygmttbok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434931.0613053-432-23216067218619/AnsiballZ_stat.py
Oct 14 09:42:13 np0005486759.ooo.test sudo[261561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:13 np0005486759.ooo.test python3.9[261563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:13 np0005486759.ooo.test sudo[261561]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:14 np0005486759.ooo.test sudo[261649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibaujuxdkzlhzfwufimuwwqxnbctovmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434931.0613053-432-23216067218619/AnsiballZ_copy.py
Oct 14 09:42:14 np0005486759.ooo.test sudo[261649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:14 np0005486759.ooo.test python3.9[261651]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434931.0613053-432-23216067218619/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:42:14 np0005486759.ooo.test sudo[261649]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:14.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:14.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:15 np0005486759.ooo.test sudo[261759]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myscvvbpuxnebcomcblmkbikvhpoxgbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434934.56666-460-162462511236846/AnsiballZ_container_config_data.py
Oct 14 09:42:15 np0005486759.ooo.test sudo[261759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:15 np0005486759.ooo.test python3.9[261761]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct 14 09:42:15 np0005486759.ooo.test sudo[261759]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:15 np0005486759.ooo.test sudo[261869]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjhfmxebpvrzybozbyshljtchnhfskyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434935.4694614-469-161770449486073/AnsiballZ_container_config_hash.py
Oct 14 09:42:15 np0005486759.ooo.test sudo[261869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:16 np0005486759.ooo.test python3.9[261871]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:42:16 np0005486759.ooo.test sudo[261869]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:17 np0005486759.ooo.test sudo[261979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpslozhticufyyjssafbrnbzlguospui ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434936.508199-479-55554587715958/AnsiballZ_edpm_container_manage.py
Oct 14 09:42:17 np0005486759.ooo.test sudo[261979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45347 DF PROTO=TCP SPT=56588 DPT=9105 SEQ=1757212836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F901B010000000001030307) 
Oct 14 09:42:17 np0005486759.ooo.test python3[261981]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:42:17 np0005486759.ooo.test python3[261981]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "ff8aaa87a0dadf978d112c753603163797c5ab8a31d9fdfbc1412a1a3cc6baaa",
                                                                 "Digest": "sha256:fdfe6c13298281d9bde0044bcf6e037d1a31c741234642f0584858e76761296b",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:fdfe6c13298281d9bde0044bcf6e037d1a31c741234642f0584858e76761296b"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-14T06:21:17.025659624Z",
                                                                 "Config": {
                                                                      "User": "root",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 505004291,
                                                                 "VirtualSize": 505004291,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790/diff:/var/lib/containers/storage/overlay/1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec/diff:/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",
                                                                           "sha256:f640179b0564dc7abbe22bd39fc8810d5bbb8e54094fe7ebc5b3c45b658c4983",
                                                                           "sha256:a244c51d91c7fa48dd864b4fedb26f2afb3cd16eb13faecea61eec45f3182851",
                                                                           "sha256:4da4e1be651faf4cb682c510a475353c690bc8308e24a4b892f317b994e706e4"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "root",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969219151Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969253522Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969285133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969308103Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969342284Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969363945Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:55.340499198Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:32.389605838Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.587912811Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.976619634Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:36.392967414Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.005863592Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.29378883Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.651733508Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.077574384Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.492629447Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.841668394Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.241713606Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.624152332Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.968354993Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.280465471Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.616162553Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.039895541Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.340755181Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.002994823Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.284637314Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.582935524Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:47.185088535Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260120756Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260167227Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260179498Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260189038Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:50.485771038Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:11:48.328117095Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:12:30.499124675Z",
                                                                           "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:12:33.437399647Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:18.151552425Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:22.088342109Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:15:03.514885931Z",
                                                                           "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:15:07.241340827Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:20:38.192820918Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:21:17.022766922Z",
                                                                           "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:21:21.861087409Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct 14 09:42:17 np0005486759.ooo.test podman[262032]: 2025-10-14 09:42:17.616218557 +0000 UTC m=+0.055890810 container remove 48d974244ecae9936a865fec243f380328ed89f09e162fb6e6df70e93dd62f4c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9509e102f1abab83a0acc6d291975c60'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, build-date=2025-07-21T14:45:33, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-ceilometer-compute/images/17.1.9-1, architecture=x86_64, io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=032b792693069cded21b3a74ee4baa1db4887fb3, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, 
io.openshift.expose-services=, container_name=ceilometer_agent_compute, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container)
Oct 14 09:42:17 np0005486759.ooo.test python3[261981]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Oct 14 09:42:17 np0005486759.ooo.test podman[262046]: 
Oct 14 09:42:17 np0005486759.ooo.test podman[262046]: 2025-10-14 09:42:17.681886546 +0000 UTC m=+0.052104349 container create f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:42:17 np0005486759.ooo.test podman[262046]: 2025-10-14 09:42:17.65306551 +0000 UTC m=+0.023283343 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct 14 09:42:17 np0005486759.ooo.test python3[261981]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Oct 14 09:42:17 np0005486759.ooo.test sudo[261979]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:18 np0005486759.ooo.test sudo[262193]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svongyfjxiqnwswxucrozjbweygnidfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434937.9918416-487-130324534308582/AnsiballZ_stat.py
Oct 14 09:42:18 np0005486759.ooo.test sudo[262193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:18 np0005486759.ooo.test python3.9[262195]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:42:18 np0005486759.ooo.test sudo[262193]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:18 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:18.572 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:18 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:18.592 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Triggering sync for uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 14 09:42:18 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:18.593 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:42:18 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:18.593 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:42:18 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:18.593 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:18 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:18.657 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:42:18 np0005486759.ooo.test sudo[262305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgbzcuvfcniefksicijqqnhuhebwcfde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434938.7033691-496-131546127546219/AnsiballZ_file.py
Oct 14 09:42:18 np0005486759.ooo.test sudo[262305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:19 np0005486759.ooo.test python3.9[262307]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:19 np0005486759.ooo.test sudo[262305]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17956 DF PROTO=TCP SPT=54882 DPT=9882 SEQ=2161059377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9024010000000001030307) 
Oct 14 09:42:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:19.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:19 np0005486759.ooo.test sudo[262414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvhsrsobfrpfbyprqedkkufevrcarkmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434939.237423-496-10478645884312/AnsiballZ_copy.py
Oct 14 09:42:19 np0005486759.ooo.test sudo[262414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:19.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:19 np0005486759.ooo.test python3.9[262416]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434939.237423-496-10478645884312/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:19 np0005486759.ooo.test sudo[262414]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:20 np0005486759.ooo.test sudo[262469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgebqqjxtsxwpawqcweysiawlyvjvygk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434939.237423-496-10478645884312/AnsiballZ_systemd.py
Oct 14 09:42:20 np0005486759.ooo.test sudo[262469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:20 np0005486759.ooo.test python3.9[262471]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:42:20 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:42:20 np0005486759.ooo.test systemd-rc-local-generator[262499]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:42:20 np0005486759.ooo.test systemd-sysv-generator[262503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:42:20 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:42:20 np0005486759.ooo.test sudo[262469]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:21 np0005486759.ooo.test sudo[262561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmqljbgmkqtjwsinexsawqxfkegptxwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434939.237423-496-10478645884312/AnsiballZ_systemd.py
Oct 14 09:42:21 np0005486759.ooo.test sudo[262561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:21 np0005486759.ooo.test python3.9[262563]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:42:21 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:42:21 np0005486759.ooo.test systemd-sysv-generator[262595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:42:21 np0005486759.ooo.test systemd-rc-local-generator[262590]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:42:21 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:42:21 np0005486759.ooo.test systemd[1]: Starting ceilometer_agent_compute container...
Oct 14 09:42:21 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:42:21 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3400c472c3e408054fc28baa03ffd14253fec1f9377b50c68a9924d0a4c4d303/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct 14 09:42:21 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3400c472c3e408054fc28baa03ffd14253fec1f9377b50c68a9924d0a4c4d303/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct 14 09:42:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:42:21 np0005486759.ooo.test podman[262604]: 2025-10-14 09:42:21.938497452 +0000 UTC m=+0.097881425 container init f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:42:21 np0005486759.ooo.test ceilometer_agent_compute[262619]: + sudo -E kolla_set_configs
Oct 14 09:42:21 np0005486759.ooo.test ceilometer_agent_compute[262619]: sudo: unable to send audit message: Operation not permitted
Oct 14 09:42:21 np0005486759.ooo.test sudo[262625]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 09:42:21 np0005486759.ooo.test sudo[262625]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:42:21 np0005486759.ooo.test sudo[262625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:42:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:42:21 np0005486759.ooo.test podman[262604]: 2025-10-14 09:42:21.961467844 +0000 UTC m=+0.120851797 container start f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 09:42:21 np0005486759.ooo.test podman[262604]: ceilometer_agent_compute
Oct 14 09:42:21 np0005486759.ooo.test systemd[1]: Started ceilometer_agent_compute container.
Oct 14 09:42:21 np0005486759.ooo.test sudo[262561]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Validating config file
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Copying service configuration files
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: INFO:__main__:Writing out command to execute
Oct 14 09:42:22 np0005486759.ooo.test sudo[262625]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: ++ cat /run_command
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + ARGS=
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + sudo kolla_copy_cacerts
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: sudo: unable to send audit message: Operation not permitted
Oct 14 09:42:22 np0005486759.ooo.test sudo[262640]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 14 09:42:22 np0005486759.ooo.test sudo[262640]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:42:22 np0005486759.ooo.test sudo[262640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:42:22 np0005486759.ooo.test sudo[262640]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + [[ ! -n '' ]]
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + . kolla_extend_start
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + umask 0022
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct 14 09:42:22 np0005486759.ooo.test podman[262628]: 2025-10-14 09:42:22.032611177 +0000 UTC m=+0.065516335 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 09:42:22 np0005486759.ooo.test podman[262628]: 2025-10-14 09:42:22.062746256 +0000 UTC m=+0.095651414 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009)
Oct 14 09:42:22 np0005486759.ooo.test podman[262628]: unhealthy
Oct 14 09:42:22 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:42:22 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Failed with result 'exit-code'.
Oct 14 09:42:22 np0005486759.ooo.test sudo[262755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clnawmdkzowzbvgwefcnenoiwwbitxhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434942.133755-520-19228900288195/AnsiballZ_systemd.py
Oct 14 09:42:22 np0005486759.ooo.test sudo[262755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53129 DF PROTO=TCP SPT=50064 DPT=9102 SEQ=2007695371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9030810000000001030307) 
Oct 14 09:42:22 np0005486759.ooo.test python3.9[262757]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:42:22 np0005486759.ooo.test systemd[1]: Stopping ceilometer_agent_compute container...
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.745 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.746 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.747 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.748 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.749 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.750 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.751 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.752 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.753 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.754 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.755 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.756 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.757 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.758 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.759 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.759 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.774 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.775 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.775 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.784 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.841 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.885 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.885 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.903 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.903 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.903 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.904 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.905 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.906 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.907 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.908 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.909 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.910 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.911 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.912 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.913 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.914 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.915 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.916 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.917 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.918 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.919 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.920 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.921 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.922 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.922 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.922 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.922 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Oct 14 09:42:22 np0005486759.ooo.test ceilometer_agent_compute[262619]: 2025-10-14 09:42:22.926 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Oct 14 09:42:22 np0005486759.ooo.test virtqemud[225922]: End of file while reading data: Input/output error
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: libpod-f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.scope: Deactivated successfully.
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: libpod-f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.scope: Consumed 1.103s CPU time.
Oct 14 09:42:23 np0005486759.ooo.test podman[262761]: 2025-10-14 09:42:23.060462859 +0000 UTC m=+0.315168009 container died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.timer: Deactivated successfully.
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89-userdata-shm.mount: Deactivated successfully.
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3400c472c3e408054fc28baa03ffd14253fec1f9377b50c68a9924d0a4c4d303-merged.mount: Deactivated successfully.
Oct 14 09:42:23 np0005486759.ooo.test podman[262761]: 2025-10-14 09:42:23.099014005 +0000 UTC m=+0.353719145 container cleanup f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009)
Oct 14 09:42:23 np0005486759.ooo.test podman[262761]: ceilometer_agent_compute
Oct 14 09:42:23 np0005486759.ooo.test podman[262791]: 2025-10-14 09:42:23.190401792 +0000 UTC m=+0.056494978 container cleanup f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Oct 14 09:42:23 np0005486759.ooo.test podman[262791]: ceilometer_agent_compute
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: Stopped ceilometer_agent_compute container.
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: Starting ceilometer_agent_compute container...
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:42:23 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3400c472c3e408054fc28baa03ffd14253fec1f9377b50c68a9924d0a4c4d303/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct 14 09:42:23 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3400c472c3e408054fc28baa03ffd14253fec1f9377b50c68a9924d0a4c4d303/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:42:23 np0005486759.ooo.test podman[262804]: 2025-10-14 09:42:23.316425162 +0000 UTC m=+0.092689670 container init f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + sudo -E kolla_set_configs
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: sudo: unable to send audit message: Operation not permitted
Oct 14 09:42:23 np0005486759.ooo.test sudo[262824]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 09:42:23 np0005486759.ooo.test sudo[262824]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:42:23 np0005486759.ooo.test sudo[262824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:42:23 np0005486759.ooo.test podman[262804]: 2025-10-14 09:42:23.349000769 +0000 UTC m=+0.125265217 container start f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 09:42:23 np0005486759.ooo.test podman[262804]: ceilometer_agent_compute
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: Started ceilometer_agent_compute container.
Oct 14 09:42:23 np0005486759.ooo.test sudo[262755]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Validating config file
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Copying service configuration files
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: INFO:__main__:Writing out command to execute
Oct 14 09:42:23 np0005486759.ooo.test sudo[262824]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: ++ cat /run_command
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + ARGS=
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + sudo kolla_copy_cacerts
Oct 14 09:42:23 np0005486759.ooo.test podman[262827]: 2025-10-14 09:42:23.416741844 +0000 UTC m=+0.074049807 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: sudo: unable to send audit message: Operation not permitted
Oct 14 09:42:23 np0005486759.ooo.test sudo[262851]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 14 09:42:23 np0005486759.ooo.test sudo[262851]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 14 09:42:23 np0005486759.ooo.test sudo[262851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 14 09:42:23 np0005486759.ooo.test sudo[262851]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + [[ ! -n '' ]]
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + . kolla_extend_start
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + umask 0022
Oct 14 09:42:23 np0005486759.ooo.test ceilometer_agent_compute[262818]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct 14 09:42:23 np0005486759.ooo.test podman[262827]: 2025-10-14 09:42:23.452337737 +0000 UTC m=+0.109645700 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:42:23 np0005486759.ooo.test podman[262827]: unhealthy
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:42:23 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Failed with result 'exit-code'.
Oct 14 09:42:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:23.583 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:23.584 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:23.584 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:42:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:23.584 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:42:23 np0005486759.ooo.test sudo[262957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlcycuauwbyzdjhlgwkzpwwrwoublutj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434943.5672789-528-126225910703230/AnsiballZ_stat.py
Oct 14 09:42:23 np0005486759.ooo.test sudo[262957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:23 np0005486759.ooo.test python3.9[262959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:24 np0005486759.ooo.test sudo[262957]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.233 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.234 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.234 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.234 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.234 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.234 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.235 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.235 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.235 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.235 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.235 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.235 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.236 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.236 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.236 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.236 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.236 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.237 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.237 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.237 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.237 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.237 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.237 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.238 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.238 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.238 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.238 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.238 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.239 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.239 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.239 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.239 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.240 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.240 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.240 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.240 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.240 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.240 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.241 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.241 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.241 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.241 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.241 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.241 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.242 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.242 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.242 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.242 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.242 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.243 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.244 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.248 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.249 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.250 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.251 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.252 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.253 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.254 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.255 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.256 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.256 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.256 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.256 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.256 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.256 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.256 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.257 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.257 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.257 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.272 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.273 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.274 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.286 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct 14 09:42:24 np0005486759.ooo.test sudo[263048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbfnswqihxlkfhwxynoqjmeudsmvqocx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434943.5672789-528-126225910703230/AnsiballZ_copy.py
Oct 14 09:42:24 np0005486759.ooo.test sudo[263048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.415 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.416 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.417 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.418 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.419 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.420 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.421 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.422 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.423 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.424 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.425 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.426 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.427 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.428 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.429 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.430 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.431 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.432 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.433 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.433 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.433 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.433 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.433 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.433 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.433 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.436 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.441 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct 14 09:42:24 np0005486759.ooo.test python3.9[263050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434943.5672789-528-126225910703230/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:42:24 np0005486759.ooo.test sudo[263048]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:24.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:24.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:24.834 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}50eeb5a4c00a410f1bbbbed2e47d5de806bc8c7db2948185f23e35b99defdaf0" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.080 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Tue, 14 Oct 2025 09:42:24 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-e93cd57a-4555-4adc-8fb8-624548fda6d9 x-openstack-request-id: req-e93cd57a-4555-4adc-8fb8-624548fda6d9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.080 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "0bdb7446-7a7f-4e51-8a88-180de2e09857", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/0bdb7446-7a7f-4e51-8a88-180de2e09857"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/0bdb7446-7a7f-4e51-8a88-180de2e09857"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.080 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-e93cd57a-4555-4adc-8fb8-624548fda6d9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.081 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/0bdb7446-7a7f-4e51-8a88-180de2e09857 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}50eeb5a4c00a410f1bbbbed2e47d5de806bc8c7db2948185f23e35b99defdaf0" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.129 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Tue, 14 Oct 2025 09:42:25 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-0e11f9c4-52b1-4393-9f71-ca9ac876c24d x-openstack-request-id: req-0e11f9c4-52b1-4393-9f71-ca9ac876c24d _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.129 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "0bdb7446-7a7f-4e51-8a88-180de2e09857", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/0bdb7446-7a7f-4e51-8a88-180de2e09857"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/0bdb7446-7a7f-4e51-8a88-180de2e09857"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.129 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/0bdb7446-7a7f-4e51-8a88-180de2e09857 used request id req-0e11f9c4-52b1-4393-9f71-ca9ac876c24d request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.130 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.130 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.130 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.159 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.160 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51b65a58-bdc5-47fe-b501-fbdb5830a684', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.131392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1711b9a0-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '941ba26ecf535c7a1e24b594a5a209cc205849896c3d90cb5af74b21c4aca05d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.131392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1711c706-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '6b99dc272e645fcba0e4e8becfea3f0e669d84f0c8842155a9d5d9eab7ba5c48'}]}, 'timestamp': '2025-10-14 09:42:25.160290', '_unique_id': '892913ec9c034bb08cec7e74ccf31489'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.164 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.169 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4408214d-dae5-4452-92e9-eb4abd6589d4 / tapeee08de8-f9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.169 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e844794-2cdd-49bb-892b-9fc9e2471173', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.166625', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '171331ea-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': 'b8e7f1bb77543b8e39ef0ea219827eab42d39ab16d2438ce07d036551dc10ad5'}]}, 'timestamp': '2025-10-14 09:42:25.169596', '_unique_id': '68dc1dba62ed4e0cbce47e0deab18864'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.170 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 1288814026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 10812347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ec4fff5-05d2-4372-9abe-bb4aca383877', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1288814026, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.170793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '17136b42-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': 'e717ff1404c0befffaa05ac67d617e431371a0f97850c04fc50992ed6d3a00dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10812347, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.170793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '17137376-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '03bedd8270750d50e70e6dbb0a729b07e80ee91bd3280990a87c1b796f5c80f8'}]}, 'timestamp': '2025-10-14 09:42:25.171232', '_unique_id': '915b091b26d54cd6b69c200aac483976'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.171 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.172 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 513177663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.172 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 75228955 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c57f3fed-5377-4ed6-bc8e-fed20de3b2aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513177663, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.172249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1713a31e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': 'fc432a2ecb53cef90bba0656b1f5ead4c2f89620d1d01bb4348126fdd5c8182b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75228955, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.172249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1713aabc-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '3676b55cb68dbbfa899139a05ea168d8eea045d325437739ef90f8430ee8cf18'}]}, 'timestamp': '2025-10-14 09:42:25.172646', '_unique_id': '8f1cd3465d734b7eb9496d12bf316d8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test sudo[263161]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shfdrcncwdlykwuhtwdwcguwhwedgnma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434944.914888-545-26656611517654/AnsiballZ_container_config_data.py
Oct 14 09:42:25 np0005486759.ooo.test sudo[263161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.188 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.188 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '518abc94-1dc4-454d-a9b5-c6c130b26daf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.173674', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '171610b8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.368786357, 'message_signature': '481ec3e42c40450b3ce894b140d3ed07bcb47cf6ac2d769376635006cdc69fd9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.173674', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '17161bee-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.368786357, 'message_signature': 'ef3c89b400cde0dec3eb4ba9d0a022cd19ea84d9c7252d4e0557337232b9e69e'}]}, 'timestamp': '2025-10-14 09:42:25.188685', '_unique_id': 'ceb6dcdb2bb54b358bda243178382710'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.189 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.190 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.190 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.190 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a69923d-6c6b-490d-8691-3a55b2c98601', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.190499', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '17166c0c-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': '2bc7c65181fc887b89d6969b89a951e110e2e77cc156b17b3d93636414c5af31'}]}, 'timestamp': '2025-10-14 09:42:25.190745', '_unique_id': 'a19ed054c42f44f693f5d656f64c1627'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.191 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3e54d97-a8eb-498a-9e81-718c9ca35e42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.191798', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '17169eca-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': 'be2fce48eea72282836dcc0fad9becae45e002d76d93f23c3c1d01e6f6de071a'}]}, 'timestamp': '2025-10-14 09:42:25.192032', '_unique_id': 'aaf5a67713d04b47b9eb723688d36047'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.192 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9563d52e-9c7a-473a-a663-7bcdee06f748', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.193178', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '1716d4a8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': '060994334299e76ae18b2d255d5e33192bcd69ed0ed5aa2306d9a9474db218e0'}]}, 'timestamp': '2025-10-14 09:42:25.193394', '_unique_id': '37917011355b4a3883fa7491742d195b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.193 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.194 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31129600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.194 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '836efcba-bf26-4ccb-b863-9c515dba6ad0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31129600, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.194393', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1717040a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.368786357, 'message_signature': '10ea8144bff2ec3364e2b0ac89b8512b624a1625e99341ab969ce21fc3e54210'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 
'timestamp': '2025-10-14T09:42:25.194393', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '17170b76-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.368786357, 'message_signature': '8c7345ea713e5911bda55f37d4e8f8d067277026ee48484569e263c80bfd20ce'}]}, 'timestamp': '2025-10-14 09:42:25.194781', '_unique_id': '6d04a7ed9af449c4a1ad5ce02b05c8aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.195 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 9773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5df678a7-d5d0-4687-a95c-521821aa3123', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9773, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.195976', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '1717423a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': 'edf048a5179df40b77f527e0c4b7a52cb7f7e1dbeb43fe20397a8008cfef42fe'}]}, 'timestamp': '2025-10-14 09:42:25.196197', '_unique_id': '284aadb49c534ad59337402f4b5414d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.196 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4eedc502-68c6-4d45-b55b-2f010429c121', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.197247', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '1717739a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': 'f9930cd7cfeb06ed0f3c4fd1b47c460b0e72bedfe3f3a16ac42a031b9117b374'}]}, 'timestamp': '2025-10-14 09:42:25.197466', '_unique_id': '863b7e3e93e74c2e9b104efc0d5f093b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.197 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.198 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.198 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.198 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72527beb-6357-491c-a52b-f300712bc1aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.198813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1717b1d4-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.368786357, 'message_signature': '36776c162e7134e51f7abac86485029646590b934fc81ce420e1152dff935a0c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.198813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1717bbb6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.368786357, 'message_signature': 'fe656d5ae87458844c12284a1e98068366938b5276141f993dc7f7e7d79ff905'}]}, 'timestamp': '2025-10-14 09:42:25.199295', '_unique_id': '6541271e7d6b48588b2ac867b056d9ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.199 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.200 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 47220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f83b295a-6abb-4fae-bb25-4bc64659c29b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47220000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:42:25.200388', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '171b41a0-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.416930649, 'message_signature': '602a942d7c985dd1096ce519ad8e168ca657ff49bbdcd4ac15c9be7627f55428'}]}, 'timestamp': '2025-10-14 09:42:25.222402', '_unique_id': 'e02635d0d16949d3abc4bcef8740b989'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.222 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.223 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83fa3621-baa3-4f8d-9261-40358bc56c20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.223528', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '171b7652-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': '36a05b7c34250c366c858ab51e83970368c463ed6562c5cfdc5799434280b42b'}]}, 'timestamp': '2025-10-14 09:42:25.223746', '_unique_id': '312e7746eaa646af9466fdcf5c499fdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.224 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39a52ed2-efd2-4b2c-9a93-10e96df2431e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 591, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.224743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '171ba5a0-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '6c2d878768faa80710055d4f7798e123e14a581df7faef78399df52dfa79333f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.224743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '171baeba-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '13afaf6baef695424f76466f85be127eed6d4b25d93604d5624be50ed69fad1f'}]}, 'timestamp': '2025-10-14 09:42:25.225176', '_unique_id': 'c2f674fdc196469498e5dc3c538ff1bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.225 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.226 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.226 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7db24589-900b-4993-a863-c199cccc1a86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.226193', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '171bdeee-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '21b119e37fe55c985cfacc11ee6a103f9fb0adab8dc1feac47a23f6140cb3c37'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.226193', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '171be678-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '64c58bdee0fc55572c22a5a27d0f28a75b52cf85cb9d63858bf4599f977e92fc'}]}, 'timestamp': '2025-10-14 09:42:25.226601', '_unique_id': 'a59003b1d6c9454dbaac0a8847386579'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.227 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7238af79-531c-425d-8160-1e5ecb308d77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:42:25.227621', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '171c15ee-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': 'beacfe261f07e883261bf611b56963c300714e8f23fb80413f255bfd7a7688da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:42:25.227621', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '171c1d14-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.326499872, 'message_signature': '7eec0ca615b7afb4b3adc0434e448cd3f2fef535d71ec2472875bda8a288a099'}]}, 'timestamp': '2025-10-14 09:42:25.228015', '_unique_id': '3bbc50f6f9f144199fdc3568de41b264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.228 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee1a1cfe-71d4-4848-b009-42f1619bcfa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.229078', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '171c5090-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': 'd616bca01b33ad7a7507d89c43b268553761b52f1c7167ebd3d0947de728d25f'}]}, 'timestamp': '2025-10-14 09:42:25.229331', '_unique_id': '52c96ddebbb14ea1893fb886bcb807b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.229 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9aed1d50-15bc-4c92-abe0-1195e9ae45f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.230321', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '171c7f98-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': '28195fa36c37e4f2abf88d428d6743c83f02f93de8c4824007e42f6cbe2bb644'}]}, 'timestamp': '2025-10-14 09:42:25.230537', '_unique_id': '8deab456111744388c6c1368756bb6e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.230 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.231 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.231 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.231 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e17928f0-5c85-4fab-853e-4da271614db9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:42:25.231792', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '171cb8c8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.361789895, 'message_signature': '4f4f1defdf2a81dbebc27172c1689ad889d939f0c2a85932c6a7cbc71692774f'}]}, 'timestamp': '2025-10-14 09:42:25.232035', '_unique_id': '07c0966e31a6456a8a2157f4db9edee5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.232 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 52.17578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b5b9c45-8f2b-4a94-abea-95792d3bef74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.17578125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:42:25.233137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '171ced7a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 10965.416930649, 'message_signature': '6e522a71199c149803874854fe056eed936d19124cce13e49568ff26d009d5e8'}]}, 'timestamp': '2025-10-14 09:42:25.233357', '_unique_id': '4b88bd1e81984727a744c9ce53a65a61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:42:25 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:42:25.233 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:42:25 np0005486759.ooo.test python3.9[263163]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct 14 09:42:25 np0005486759.ooo.test sudo[263161]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:25.728 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:42:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:25.729 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:42:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:25.729 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:42:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:25.729 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:42:25 np0005486759.ooo.test sudo[263271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgyobynjmxnludztqbhdnrivzlwgpdfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434945.663174-554-174397923804272/AnsiballZ_container_config_hash.py
Oct 14 09:42:25 np0005486759.ooo.test sudo[263271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:26 np0005486759.ooo.test python3.9[263273]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.119 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:42:26 np0005486759.ooo.test sudo[263271]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.145 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.145 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.145 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.146 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.146 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.146 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.146 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.147 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.147 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.147 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.169 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.169 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.169 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.170 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.285 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.362 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.363 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.420 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.422 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.476 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.478 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.556 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:42:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53130 DF PROTO=TCP SPT=50064 DPT=9102 SEQ=2007695371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9040410000000001030307) 
Oct 14 09:42:26 np0005486759.ooo.test sudo[263393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cymjhwbqbxebggiamwfarwpsrzdzyyqm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434946.4521978-564-183674914695284/AnsiballZ_edpm_container_manage.py
Oct 14 09:42:26 np0005486759.ooo.test sudo[263393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.722 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.724 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=13118MB free_disk=386.8875160217285GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.724 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.725 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.808 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.809 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.809 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.848 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.863 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.865 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:42:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:26.865 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:42:26 np0005486759.ooo.test python3[263395]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:42:27 np0005486759.ooo.test podman[263435]: 
Oct 14 09:42:27 np0005486759.ooo.test podman[263435]: 2025-10-14 09:42:27.241851252 +0000 UTC m=+0.077194547 container create 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:42:27 np0005486759.ooo.test podman[263435]: 2025-10-14 09:42:27.199778824 +0000 UTC m=+0.035122169 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Oct 14 09:42:27 np0005486759.ooo.test python3[263395]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy 
--no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct 14 09:42:27 np0005486759.ooo.test sudo[263393]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:27 np0005486759.ooo.test sudo[263580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zplkdbjaqxvbguxmdlhtkqgmulsjhqwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434947.5765305-572-50276359831842/AnsiballZ_stat.py
Oct 14 09:42:27 np0005486759.ooo.test sudo[263580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:28 np0005486759.ooo.test python3.9[263582]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:42:28 np0005486759.ooo.test sudo[263580]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:28 np0005486759.ooo.test sudo[263692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwntydmvnsddmgjqiutsxcxpeaqhksvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434948.310754-581-207287921123801/AnsiballZ_file.py
Oct 14 09:42:28 np0005486759.ooo.test sudo[263692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:28 np0005486759.ooo.test python3.9[263694]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:28 np0005486759.ooo.test sudo[263692]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:29.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:29.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:29 np0005486759.ooo.test sudo[263801]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrqvdxcnrhwsezgdailbuttuqrotvolg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434948.867643-581-121766443090905/AnsiballZ_copy.py
Oct 14 09:42:29 np0005486759.ooo.test sudo[263801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:30 np0005486759.ooo.test python3.9[263803]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434948.867643-581-121766443090905/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:30 np0005486759.ooo.test sudo[263801]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:30 np0005486759.ooo.test sudo[263856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biurjwnfwkmzulmtirzewttpiobmkijc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434948.867643-581-121766443090905/AnsiballZ_systemd.py
Oct 14 09:42:30 np0005486759.ooo.test sudo[263856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:30 np0005486759.ooo.test python3.9[263858]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:42:30 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:42:30 np0005486759.ooo.test systemd-rc-local-generator[263882]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:42:30 np0005486759.ooo.test systemd-sysv-generator[263886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:42:30 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:42:30 np0005486759.ooo.test sudo[263856]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:31 np0005486759.ooo.test sudo[263947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwarqldtrszbgzyoqqyhjdznjghbnllr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434948.867643-581-121766443090905/AnsiballZ_systemd.py
Oct 14 09:42:31 np0005486759.ooo.test sudo[263947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:31 np0005486759.ooo.test python3.9[263949]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:42:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:42:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:42:31 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:42:31 np0005486759.ooo.test podman[263952]: 2025-10-14 09:42:31.624772157 +0000 UTC m=+0.063658507 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:42:31 np0005486759.ooo.test podman[263952]: 2025-10-14 09:42:31.637330417 +0000 UTC m=+0.076216787 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:42:31 np0005486759.ooo.test systemd-sysv-generator[264007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:42:31 np0005486759.ooo.test systemd-rc-local-generator[264003]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:42:31 np0005486759.ooo.test podman[263951]: 2025-10-14 09:42:31.687756761 +0000 UTC m=+0.124850623 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:42:31 np0005486759.ooo.test podman[263951]: 2025-10-14 09:42:31.723351633 +0000 UTC m=+0.160445155 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct 14 09:42:31 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:42:31 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:42:31 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:42:31 np0005486759.ooo.test systemd[1]: Starting node_exporter container...
Oct 14 09:42:32 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:42:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:42:32 np0005486759.ooo.test podman[264031]: 2025-10-14 09:42:32.054509869 +0000 UTC m=+0.141618316 container init 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.066Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.066Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.066Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.067Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.067Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.067Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.067Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.067Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.067Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=arp
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=bcache
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=bonding
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=btrfs
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=conntrack
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=cpu
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=diskstats
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=edac
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=filefd
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=filesystem
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=infiniband
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=ipvs
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=loadavg
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=mdadm
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=meminfo
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=netclass
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=netdev
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=netstat
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=nfs
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=nfsd
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=nvme
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=schedstat
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=sockstat
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=softnet
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=systemd
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=tapestats
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=vmstat
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=xfs
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.068Z caller=node_exporter.go:117 level=info collector=zfs
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.069Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct 14 09:42:32 np0005486759.ooo.test node_exporter[264045]: ts=2025-10-14T09:42:32.069Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Oct 14 09:42:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:42:32 np0005486759.ooo.test podman[264031]: 2025-10-14 09:42:32.08409638 +0000 UTC m=+0.171204798 container start 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:42:32 np0005486759.ooo.test podman[264031]: node_exporter
Oct 14 09:42:32 np0005486759.ooo.test systemd[1]: Started node_exporter container.
Oct 14 09:42:32 np0005486759.ooo.test sudo[263947]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:32 np0005486759.ooo.test podman[264054]: 2025-10-14 09:42:32.135088353 +0000 UTC m=+0.047714479 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:42:32 np0005486759.ooo.test rsyslogd[758]: imjournal from <localhost:node_exporter>: begin to drop messages due to rate-limiting
Oct 14 09:42:32 np0005486759.ooo.test podman[264054]: 2025-10-14 09:42:32.145196985 +0000 UTC m=+0.057823121 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:42:32 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:42:32 np0005486759.ooo.test sudo[264184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdwxpxuauigvumgkxwgrcqykrqczcgjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434952.446166-605-107805137649057/AnsiballZ_systemd.py
Oct 14 09:42:32 np0005486759.ooo.test sudo[264184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:33 np0005486759.ooo.test python3.9[264186]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Stopping node_exporter container...
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: libpod-347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.scope: Deactivated successfully.
Oct 14 09:42:33 np0005486759.ooo.test podman[264190]: 2025-10-14 09:42:33.130823503 +0000 UTC m=+0.050605931 container stop 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:42:33 np0005486759.ooo.test podman[264190]: 2025-10-14 09:42:33.159251967 +0000 UTC m=+0.079034435 container died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.timer: Deactivated successfully.
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3-userdata-shm.mount: Deactivated successfully.
Oct 14 09:42:33 np0005486759.ooo.test podman[264190]: 2025-10-14 09:42:33.250857902 +0000 UTC m=+0.170640300 container cleanup 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:42:33 np0005486759.ooo.test podman[264190]: node_exporter
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 14 09:42:33 np0005486759.ooo.test podman[264216]: 2025-10-14 09:42:33.338577132 +0000 UTC m=+0.060320000 container cleanup 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:42:33 np0005486759.ooo.test podman[264216]: node_exporter
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Stopped node_exporter container.
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Starting node_exporter container...
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:42:33 np0005486759.ooo.test podman[264228]: 2025-10-14 09:42:33.506635999 +0000 UTC m=+0.125202764 container init 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.521Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.521Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.521Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.521Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.522Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.522Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.522Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.522Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.522Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=arp
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=bcache
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=bonding
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=btrfs
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=conntrack
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=cpu
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=diskstats
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=edac
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=filefd
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=filesystem
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=infiniband
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=ipvs
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=loadavg
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=mdadm
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=meminfo
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=netclass
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=netdev
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=netstat
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=nfs
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=nfsd
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=nvme
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=schedstat
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=sockstat
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=softnet
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=systemd
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=tapestats
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=vmstat
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=xfs
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.523Z caller=node_exporter.go:117 level=info collector=zfs
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.524Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct 14 09:42:33 np0005486759.ooo.test node_exporter[264243]: ts=2025-10-14T09:42:33.524Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:42:33 np0005486759.ooo.test podman[264228]: 2025-10-14 09:42:33.545926089 +0000 UTC m=+0.164492844 container start 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:42:33 np0005486759.ooo.test podman[264228]: node_exporter
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: Started node_exporter container.
Oct 14 09:42:33 np0005486759.ooo.test sudo[264184]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:33 np0005486759.ooo.test podman[264252]: 2025-10-14 09:42:33.593987068 +0000 UTC m=+0.048094531 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:42:33 np0005486759.ooo.test podman[264252]: 2025-10-14 09:42:33.624048065 +0000 UTC m=+0.078155528 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:42:33 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:42:34 np0005486759.ooo.test sudo[264382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvyvowwvlytavrozrhwpqkwwdlbimxnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434953.7311583-613-219443718365259/AnsiballZ_stat.py
Oct 14 09:42:34 np0005486759.ooo.test sudo[264382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:34 np0005486759.ooo.test python3.9[264384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:34 np0005486759.ooo.test sudo[264382]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59215 DF PROTO=TCP SPT=42664 DPT=9100 SEQ=3958993199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F905E180000000001030307) 
Oct 14 09:42:34 np0005486759.ooo.test sudo[264470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikbgkbaldqyhiqdvucgymupofrmsnklb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434953.7311583-613-219443718365259/AnsiballZ_copy.py
Oct 14 09:42:34 np0005486759.ooo.test sudo[264470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:34.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:34.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:34 np0005486759.ooo.test python3.9[264472]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434953.7311583-613-219443718365259/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:42:34 np0005486759.ooo.test sudo[264470]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59216 DF PROTO=TCP SPT=42664 DPT=9100 SEQ=3958993199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9062020000000001030307) 
Oct 14 09:42:35 np0005486759.ooo.test sudo[264580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onnadljfxrfgixmsxszgnaddezncanqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434955.026868-630-211097735687543/AnsiballZ_container_config_data.py
Oct 14 09:42:35 np0005486759.ooo.test sudo[264580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:35 np0005486759.ooo.test python3.9[264582]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct 14 09:42:35 np0005486759.ooo.test sudo[264580]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:36 np0005486759.ooo.test sudo[264690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzwoafyivwezzkqxojsddanmqfdjmzzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434955.749417-639-102545230759207/AnsiballZ_container_config_hash.py
Oct 14 09:42:36 np0005486759.ooo.test sudo[264690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:42:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:42:36 np0005486759.ooo.test podman[264694]: 2025-10-14 09:42:36.146518408 +0000 UTC m=+0.079969615 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 09:42:36 np0005486759.ooo.test podman[264693]: 2025-10-14 09:42:36.193299777 +0000 UTC m=+0.128471899 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=iscsid, managed_by=edpm_ansible)
Oct 14 09:42:36 np0005486759.ooo.test podman[264693]: 2025-10-14 09:42:36.199719111 +0000 UTC m=+0.134891223 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 09:42:36 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:42:36 np0005486759.ooo.test podman[264694]: 2025-10-14 09:42:36.225246073 +0000 UTC m=+0.158697220 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 14 09:42:36 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:42:36 np0005486759.ooo.test python3.9[264692]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:42:36 np0005486759.ooo.test sudo[264690]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:36 np0005486759.ooo.test sudo[264836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sivdcwvrrzxmosthgnqvcmsnlcntoakr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434956.6514573-649-142528189277870/AnsiballZ_edpm_container_manage.py
Oct 14 09:42:36 np0005486759.ooo.test sudo[264836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:37 np0005486759.ooo.test python3[264838]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:42:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59217 DF PROTO=TCP SPT=42664 DPT=9100 SEQ=3958993199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F906A010000000001030307) 
Oct 14 09:42:38 np0005486759.ooo.test podman[264852]: 2025-10-14 09:42:37.285413073 +0000 UTC m=+0.027673512 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Oct 14 09:42:39 np0005486759.ooo.test podman[264925]: 
Oct 14 09:42:39 np0005486759.ooo.test podman[264925]: 2025-10-14 09:42:39.211254684 +0000 UTC m=+0.079195910 container create 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Oct 14 09:42:39 np0005486759.ooo.test podman[264925]: 2025-10-14 09:42:39.178914196 +0000 UTC m=+0.046855482 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Oct 14 09:42:39 np0005486759.ooo.test python3[264838]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Oct 14 09:42:39 np0005486759.ooo.test sudo[264836]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:39.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:39.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:39 np0005486759.ooo.test sudo[265070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckxrhvyyuffjsjwfjiagihutudlhbdht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434959.5202992-657-175038387082704/AnsiballZ_stat.py
Oct 14 09:42:39 np0005486759.ooo.test sudo[265070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:39 np0005486759.ooo.test python3.9[265072]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:42:39 np0005486759.ooo.test sudo[265070]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:40 np0005486759.ooo.test sudo[265182]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdxidishdwrzyxiddjbiqopllwosrqmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434960.2301161-666-260580337608921/AnsiballZ_file.py
Oct 14 09:42:40 np0005486759.ooo.test sudo[265182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:40 np0005486759.ooo.test python3.9[265184]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:40 np0005486759.ooo.test sudo[265182]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:41 np0005486759.ooo.test sudo[265291]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtdhvkgfudaxdbntenzzrmodolaznarb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434960.8282778-666-226745350562649/AnsiballZ_copy.py
Oct 14 09:42:41 np0005486759.ooo.test sudo[265291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59218 DF PROTO=TCP SPT=42664 DPT=9100 SEQ=3958993199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9079C10000000001030307) 
Oct 14 09:42:41 np0005486759.ooo.test python3.9[265293]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760434960.8282778-666-226745350562649/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:42:41 np0005486759.ooo.test sudo[265291]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:41 np0005486759.ooo.test sudo[265346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itfaabhuirrggczmfiselqpczwvqopjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434960.8282778-666-226745350562649/AnsiballZ_systemd.py
Oct 14 09:42:41 np0005486759.ooo.test sudo[265346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:42 np0005486759.ooo.test python3.9[265348]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:42:42 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:42:42 np0005486759.ooo.test systemd-rc-local-generator[265370]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:42:42 np0005486759.ooo.test systemd-sysv-generator[265373]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:42:42 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:42:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33874 DF PROTO=TCP SPT=34316 DPT=9882 SEQ=1754994524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F907D680000000001030307) 
Oct 14 09:42:42 np0005486759.ooo.test sudo[265346]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:42 np0005486759.ooo.test sudo[265437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykbuszxkeuiwksxmctiohnhzlzeitfta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434960.8282778-666-226745350562649/AnsiballZ_systemd.py
Oct 14 09:42:42 np0005486759.ooo.test sudo[265437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:42 np0005486759.ooo.test python3.9[265439]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:42:43 np0005486759.ooo.test systemd-sysv-generator[265472]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:42:43 np0005486759.ooo.test systemd-rc-local-generator[265468]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Starting podman_exporter container...
Oct 14 09:42:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33875 DF PROTO=TCP SPT=34316 DPT=9882 SEQ=1754994524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9081810000000001030307) 
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:42:43 np0005486759.ooo.test podman[265480]: 2025-10-14 09:42:43.420860731 +0000 UTC m=+0.106671953 container init 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:42:43 np0005486759.ooo.test podman_exporter[265494]: ts=2025-10-14T09:42:43.432Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 14 09:42:43 np0005486759.ooo.test podman_exporter[265494]: ts=2025-10-14T09:42:43.432Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 14 09:42:43 np0005486759.ooo.test podman_exporter[265494]: ts=2025-10-14T09:42:43.432Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 14 09:42:43 np0005486759.ooo.test podman_exporter[265494]: ts=2025-10-14T09:42:43.432Z caller=handler.go:105 level=info collector=container
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:42:43 np0005486759.ooo.test podman[265480]: 2025-10-14 09:42:43.44486728 +0000 UTC m=+0.130678492 container start 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:42:43 np0005486759.ooo.test podman[265480]: podman_exporter
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Starting Podman API Service...
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Started podman_exporter container.
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: Started Podman API Service.
Oct 14 09:42:43 np0005486759.ooo.test sudo[265437]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:43 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:43Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct 14 09:42:43 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:43Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct 14 09:42:43 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:43Z" level=info msg="Setting parallel job count to 25"
Oct 14 09:42:43 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:43Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct 14 09:42:43 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:43Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Oct 14 09:42:43 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:42:43 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 14 09:42:43 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:43Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:42:43 np0005486759.ooo.test podman[265504]: 2025-10-14 09:42:43.503122329 +0000 UTC m=+0.052910858 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:42:43 np0005486759.ooo.test podman[265504]: 2025-10-14 09:42:43.533774952 +0000 UTC m=+0.083563481 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:42:43 np0005486759.ooo.test podman[265504]: unhealthy
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:42:43 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:42:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 09:42:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:44.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:44.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:44.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:42:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:44.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:42:44 np0005486759.ooo.test sudo[265649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txdjgqumogwcrlahjifjviakvaubyaay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434963.6750607-690-272404742333313/AnsiballZ_systemd.py
Oct 14 09:42:44 np0005486759.ooo.test sudo[265649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:44.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:44.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:42:45 np0005486759.ooo.test python3.9[265651]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: Stopping podman_exporter container...
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:42:43 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 2790 "" "Go-http-client/1.1"
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: libpod-8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.scope: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test podman[265655]: 2025-10-14 09:42:45.134840012 +0000 UTC m=+0.056308857 container died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.timer: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:42:45 np0005486759.ooo.test podman[265655]: 2025-10-14 09:42:45.177380747 +0000 UTC m=+0.098849582 container cleanup 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:42:45 np0005486759.ooo.test podman[265655]: podman_exporter
Oct 14 09:42:45 np0005486759.ooo.test podman[265668]: 2025-10-14 09:42:45.242397232 +0000 UTC m=+0.107848930 container cleanup 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-43dd12bb7c99a85b0d477bd354ea12651a5892876e47db1d4cfb4170c120cd54-merged.mount: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd-userdata-shm.mount: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:45 np0005486759.ooo.test systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 14 09:42:45 np0005486759.ooo.test podman[265682]: 2025-10-14 09:42:45.630602495 +0000 UTC m=+0.066903357 container cleanup 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:42:45 np0005486759.ooo.test podman[265682]: podman_exporter
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: Stopped podman_exporter container.
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: Starting podman_exporter container...
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ada07e432d0e43eebd550951648a1927a38ab08f9b982361ae15057deb14876d-merged.mount: Deactivated successfully.
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 09:42:46 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:42:47 np0005486759.ooo.test podman[265693]: 2025-10-14 09:42:47.011612556 +0000 UTC m=+0.721276249 container init 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:42:47 np0005486759.ooo.test podman_exporter[265707]: ts=2025-10-14T09:42:47.028Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 14 09:42:47 np0005486759.ooo.test podman_exporter[265707]: ts=2025-10-14T09:42:47.028Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 14 09:42:47 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:42:47 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 14 09:42:47 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:42:47 np0005486759.ooo.test podman_exporter[265707]: ts=2025-10-14T09:42:47.028Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 14 09:42:47 np0005486759.ooo.test podman_exporter[265707]: ts=2025-10-14T09:42:47.028Z caller=handler.go:105 level=info collector=container
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:42:47 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:47 np0005486759.ooo.test podman[265693]: 2025-10-14 09:42:47.098763331 +0000 UTC m=+0.808426984 container start 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:42:47 np0005486759.ooo.test podman[265693]: podman_exporter
Oct 14 09:42:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58287 DF PROTO=TCP SPT=44608 DPT=9105 SEQ=2301004359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9090410000000001030307) 
Oct 14 09:42:47 np0005486759.ooo.test podman[265717]: 2025-10-14 09:42:47.125896191 +0000 UTC m=+0.075618607 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:42:47 np0005486759.ooo.test podman[265717]: 2025-10-14 09:42:47.131035676 +0000 UTC m=+0.080758092 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:42:47 np0005486759.ooo.test podman[265717]: unhealthy
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: Started podman_exporter container.
Oct 14 09:42:47 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:47 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:42:47 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:42:47 np0005486759.ooo.test sudo[265649]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:48 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:42:48Z" level=error msg="Getting root fs size for \"0ad3a2694350330a0b322c26c724c85d944d360619e7457a823c7901d24bfba1\": unmounting layer e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df: replacing mount point \"/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged\": device or resource busy"
Oct 14 09:42:48 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:48 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:48 np0005486759.ooo.test sudo[265846]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tksvzupercyiqldytptivvdcgfmlsrkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434967.9607594-698-3179833731068/AnsiballZ_stat.py
Oct 14 09:42:48 np0005486759.ooo.test sudo[265846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:48 np0005486759.ooo.test python3.9[265848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:42:48 np0005486759.ooo.test sudo[265846]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:48 np0005486759.ooo.test sudo[265934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gluwjbqpltngttagclqewjdfmvjpnwfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434967.9607594-698-3179833731068/AnsiballZ_copy.py
Oct 14 09:42:48 np0005486759.ooo.test sudo[265934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:48 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:48 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:48 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:48 np0005486759.ooo.test python3.9[265936]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760434967.9607594-698-3179833731068/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:42:49 np0005486759.ooo.test sudo[265934]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b272b93e3e9b77af062c082df69449b0ad42f33081484c8b336852990e7bca40-merged.mount: Deactivated successfully.
Oct 14 09:42:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33877 DF PROTO=TCP SPT=34316 DPT=9882 SEQ=1754994524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9099410000000001030307) 
Oct 14 09:42:49 np0005486759.ooo.test sudo[266044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfhtqhdrauzjrftqqvkxoxpgzkotzdfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434969.2740993-715-267715379640913/AnsiballZ_container_config_data.py
Oct 14 09:42:49 np0005486759.ooo.test sudo[266044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 09:42:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ada07e432d0e43eebd550951648a1927a38ab08f9b982361ae15057deb14876d-merged.mount: Deactivated successfully.
Oct 14 09:42:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ada07e432d0e43eebd550951648a1927a38ab08f9b982361ae15057deb14876d-merged.mount: Deactivated successfully.
Oct 14 09:42:49 np0005486759.ooo.test python3.9[266046]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct 14 09:42:49 np0005486759.ooo.test sudo[266044]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:49.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:49.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:49.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:42:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:49.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:42:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:49.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:49 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:49.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:42:50 np0005486759.ooo.test sudo[266154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjqlxxfidvtxyklwdcufbytksvbfgsyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760434970.025034-724-104152665217002/AnsiballZ_container_config_hash.py
Oct 14 09:42:50 np0005486759.ooo.test sudo[266154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 09:42:50 np0005486759.ooo.test python3.9[266156]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:42:50 np0005486759.ooo.test sudo[266154]: pam_unix(sudo:session): session closed for user root
Oct 14 09:42:51 np0005486759.ooo.test sudo[266264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeeikrtfdeeumjcwppqgdazanpkxovqg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760434970.7614703-734-23781699183408/AnsiballZ_edpm_container_manage.py
Oct 14 09:42:51 np0005486759.ooo.test sudo[266264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:42:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:42:51 np0005486759.ooo.test python3[266266]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:42:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:42:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34001 DF PROTO=TCP SPT=48948 DPT=9102 SEQ=2283560242 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90A5810000000001030307) 
Oct 14 09:42:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:42:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:42:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:42:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:42:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:42:54.142 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:42:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:42:54.143 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:42:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:42:54.144 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:42:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f49b9fcb7527e4e06386bb74b403d49154983873c705746d0322d416fcfe3182-merged.mount: Deactivated successfully.
Oct 14 09:42:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b272b93e3e9b77af062c082df69449b0ad42f33081484c8b336852990e7bca40-merged.mount: Deactivated successfully.
Oct 14 09:42:54 np0005486759.ooo.test podman[266293]: 2025-10-14 09:42:54.206379841 +0000 UTC m=+0.078607832 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 09:42:54 np0005486759.ooo.test podman[266293]: 2025-10-14 09:42:54.239240996 +0000 UTC m=+0.111468927 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009)
Oct 14 09:42:54 np0005486759.ooo.test podman[266293]: unhealthy
Oct 14 09:42:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:54.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:54.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:54.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:42:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:54.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:42:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:54.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:54 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:54.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:42:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:42:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:42:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:42:55 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:42:55 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Failed with result 'exit-code'.
Oct 14 09:42:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:42:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34002 DF PROTO=TCP SPT=48948 DPT=9102 SEQ=2283560242 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90B5410000000001030307) 
Oct 14 09:42:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:42:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:42:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:42:59 np0005486759.ooo.test podman[266280]: 2025-10-14 09:42:53.484260417 +0000 UTC m=+0.043921340 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 14 09:42:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:59.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:59.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:42:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:59.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:42:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:59.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:42:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:59.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:42:59 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:42:59.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:43:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b38c1a94310de94fdaff9837ece5582692a37d83810e45311fea0b8d6975fc83-merged.mount: Deactivated successfully.
Oct 14 09:43:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b38c1a94310de94fdaff9837ece5582692a37d83810e45311fea0b8d6975fc83-merged.mount: Deactivated successfully.
Oct 14 09:43:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:43:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:43:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:43:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:43:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:43:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:43:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57759 DF PROTO=TCP SPT=47672 DPT=9100 SEQ=594404203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90D34A0000000001030307) 
Oct 14 09:43:04 np0005486759.ooo.test podman[266359]: 2025-10-14 09:43:04.500023503 +0000 UTC m=+2.122019181 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:43:04 np0005486759.ooo.test podman[266382]: 2025-10-14 09:43:04.589906127 +0000 UTC m=+0.668613269 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:43:04 np0005486759.ooo.test podman[266359]: 2025-10-14 09:43:04.593366688 +0000 UTC m=+2.215362406 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true)
Oct 14 09:43:04 np0005486759.ooo.test podman[266360]: 2025-10-14 09:43:04.600313841 +0000 UTC m=+2.224010204 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:43:04 np0005486759.ooo.test podman[266382]: 2025-10-14 09:43:04.61868648 +0000 UTC m=+0.697393602 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:43:04 np0005486759.ooo.test podman[266360]: 2025-10-14 09:43:04.640743098 +0000 UTC m=+2.264439471 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2)
Oct 14 09:43:04 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:04.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57760 DF PROTO=TCP SPT=47672 DPT=9100 SEQ=594404203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90D7420000000001030307) 
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:43:06 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:06 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:43:06 np0005486759.ooo.test podman[266441]: 2025-10-14 09:43:06.843391895 +0000 UTC m=+0.606649571 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:43:06 np0005486759.ooo.test podman[266441]: 2025-10-14 09:43:06.847842707 +0000 UTC m=+0.611100343 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible)
Oct 14 09:43:06 np0005486759.ooo.test podman[266453]: 2025-10-14 09:43:06.888475921 +0000 UTC m=+0.555672725 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:43:06 np0005486759.ooo.test podman[266415]: 2025-10-14 09:43:06.812552166 +0000 UTC m=+2.222547116 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 14 09:43:06 np0005486759.ooo.test podman[266453]: 2025-10-14 09:43:06.924331452 +0000 UTC m=+0.591528296 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:43:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:43:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57761 DF PROTO=TCP SPT=47672 DPT=9100 SEQ=594404203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90DF410000000001030307) 
Oct 14 09:43:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:43:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:43:08 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:43:08 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:43:08 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:09 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:09.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:09.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:09.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:09.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:09.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:09 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:09.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:09 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:09 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:09 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:09 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:09 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57762 DF PROTO=TCP SPT=47672 DPT=9100 SEQ=594404203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90EF010000000001030307) 
Oct 14 09:43:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16435 DF PROTO=TCP SPT=38794 DPT=9882 SEQ=4124304485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90F2980000000001030307) 
Oct 14 09:43:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149-merged.mount: Deactivated successfully.
Oct 14 09:43:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:43:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b38c1a94310de94fdaff9837ece5582692a37d83810e45311fea0b8d6975fc83-merged.mount: Deactivated successfully.
Oct 14 09:43:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16436 DF PROTO=TCP SPT=38794 DPT=9882 SEQ=4124304485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F90F6810000000001030307) 
Oct 14 09:43:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully.
Oct 14 09:43:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:43:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89-merged.mount: Deactivated successfully.
Oct 14 09:43:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89-merged.mount: Deactivated successfully.
Oct 14 09:43:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:43:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:43:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:14.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:14.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:14.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:14.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:14.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:14 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:14.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully.
Oct 14 09:43:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50689 DF PROTO=TCP SPT=40740 DPT=9105 SEQ=96285991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9105420000000001030307) 
Oct 14 09:43:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:17 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:17 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:43:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:17 np0005486759.ooo.test podman[266480]: 2025-10-14 09:43:17.948956962 +0000 UTC m=+0.070679528 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:43:17 np0005486759.ooo.test podman[266480]: 2025-10-14 09:43:17.985239396 +0000 UTC m=+0.106961972 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:43:17 np0005486759.ooo.test podman[266480]: unhealthy
Oct 14 09:43:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16438 DF PROTO=TCP SPT=38794 DPT=9882 SEQ=4124304485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F910E410000000001030307) 
Oct 14 09:43:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:19 np0005486759.ooo.test podman[266415]: 
Oct 14 09:43:19 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:19 np0005486759.ooo.test podman[266415]: 2025-10-14 09:43:19.465085587 +0000 UTC m=+14.875080587 container create 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 14 09:43:19 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:43:19 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:43:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:19.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:19.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:19.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:19.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:19.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:19 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:19.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:20 np0005486759.ooo.test python3[266266]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 14 09:43:20 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:43:20Z" level=error msg="Getting root fs size for \"17a1ea7777e0eaf7afbc337084967d8387027d7738a136a8a9179af455d42a92\": unmounting layer e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df: replacing mount point \"/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged\": device or resource busy"
Oct 14 09:43:21 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:21 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61256 DF PROTO=TCP SPT=57072 DPT=9102 SEQ=971862368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F911AC20000000001030307) 
Oct 14 09:43:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-91be3aaad6265d2bb7fe9d6d050be5bfb2e05615ab0638b195d7c1953f597149-merged.mount: Deactivated successfully.
Oct 14 09:43:23 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:23 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9353b4c9b77a60c02d5cd3c8f9b94918c7a607156d2f7e1365b30ffe1fa49c89-merged.mount: Deactivated successfully.
Oct 14 09:43:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully.
Oct 14 09:43:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:24.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:24.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:24.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:24.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:25.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:25.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:43:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6-merged.mount: Deactivated successfully.
Oct 14 09:43:26 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:26 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6-merged.mount: Deactivated successfully.
Oct 14 09:43:26 np0005486759.ooo.test sudo[266264]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:26 np0005486759.ooo.test podman[266526]: 2025-10-14 09:43:26.12917482 +0000 UTC m=+0.246073175 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct 14 09:43:26 np0005486759.ooo.test podman[266526]: 2025-10-14 09:43:26.157290732 +0000 UTC m=+0.274189107 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:43:26 np0005486759.ooo.test podman[266526]: unhealthy
Oct 14 09:43:26 np0005486759.ooo.test sudo[266650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcjgggrarcsksgrdrwydxhbedlldnnqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435006.2561743-742-90897182570097/AnsiballZ_stat.py
Oct 14 09:43:26 np0005486759.ooo.test sudo[266650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61257 DF PROTO=TCP SPT=57072 DPT=9102 SEQ=971862368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F912A810000000001030307) 
Oct 14 09:43:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:26.773 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:26.775 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:26 np0005486759.ooo.test python3.9[266652]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:43:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:26.797 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:26.798 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:43:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:26.798 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:43:26 np0005486759.ooo.test sudo[266650]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:43:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:43:26 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:43:26 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Failed with result 'exit-code'.
Oct 14 09:43:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:27.816 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:43:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:27.816 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:43:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:27.816 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:43:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:27.816 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:43:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:43:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:43:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:28 np0005486759.ooo.test sudo[266762]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klxixjxqzqrfrnghypmcldezkcxjldvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435007.056703-751-127240861060298/AnsiballZ_file.py
Oct 14 09:43:28 np0005486759.ooo.test sudo[266762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-41d6d78d48a59c2a92b7ebbd672b507950bf0a9c199b961ef8dec56e0bf4d10d-merged.mount: Deactivated successfully.
Oct 14 09:43:28 np0005486759.ooo.test python3.9[266764]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:43:28 np0005486759.ooo.test sudo[266762]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:43:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99-merged.mount: Deactivated successfully.
Oct 14 09:43:29 np0005486759.ooo.test sudo[266871]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gduqrlgyjkdvqpvrptuiohcxphentxln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435008.610409-751-238039287381434/AnsiballZ_copy.py
Oct 14 09:43:29 np0005486759.ooo.test sudo[266871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:29 np0005486759.ooo.test python3.9[266873]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435008.610409-751-238039287381434/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:43:29 np0005486759.ooo.test sudo[266871]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:29 np0005486759.ooo.test sudo[266926]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qetzvyjtteykoweijsfoigzbjwmapvzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435008.610409-751-238039287381434/AnsiballZ_systemd.py
Oct 14 09:43:29 np0005486759.ooo.test sudo[266926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:29 np0005486759.ooo.test python3.9[266928]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:43:29 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:43:29 np0005486759.ooo.test systemd-rc-local-generator[266950]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:43:29 np0005486759.ooo.test systemd-sysv-generator[266953]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:43:29 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:30 np0005486759.ooo.test sudo[266926]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:30 np0005486759.ooo.test sudo[267017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvnviclvlwtnbxsxlnxocnaglntoxxlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435008.610409-751-238039287381434/AnsiballZ_systemd.py
Oct 14 09:43:30 np0005486759.ooo.test sudo[267017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:30 np0005486759.ooo.test python3.9[267019]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:43:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:30 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.873 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.889 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.890 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.890 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.891 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.891 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.891 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.892 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.892 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.892 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.893 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.912 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.912 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.912 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.913 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:43:30 np0005486759.ooo.test systemd-sysv-generator[267051]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:43:30 np0005486759.ooo.test systemd-rc-local-generator[267047]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:43:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:30.986 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:43:31 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:43:31Z" level=error msg="Getting root fs size for \"2386c6750b0dfd17fba8abe41bffccede29b46bf69861f879469aa60afd7dc49\": getting diffsize of layer \"919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94\" and its parent \"948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca\": unmounting layer 948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca: replacing mount point \"/var/lib/containers/storage/overlay/948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca/merged\": device or resource busy"
Oct 14 09:43:31 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:31 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:31 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:31 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:31 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.064 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.065 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:43:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.134 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.135 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:43:31 np0005486759.ooo.test systemd[1]: Starting openstack_network_exporter container...
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.224 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.225 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.275 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.445 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.446 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12774MB free_disk=386.7216148376465GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.446 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.447 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.518 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.518 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.519 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.570 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.637 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.640 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:43:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:31.640 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:43:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:33 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:33 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:33 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:43:33 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6557bdf3e0a050cb0fea85c6d2636f32c9ec92db594c5952efcffc2b6f699532/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 14 09:43:33 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6557bdf3e0a050cb0fea85c6d2636f32c9ec92db594c5952efcffc2b6f699532/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 14 09:43:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:43:33 np0005486759.ooo.test podman[267069]: 2025-10-14 09:43:33.591746726 +0000 UTC m=+2.398108018 container init 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *bridge.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *coverage.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *datapath.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *iface.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *memory.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *ovnnorthd.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *ovn.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *ovsdbserver.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *pmd_perf.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *pmd_rxq.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: INFO    09:43:33 main.go:48: registering *vswitch.Collector
Oct 14 09:43:33 np0005486759.ooo.test openstack_network_exporter[267089]: NOTICE  09:43:33 main.go:82: listening on http://:9105/metrics
Oct 14 09:43:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:43:33 np0005486759.ooo.test podman[267069]: 2025-10-14 09:43:33.622889615 +0000 UTC m=+2.429250857 container start 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Oct 14 09:43:33 np0005486759.ooo.test podman[267069]: openstack_network_exporter
Oct 14 09:43:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-61ddfe988c2ff6f263641944ea2bd50466b1211d95dab96df436aa6600db66b6-merged.mount: Deactivated successfully.
Oct 14 09:43:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:43:34 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:43:34Z" level=error msg="Getting root fs size for \"254ec70fdcfd4480810cf0d91f6def813a6f961004b23ea455a93775ea33b49e\": getting diffsize of layer \"f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b\" and its parent \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\": unmounting layer 19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8: replacing mount point \"/var/lib/containers/storage/overlay/19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8/merged\": device or resource busy"
Oct 14 09:43:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6491 DF PROTO=TCP SPT=39170 DPT=9100 SEQ=1872951058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9148780000000001030307) 
Oct 14 09:43:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:34 np0005486759.ooo.test systemd[1]: Started openstack_network_exporter container.
Oct 14 09:43:34 np0005486759.ooo.test sudo[267017]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:34 np0005486759.ooo.test podman[267099]: 2025-10-14 09:43:34.637485191 +0000 UTC m=+1.007178269 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 14 09:43:34 np0005486759.ooo.test podman[267099]: 2025-10-14 09:43:34.67078362 +0000 UTC m=+1.040476678 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, architecture=x86_64)
Oct 14 09:43:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:35.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:35.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:35.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:35.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:35 np0005486759.ooo.test sudo[267228]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cufnglhyuzfkhpmxnjkofgmaweonlhrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435014.772924-775-57357722823645/AnsiballZ_systemd.py
Oct 14 09:43:35 np0005486759.ooo.test sudo[267228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:35.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:35.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6492 DF PROTO=TCP SPT=39170 DPT=9100 SEQ=1872951058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F914C810000000001030307) 
Oct 14 09:43:35 np0005486759.ooo.test python3.9[267230]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:43:35 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: Stopping openstack_network_exporter container...
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: libpod-60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.scope: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test podman[267234]: 2025-10-14 09:43:35.455560854 +0000 UTC m=+0.065992597 container died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.timer: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-261e46b7a6706b191f296a92dfc54b0ddf4ce0d7558859acd81c0ef7d979ba99-merged.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44-userdata-shm.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6557bdf3e0a050cb0fea85c6d2636f32c9ec92db594c5952efcffc2b6f699532-merged.mount: Deactivated successfully.
Oct 14 09:43:35 np0005486759.ooo.test podman[267234]: 2025-10-14 09:43:35.752821721 +0000 UTC m=+0.363253494 container cleanup 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350)
Oct 14 09:43:35 np0005486759.ooo.test podman[267234]: openstack_network_exporter
Oct 14 09:43:35 np0005486759.ooo.test podman[267247]: 2025-10-14 09:43:35.817273408 +0000 UTC m=+0.363729709 container cleanup 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 09:43:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:43:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6493 DF PROTO=TCP SPT=39170 DPT=9100 SEQ=1872951058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9154810000000001030307) 
Oct 14 09:43:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:43:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:43:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4-merged.mount: Deactivated successfully.
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 14 09:43:38 np0005486759.ooo.test podman[267262]: 2025-10-14 09:43:38.26794049 +0000 UTC m=+0.899384981 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 09:43:38 np0005486759.ooo.test podman[267260]: 2025-10-14 09:43:38.297785628 +0000 UTC m=+0.930109887 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:43:38 np0005486759.ooo.test podman[267260]: 2025-10-14 09:43:38.330449636 +0000 UTC m=+0.962773955 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:43:38 np0005486759.ooo.test podman[267291]: 2025-10-14 09:43:38.399727428 +0000 UTC m=+0.154458316 container cleanup 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible)
Oct 14 09:43:38 np0005486759.ooo.test podman[267291]: openstack_network_exporter
Oct 14 09:43:38 np0005486759.ooo.test podman[267262]: 2025-10-14 09:43:38.432234601 +0000 UTC m=+1.063679142 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Oct 14 09:43:38 np0005486759.ooo.test podman[267261]: 2025-10-14 09:43:38.521604408 +0000 UTC m=+1.150784226 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:43:38 np0005486759.ooo.test podman[267261]: 2025-10-14 09:43:38.556238959 +0000 UTC m=+1.185418717 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully.
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:43:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: Stopped openstack_network_exporter container.
Oct 14 09:43:39 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:39 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: Starting openstack_network_exporter container...
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:43:39 np0005486759.ooo.test podman[267337]: 2025-10-14 09:43:39.060072711 +0000 UTC m=+0.186630388 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 09:43:39 np0005486759.ooo.test podman[267336]: 2025-10-14 09:43:39.135986896 +0000 UTC m=+0.264274068 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 09:43:39 np0005486759.ooo.test podman[267337]: 2025-10-14 09:43:39.143369833 +0000 UTC m=+0.269927510 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 09:43:39 np0005486759.ooo.test podman[267336]: 2025-10-14 09:43:39.199263006 +0000 UTC m=+0.327550198 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:40.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:40.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:40.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:40.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:40.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:40.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6494 DF PROTO=TCP SPT=39170 DPT=9100 SEQ=1872951058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9164410000000001030307) 
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:41 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:41 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:43:41 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6557bdf3e0a050cb0fea85c6d2636f32c9ec92db594c5952efcffc2b6f699532/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 14 09:43:41 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6557bdf3e0a050cb0fea85c6d2636f32c9ec92db594c5952efcffc2b6f699532/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:43:41 np0005486759.ooo.test podman[267362]: 2025-10-14 09:43:41.805589152 +0000 UTC m=+2.784215174 container init 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *bridge.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *coverage.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *datapath.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *iface.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *memory.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *ovnnorthd.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *ovn.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *ovsdbserver.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *pmd_perf.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *pmd_rxq.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: INFO    09:43:41 main.go:48: registering *vswitch.Collector
Oct 14 09:43:41 np0005486759.ooo.test openstack_network_exporter[267388]: NOTICE  09:43:41 main.go:82: listening on http://:9105/metrics
Oct 14 09:43:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:43:41 np0005486759.ooo.test podman[267362]: 2025-10-14 09:43:41.835413919 +0000 UTC m=+2.814039911 container start 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container)
Oct 14 09:43:41 np0005486759.ooo.test podman[267362]: openstack_network_exporter
Oct 14 09:43:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52052 DF PROTO=TCP SPT=49656 DPT=9882 SEQ=1532142813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9167C70000000001030307) 
Oct 14 09:43:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52053 DF PROTO=TCP SPT=49656 DPT=9882 SEQ=1532142813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F916BC10000000001030307) 
Oct 14 09:43:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:43 np0005486759.ooo.test systemd[1]: Started openstack_network_exporter container.
Oct 14 09:43:43 np0005486759.ooo.test podman[267398]: 2025-10-14 09:43:43.737479384 +0000 UTC m=+1.899187003 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc.)
Oct 14 09:43:43 np0005486759.ooo.test sudo[267228]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:43 np0005486759.ooo.test podman[267398]: 2025-10-14 09:43:43.776280079 +0000 UTC m=+1.937987648 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6)
Oct 14 09:43:44 np0005486759.ooo.test sudo[267525]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuxgorfebwogwrzkodcodfuioceexgit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435023.9470236-783-37025596125132/AnsiballZ_find.py
Oct 14 09:43:44 np0005486759.ooo.test sudo[267525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:44 np0005486759.ooo.test python3.9[267527]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:43:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:43:44 np0005486759.ooo.test sudo[267525]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-40d714933e5b0d972646fbd568da7cc308b3f168911dd1b81f3db3e7d11dcd1a-merged.mount: Deactivated successfully.
Oct 14 09:43:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-40d714933e5b0d972646fbd568da7cc308b3f168911dd1b81f3db3e7d11dcd1a-merged.mount: Deactivated successfully.
Oct 14 09:43:44 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:43:44 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:44 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:45.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:45.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:45.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:45.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:45.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:45.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:45 np0005486759.ooo.test sudo[267635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sueudwizamvmccmospvbvzebmpraiufp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435024.8745213-793-104449936574622/AnsiballZ_podman_container_info.py
Oct 14 09:43:45 np0005486759.ooo.test sudo[267635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:45 np0005486759.ooo.test python3.9[267637]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct 14 09:43:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:43:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50823 DF PROTO=TCP SPT=35448 DPT=9105 SEQ=2373766349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F917A810000000001030307) 
Oct 14 09:43:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:43:47 np0005486759.ooo.test sudo[267635]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:47 np0005486759.ooo.test sudo[267758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksyqtdpbtdktexyfjogodssmgwftjcmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435027.490913-801-143751225239362/AnsiballZ_podman_container_exec.py
Oct 14 09:43:47 np0005486759.ooo.test sudo[267758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:48 np0005486759.ooo.test python3.9[267760]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:43:48 np0005486759.ooo.test systemd[1]: Started libpod-conmon-1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.scope.
Oct 14 09:43:48 np0005486759.ooo.test podman[267761]: 2025-10-14 09:43:48.232412667 +0000 UTC m=+0.102861137 container exec 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 14 09:43:48 np0005486759.ooo.test podman[267761]: 2025-10-14 09:43:48.264226691 +0000 UTC m=+0.134675111 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:43:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52055 DF PROTO=TCP SPT=49656 DPT=9882 SEQ=1532142813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9183810000000001030307) 
Oct 14 09:43:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:43:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:43:49 np0005486759.ooo.test sudo[267758]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:49 np0005486759.ooo.test podman[267790]: 2025-10-14 09:43:49.894284834 +0000 UTC m=+0.259495399 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:43:49 np0005486759.ooo.test podman[267790]: 2025-10-14 09:43:49.910308613 +0000 UTC m=+0.275519138 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:43:49 np0005486759.ooo.test podman[267790]: unhealthy
Oct 14 09:43:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-90434627a5b0158db55ab3b7ebef8c228ccf03571f4c140b821af91947b16dd4-merged.mount: Deactivated successfully.
Oct 14 09:43:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:50.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:50.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:50.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:43:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:50.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:50.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:43:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:50.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:43:50 np0005486759.ooo.test sudo[267919]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxzbipsiqrorakysunaqvdtpjrlxkkhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435029.998905-809-222356842215870/AnsiballZ_podman_container_exec.py
Oct 14 09:43:50 np0005486759.ooo.test sudo[267919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:43:50 np0005486759.ooo.test python3.9[267921]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: libpod-conmon-1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.scope: Deactivated successfully.
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: Started libpod-conmon-1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.scope.
Oct 14 09:43:50 np0005486759.ooo.test podman[267922]: 2025-10-14 09:43:50.950003507 +0000 UTC m=+0.391513286 container exec 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 14 09:43:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:50 np0005486759.ooo.test podman[267922]: 2025-10-14 09:43:50.982263204 +0000 UTC m=+0.423772913 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 14 09:43:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:43:51 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:43:51Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged: invalid argument"
Oct 14 09:43:51 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:43:51Z" level=error msg="Getting root fs size for \"3f4e17fbe1fbce44c81a667a68d3b48dfce4df38b64d4ac399998c44518159f4\": creating overlay mount to /var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/empty,upperdir=/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/diff,workdir=/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/work,nodev,metacopy=on\": no such file or directory"
Oct 14 09:43:51 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:51 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:51 np0005486759.ooo.test sudo[267919]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:43:52 np0005486759.ooo.test sudo[268057]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syidrqwabjjbzwchtzgfyfydohjqpjra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435031.9016192-817-178698507445574/AnsiballZ_file.py
Oct 14 09:43:52 np0005486759.ooo.test sudo[268057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:43:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-40d714933e5b0d972646fbd568da7cc308b3f168911dd1b81f3db3e7d11dcd1a-merged.mount: Deactivated successfully.
Oct 14 09:43:52 np0005486759.ooo.test systemd[1]: libpod-conmon-1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.scope: Deactivated successfully.
Oct 14 09:43:52 np0005486759.ooo.test python3.9[268059]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:43:52 np0005486759.ooo.test sudo[268057]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9811 DF PROTO=TCP SPT=48954 DPT=9102 SEQ=2330065748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9190010000000001030307) 
Oct 14 09:43:52 np0005486759.ooo.test sudo[268167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzmeoyumyctclnfdvbemvcdsiipqekhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435032.6427314-826-52125348782489/AnsiballZ_podman_container_info.py
Oct 14 09:43:52 np0005486759.ooo.test sudo[268167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-40d714933e5b0d972646fbd568da7cc308b3f168911dd1b81f3db3e7d11dcd1a-merged.mount: Deactivated successfully.
Oct 14 09:43:53 np0005486759.ooo.test python3.9[268169]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct 14 09:43:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:43:54.143 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:43:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:43:54.144 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:43:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:43:54.145 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:43:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:43:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-137c1921cf29a18f5788ee7ca89cb32d77b40e4f2bc3359cd1d75d04c15761c5-merged.mount: Deactivated successfully.
Oct 14 09:43:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-137c1921cf29a18f5788ee7ca89cb32d77b40e4f2bc3359cd1d75d04c15761c5-merged.mount: Deactivated successfully.
Oct 14 09:43:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:43:55.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:43:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9812 DF PROTO=TCP SPT=48954 DPT=9102 SEQ=2330065748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F919FC10000000001030307) 
Oct 14 09:43:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:43:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:43:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:43:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:43:57 np0005486759.ooo.test podman[268183]: 2025-10-14 09:43:57.219727625 +0000 UTC m=+0.228088517 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 09:43:57 np0005486759.ooo.test sudo[268167]: pam_unix(sudo:session): session closed for user root
Oct 14 09:43:57 np0005486759.ooo.test podman[268183]: 2025-10-14 09:43:57.256348526 +0000 UTC m=+0.264709358 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 09:43:57 np0005486759.ooo.test podman[268183]: unhealthy
Oct 14 09:43:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:43:57 np0005486759.ooo.test sudo[268307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gprjsskmuqypnqlcbxxjdqxzmnaveaws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435037.4334753-834-56938936398584/AnsiballZ_podman_container_exec.py
Oct 14 09:43:57 np0005486759.ooo.test sudo[268307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:43:57 np0005486759.ooo.test python3.9[268309]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:43:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:43:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:43:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:43:59 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:43:59 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:43:59 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Failed with result 'exit-code'.
Oct 14 09:43:59 np0005486759.ooo.test systemd[1]: Started libpod-conmon-d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.scope.
Oct 14 09:43:59 np0005486759.ooo.test podman[268310]: 2025-10-14 09:43:59.561116307 +0000 UTC m=+1.580472696 container exec d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:43:59 np0005486759.ooo.test podman[268310]: 2025-10-14 09:43:59.596470999 +0000 UTC m=+1.615827418 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:43:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:44:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:00.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:00.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:00.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:00.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:00.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:00.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:01 np0005486759.ooo.test sudo[268307]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:01 np0005486759.ooo.test sudo[268446]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzqtaqatmztktuqhvurfkjejtcjymibi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435041.4914184-842-52408577313733/AnsiballZ_podman_container_exec.py
Oct 14 09:44:01 np0005486759.ooo.test sudo[268446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:02 np0005486759.ooo.test python3.9[268448]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:02 np0005486759.ooo.test systemd[1]: libpod-conmon-d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.scope: Deactivated successfully.
Oct 14 09:44:02 np0005486759.ooo.test systemd[1]: Started libpod-conmon-d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.scope.
Oct 14 09:44:02 np0005486759.ooo.test podman[268449]: 2025-10-14 09:44:02.572484994 +0000 UTC m=+0.544610740 container exec d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 09:44:02 np0005486759.ooo.test podman[268449]: 2025-10-14 09:44:02.604425812 +0000 UTC m=+0.576551578 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 09:44:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:03 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:03 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:03 np0005486759.ooo.test systemd[1]: libpod-conmon-d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.scope: Deactivated successfully.
Oct 14 09:44:03 np0005486759.ooo.test sudo[268446]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:03 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:44:03Z" level=error msg="Getting root fs size for \"44137e33ae7d9eb666dda704d90f56f7a806eea4dd58a6d326886bf9e5f9d929\": unmounting layer e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df: replacing mount point \"/var/lib/containers/storage/overlay/e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df/merged\": device or resource busy"
Oct 14 09:44:03 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:03 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:03 np0005486759.ooo.test sudo[268587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdluatjsyijwyknpuwotvpgaiuxuyive ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435043.508878-850-123871077529736/AnsiballZ_file.py
Oct 14 09:44:03 np0005486759.ooo.test sudo[268587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:03 np0005486759.ooo.test python3.9[268589]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:44:03 np0005486759.ooo.test sudo[268587]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17334 DF PROTO=TCP SPT=57782 DPT=9100 SEQ=4273658542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91BDA80000000001030307) 
Oct 14 09:44:04 np0005486759.ooo.test sudo[268697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dllsqlbwwtdimztcfvrtltmiaeqrkstl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435044.1417396-859-143258404149710/AnsiballZ_podman_container_info.py
Oct 14 09:44:04 np0005486759.ooo.test sudo[268697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:04 np0005486759.ooo.test python3.9[268699]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct 14 09:44:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:05.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:05.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:05.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:05.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:05.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:05.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17335 DF PROTO=TCP SPT=57782 DPT=9100 SEQ=4273658542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91C1C10000000001030307) 
Oct 14 09:44:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:44:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-137c1921cf29a18f5788ee7ca89cb32d77b40e4f2bc3359cd1d75d04c15761c5-merged.mount: Deactivated successfully.
Oct 14 09:44:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:44:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:44:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17336 DF PROTO=TCP SPT=57782 DPT=9100 SEQ=4273658542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91C9C20000000001030307) 
Oct 14 09:44:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:44:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:44:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:44:08 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:08 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:08 np0005486759.ooo.test sudo[268697]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:09 np0005486759.ooo.test sudo[268820]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlnbiyfouqlrktonjdssjiorlknlnkfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435048.7482-867-208733877277756/AnsiballZ_podman_container_exec.py
Oct 14 09:44:09 np0005486759.ooo.test sudo[268820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:44:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:44:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:44:09 np0005486759.ooo.test podman[268823]: 2025-10-14 09:44:09.130850911 +0000 UTC m=+0.067756763 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 09:44:09 np0005486759.ooo.test podman[268825]: 2025-10-14 09:44:09.190565479 +0000 UTC m=+0.123478268 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 14 09:44:09 np0005486759.ooo.test podman[268824]: 2025-10-14 09:44:09.211159958 +0000 UTC m=+0.152590448 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:44:09 np0005486759.ooo.test podman[268823]: 2025-10-14 09:44:09.220234876 +0000 UTC m=+0.157140728 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 14 09:44:09 np0005486759.ooo.test podman[268825]: 2025-10-14 09:44:09.270742561 +0000 UTC m=+0.203655380 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 09:44:09 np0005486759.ooo.test python3.9[268822]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:09 np0005486759.ooo.test podman[268824]: 2025-10-14 09:44:09.321822043 +0000 UTC m=+0.263252533 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:44:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:10.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:10.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:10.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:10.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:10.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:10.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:44:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:44:10 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:44:10 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:10 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:10 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:44:10 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:44:10 np0005486759.ooo.test systemd[1]: Started libpod-conmon-895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.scope.
Oct 14 09:44:10 np0005486759.ooo.test podman[268882]: 2025-10-14 09:44:10.924695255 +0000 UTC m=+1.607828584 container exec 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251009)
Oct 14 09:44:10 np0005486759.ooo.test podman[268882]: 2025-10-14 09:44:10.95297085 +0000 UTC m=+1.636104149 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid)
Oct 14 09:44:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17337 DF PROTO=TCP SPT=57782 DPT=9100 SEQ=4273658542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91D9810000000001030307) 
Oct 14 09:44:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49898 DF PROTO=TCP SPT=51422 DPT=9882 SEQ=3230592373 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91DCF80000000001030307) 
Oct 14 09:44:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:44:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:44:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:12 np0005486759.ooo.test sudo[268820]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:12 np0005486759.ooo.test podman[268914]: 2025-10-14 09:44:12.897878573 +0000 UTC m=+0.527607200 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 09:44:12 np0005486759.ooo.test podman[268914]: 2025-10-14 09:44:12.907204018 +0000 UTC m=+0.536932665 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 09:44:12 np0005486759.ooo.test podman[268913]: 2025-10-14 09:44:12.952016919 +0000 UTC m=+0.583557392 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.vendor=CentOS)
Oct 14 09:44:12 np0005486759.ooo.test podman[268913]: 2025-10-14 09:44:12.958191649 +0000 UTC m=+0.589732122 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 09:44:13 np0005486759.ooo.test sudo[269058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umdbbpokvybgegwejiivacgoqyudptuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435052.996378-875-265514663012383/AnsiballZ_podman_container_exec.py
Oct 14 09:44:13 np0005486759.ooo.test sudo[269058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49899 DF PROTO=TCP SPT=51422 DPT=9882 SEQ=3230592373 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91E1010000000001030307) 
Oct 14 09:44:13 np0005486759.ooo.test python3.9[269060]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:13 np0005486759.ooo.test systemd[1]: libpod-conmon-895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.scope: Deactivated successfully.
Oct 14 09:44:13 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:44:13 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:44:13 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:13 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:13 np0005486759.ooo.test systemd[1]: Started libpod-conmon-895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.scope.
Oct 14 09:44:13 np0005486759.ooo.test podman[269061]: 2025-10-14 09:44:13.742634604 +0000 UTC m=+0.251128173 container exec 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:44:13 np0005486759.ooo.test podman[269061]: 2025-10-14 09:44:13.774457218 +0000 UTC m=+0.282950777 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:44:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:44:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c760467e4bcded5cc756a9c9f562bd63684ba8491ef5a99de42ba004cfc34cbd-merged.mount: Deactivated successfully.
Oct 14 09:44:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:44:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:15.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:15.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:15.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:15.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:15.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:15.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:16 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:16 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:16 np0005486759.ooo.test sudo[269058]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:16 np0005486759.ooo.test podman[269089]: 2025-10-14 09:44:16.244208427 +0000 UTC m=+1.442470476 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 09:44:16 np0005486759.ooo.test podman[269089]: 2025-10-14 09:44:16.283349574 +0000 UTC m=+1.481611613 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 14 09:44:16 np0005486759.ooo.test sudo[269215]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idabmpoklqyflxpcrgyjsocwvhfpitnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435056.354454-883-226244080806040/AnsiballZ_file.py
Oct 14 09:44:16 np0005486759.ooo.test sudo[269215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:16 np0005486759.ooo.test python3.9[269217]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:44:16 np0005486759.ooo.test sudo[269215]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8565 DF PROTO=TCP SPT=39482 DPT=9105 SEQ=596309565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91EFC20000000001030307) 
Oct 14 09:44:17 np0005486759.ooo.test sudo[269325]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icxmfkmfgtjpnxgxihpisdbytgtdeufi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435057.0531495-892-79276362101058/AnsiballZ_podman_container_info.py
Oct 14 09:44:17 np0005486759.ooo.test sudo[269325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:17 np0005486759.ooo.test python3.9[269327]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct 14 09:44:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:18 np0005486759.ooo.test systemd[1]: libpod-conmon-895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.scope: Deactivated successfully.
Oct 14 09:44:18 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:18 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:18 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:44:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:19 np0005486759.ooo.test sudo[269325]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49901 DF PROTO=TCP SPT=51422 DPT=9882 SEQ=3230592373 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F91F8C10000000001030307) 
Oct 14 09:44:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:20.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:20.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:20.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:20.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:20.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:20.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:20 np0005486759.ooo.test sudo[269447]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymzdajdxpshxmseehxyoncnrahparrvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435060.533215-900-217977833776909/AnsiballZ_podman_container_exec.py
Oct 14 09:44:20 np0005486759.ooo.test sudo[269447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:44:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:44:21 np0005486759.ooo.test python3.9[269449]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:21 np0005486759.ooo.test podman[269450]: 2025-10-14 09:44:21.164172806 +0000 UTC m=+0.058294375 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:44:21 np0005486759.ooo.test podman[269450]: 2025-10-14 09:44:21.19343246 +0000 UTC m=+0.087554039 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:44:21 np0005486759.ooo.test podman[269450]: unhealthy
Oct 14 09:44:21 np0005486759.ooo.test systemd[1]: Started libpod-conmon-b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.scope.
Oct 14 09:44:21 np0005486759.ooo.test podman[269460]: 2025-10-14 09:44:21.23592836 +0000 UTC m=+0.103282429 container exec b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:44:21 np0005486759.ooo.test podman[269460]: 2025-10-14 09:44:21.269401784 +0000 UTC m=+0.136755843 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Oct 14 09:44:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:44:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:44:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6240 DF PROTO=TCP SPT=60998 DPT=9102 SEQ=1726341803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9205410000000001030307) 
Oct 14 09:44:22 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:22 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:22 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:44:22 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:44:22 np0005486759.ooo.test sudo[269447]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-699db7de424016e8e9e98e1f1065f647b839bf0571fc867d8e43c26f1a3cbb5b-merged.mount: Deactivated successfully.
Oct 14 09:44:23 np0005486759.ooo.test sudo[269609]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uidpqmzdvvnghfqczqkilohanryyzlqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435063.029661-908-151143692638367/AnsiballZ_podman_container_exec.py
Oct 14 09:44:23 np0005486759.ooo.test sudo[269609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:23 np0005486759.ooo.test python3.9[269611]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.445 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.445 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.483 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.483 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b90a18f6-ef47-4a71-9900-c10b54054dcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.446098', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e311bfa-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '20a750b1563bd19e10d12dc8cb4471db2ec3315f2a2a8bba12d95589c5ad14b3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.446098', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e31271c-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '2c108cbc95bdf35c998681694e58cae87b91b1fca95a7348939a083f26d2430e'}]}, 'timestamp': '2025-10-14 09:44:24.484132', '_unique_id': 'a3a4c90267194546bbbe7ffc7a8322cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.492 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4f0b87d-f7ad-4911-ae25-cca14d35a112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.485801', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e328328-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '017d5b2c05575c4be4633236524231393df7f5f86302c665c8188ec8d760cd0e'}]}, 'timestamp': '2025-10-14 09:44:24.493110', '_unique_id': '5947de5c975b4eebb2b5768d6c0c4ab1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.494 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1dbe20e4-de85-4957-b4f8-f52332bde5a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.494579', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e32c7d4-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '7dfb29973d00eaf760e1766ea4ecdad12fdf280447c2f27af957aecaa7e44f6d'}]}, 'timestamp': '2025-10-14 09:44:24.494803', '_unique_id': '450564a3bd4a44d2a9b02fbbdd20729b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.496 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.496 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6467e6a-0da3-47d2-8fa6-031307940430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.495983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e330046-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '12478d89226a03ae8b7c1977b183833a5ba277f0dddc76b7f9bcb95bf0eec189'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.495983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e330afa-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '78132fccf50649788ab1cf38fb6e413b93693f0d07b17c55db4330456e47ba3b'}]}, 'timestamp': '2025-10-14 09:44:24.496542', '_unique_id': 'caeeeb4e99d24c829b2c4761e5bb5746'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.497 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 48200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22f88a0e-ed96-47c4-8a97-4d1d11ff9501', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48200000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:44:24.497628', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5e37ce96-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.722267533, 'message_signature': 'd25d61a92df0678dea1e4d5c7721a8e68a98d57f3f48e1cc6dda9822627f50e1'}]}, 'timestamp': '2025-10-14 09:44:24.527850', '_unique_id': 'a79abed8cb2441739f24faad4d2ecd9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.528 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.529 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.529 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54fc3b4d-bc19-4f6b-bba6-96dce90831a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.529841', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e382bf2-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': 'd18988989dfaa4a754ed4fa6633cd70f55a4ba17705419963d5ae59f1885af73'}]}, 'timestamp': '2025-10-14 09:44:24.530192', '_unique_id': 'b98e82894bfd46abb3c7c8da32e983e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.552 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31129600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.552 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1262745-414b-48b5-a5ce-e06dddc11815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31129600, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.531631', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e3b8da6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.726695289, 'message_signature': '34e76fe91bd1a24bcea8ef7c44eadea50ac8bbb376b041f8e3ee2bbd7bee104b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.531631', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e3b9666-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.726695289, 'message_signature': '1e84ee8d36a74e891968a2412191f07eb7ba86005870a5b928a89e5cec531d91'}]}, 'timestamp': '2025-10-14 09:44:24.552505', '_unique_id': '69c11023afc44b91aedfec8d014b1647'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 1288814026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.553 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 10812347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0dec6f-cfc0-42a3-ac1b-9e089994a8a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1288814026, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.553706', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e3bcd52-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '1b5d98c61125d56b65b7f87cf5ebde099c3589854ad739f16d85863d7b5daf0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10812347, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.553706', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e3bd554-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '787e99fd92cf32f68d38e4519e43f197bb12232d2dbb58747d638311bbbeeb28'}]}, 'timestamp': '2025-10-14 09:44:24.554112', '_unique_id': 'ef631391c6a94faab48c6521275fb473'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50c24b65-0982-490f-acef-e45516a44404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.555136', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e3c0524-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '7feceb7d31e5a902cba16f9aed2ff7e62226512099e8a135a30c4fa16ffad8ac'}]}, 'timestamp': '2025-10-14 09:44:24.555348', '_unique_id': 'd200d08bbf604ef993fbdf65ebdf044c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '896fc41b-ea9f-4a4e-bc98-d3b92f26dd54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.556312', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e3c32f6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '6386b5b93fabe55782077db13a4039bdb234b7e56f490a8c413b044651962b90'}]}, 'timestamp': '2025-10-14 09:44:24.556519', '_unique_id': '9aa98d9984034c128da8c0656881e807'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.557 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.557 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddf056b9-2b04-4071-ba2f-b6817ebd2646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.557474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e3c6046-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.726695289, 'message_signature': 'ff093274c13419553386226b11f03934e0cbee7293ba6106cfe8c3a77b826052'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.557474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e3c6744-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.726695289, 'message_signature': '92734850fe5614f17753aa73663a4c2042cbb0fe211ab5699d902a88b70ed1ee'}]}, 'timestamp': '2025-10-14 09:44:24.557845', '_unique_id': 'f33e22553aba4f8e843c23836b33fd43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.558 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc0db9c8-c288-49cd-8bea-e73374e08b4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.558847', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e3c9692-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '52f5e5cee50a47a5788483321cb678ed2ade089f751ccf875fb5666e9c11e38f'}]}, 'timestamp': '2025-10-14 09:44:24.559081', '_unique_id': '6c1e8e7314db43c7bdd5225ef5e178cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0b56383-0cb4-4a4a-b2bd-e1ff2761ab13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.560048', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e3cc4dc-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': 'dc973618d6a687ef9866a2069be97b9c2510663b44a17751f2e3c35b186bab9c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.560048', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e3ccbee-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '6d2dfe3cb4221a4fbbfd03ac6d074aa0fda5f5207e9eb164d3b1d07e99037bdf'}]}, 'timestamp': '2025-10-14 09:44:24.560424', '_unique_id': 'f5da4473d1a84b1b9f295add8b434631'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.561 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.561 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.561 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d10bdee-7883-4ebb-b3db-9b222c74705c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.561429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e3cfad8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.726695289, 'message_signature': '6c723cfcf677b1b4b816ef27678ad6470b97dc6828ca44845baa91a5f004f14c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.561429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e3d01d6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.726695289, 'message_signature': '443c7db5530030eb37545d801dfc44cc79295632929e2b5661efaf3be37c8f85'}]}, 'timestamp': '2025-10-14 09:44:24.561802', '_unique_id': '106453fb92e4444f9273c2ab30ab9d8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.562 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 513177663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 75228955 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '730ebb5b-03f5-4445-8737-4c5a793580b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513177663, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.562803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e3d305c-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': 'f6461dc1bce7d44f17dc9a5d03197294858db113f33c264d5cdecefbb4c1d462'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75228955, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.562803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e3d3818-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '7d4f969dcf102d35bd84fa4a3d597f6332e3897d36e39548a2c227e9d4a1e2d8'}]}, 'timestamp': '2025-10-14 09:44:24.563191', '_unique_id': '2a7f109d9863496592827582f41d76cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b218eab4-d0a0-496d-95db-1e82562e766d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.564167', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e3d65c2-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '7a89fc0a009b46be0907e98929819f3b6662a3ff21549fa194398a52817a0967'}]}, 'timestamp': '2025-10-14 09:44:24.564372', '_unique_id': 'cb6b51bfa42e466bb15705e74a58ceb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.564 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 9773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac18073-b11a-48cb-a318-d890274a167d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9773, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.565313', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e3d9290-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': 'dab2a6c657075cd8b2afc28e676af72580f6c189d1153b966f7ee926f30c7c8a'}]}, 'timestamp': '2025-10-14 09:44:24.565520', '_unique_id': '3ddd9c2520ae4e39aaf859ff56efb4e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.565 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.566 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.566 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0104e2ea-de10-42bc-b2cd-36aae3b333b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.566515', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e3dc18e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '625d9a31e22132a24276cb193f3dd3e5375919261871cc2533c2345eebfd5ba5'}]}, 'timestamp': '2025-10-14 09:44:24.566725', '_unique_id': '243cfe307d39451683949ecd46846448'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.567 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '092c41cf-8eb4-49ad-afac-67b6fcdb379f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:44:24.567787', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '5e3df33e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.680877218, 'message_signature': '96ca3daf7b49bb1aea46e62391a25bbbcf98902ed1cdff58f14965f826a59e89'}]}, 'timestamp': '2025-10-14 09:44:24.568011', '_unique_id': 'f73042aa5fc041558d09988fab3fe133'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.568 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4783c142-86d0-4b27-8e2e-186865114e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 591, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:44:24.568943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e3e2156-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': 'f121959a31b5d0598077eeb0f2fd7f27c348d2a3e3ff207c0d6336d16c01da79'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:44:24.568943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e3e285e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.641176653, 'message_signature': '9b829c25368c3086bd4762049798e48bf748df21428d36c5103e7fb89f8b286e'}]}, 'timestamp': '2025-10-14 09:44:24.569343', '_unique_id': '5e381d3e3d554cdd8b500a260c528e27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.569 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.570 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.570 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.570 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 52.17578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54a9e17b-41a8-4bc7-a569-694a321fd26f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.17578125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:44:24.570417', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5e3e59e6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11084.722267533, 'message_signature': '5548a889b9c1f8032ba9f759dbbc323dd0bfc05a8b7a0e59895bad16b8a441be'}]}, 'timestamp': '2025-10-14 09:44:24.570618', '_unique_id': '2d619a81b6104612aa2b02196a03fbc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:44:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:44:24.571 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:44:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 09:44:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 09:44:25 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:25 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:25 np0005486759.ooo.test systemd[1]: libpod-conmon-b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.scope: Deactivated successfully.
Oct 14 09:44:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.scope.
Oct 14 09:44:25 np0005486759.ooo.test podman[269612]: 2025-10-14 09:44:25.268249248 +0000 UTC m=+1.776689699 container exec b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 14 09:44:25 np0005486759.ooo.test podman[269612]: 2025-10-14 09:44:25.296612145 +0000 UTC m=+1.805052616 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:44:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:25.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6241 DF PROTO=TCP SPT=60998 DPT=9102 SEQ=1726341803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9215010000000001030307) 
Oct 14 09:44:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:27 np0005486759.ooo.test sudo[269609]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:27 np0005486759.ooo.test sudo[269748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrlkacwycssgfwbrxzguofqdtuoffxoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435067.459086-916-136622649485737/AnsiballZ_file.py
Oct 14 09:44:27 np0005486759.ooo.test sudo[269748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:27 np0005486759.ooo.test python3.9[269750]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:44:27 np0005486759.ooo.test sudo[269748]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:44:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c760467e4bcded5cc756a9c9f562bd63684ba8491ef5a99de42ba004cfc34cbd-merged.mount: Deactivated successfully.
Oct 14 09:44:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c760467e4bcded5cc756a9c9f562bd63684ba8491ef5a99de42ba004cfc34cbd-merged.mount: Deactivated successfully.
Oct 14 09:44:28 np0005486759.ooo.test systemd[1]: libpod-conmon-b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.scope: Deactivated successfully.
Oct 14 09:44:28 np0005486759.ooo.test sudo[269858]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzhjrittcxhyrtcirfamqfsbvwburzjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435068.1917257-925-165935771378790/AnsiballZ_podman_container_info.py
Oct 14 09:44:28 np0005486759.ooo.test sudo[269858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:28 np0005486759.ooo.test python3.9[269860]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct 14 09:44:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:44:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:30.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:30.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:30.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:30.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:30.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:30.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:30 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:30 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:30 np0005486759.ooo.test podman[269874]: 2025-10-14 09:44:30.943547933 +0000 UTC m=+0.567841152 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:44:30 np0005486759.ooo.test podman[269874]: 2025-10-14 09:44:30.972914491 +0000 UTC m=+0.597207670 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct 14 09:44:30 np0005486759.ooo.test podman[269874]: unhealthy
Oct 14 09:44:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.642 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.642 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.643 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.643 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.814 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.814 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.814 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:44:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:31.815 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.265 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.287 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.287 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.287 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.288 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.288 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.289 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.289 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.289 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.289 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.290 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.310 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.311 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.311 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.311 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.386 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.454 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.456 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.519 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.520 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:44:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.589 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.592 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.647 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.801 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.803 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12783MB free_disk=386.72205352783203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.804 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.804 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.874 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.875 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.875 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:44:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:32 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.918 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:44:32 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Failed with result 'exit-code'.
Oct 14 09:44:32 np0005486759.ooo.test sudo[269858]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.934 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.936 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:44:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:32.936 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:44:33 np0005486759.ooo.test sudo[270011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uewomtxugzguigbjpkcyalxtprjzenxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435073.101367-933-87675130906710/AnsiballZ_podman_container_exec.py
Oct 14 09:44:33 np0005486759.ooo.test sudo[270011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:33 np0005486759.ooo.test python3.9[270013]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:33 np0005486759.ooo.test systemd[1]: Started libpod-conmon-f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.scope.
Oct 14 09:44:33 np0005486759.ooo.test podman[270014]: 2025-10-14 09:44:33.670675144 +0000 UTC m=+0.074612893 container exec f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:44:33 np0005486759.ooo.test podman[270014]: 2025-10-14 09:44:33.702200828 +0000 UTC m=+0.106138577 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:44:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:34 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2932 DF PROTO=TCP SPT=51232 DPT=9100 SEQ=195022179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9232D80000000001030307) 
Oct 14 09:44:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 09:44:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b917dec3e2eef93d51865cb5ed3798b1af37c60222c9bce06119f7a2c331ff68-merged.mount: Deactivated successfully.
Oct 14 09:44:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b917dec3e2eef93d51865cb5ed3798b1af37c60222c9bce06119f7a2c331ff68-merged.mount: Deactivated successfully.
Oct 14 09:44:35 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:35 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:35 np0005486759.ooo.test systemd[1]: libpod-conmon-f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.scope: Deactivated successfully.
Oct 14 09:44:35 np0005486759.ooo.test sudo[270011]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:35 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2933 DF PROTO=TCP SPT=51232 DPT=9100 SEQ=195022179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9236C10000000001030307) 
Oct 14 09:44:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:35.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:35.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:35.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:35.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:35.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:35.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:35 np0005486759.ooo.test sudo[270150]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwisyrghduehrirjxwwujpszzebhgwzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435075.3545635-941-38032616526733/AnsiballZ_podman_container_exec.py
Oct 14 09:44:35 np0005486759.ooo.test sudo[270150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:35 np0005486759.ooo.test python3.9[270152]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:35 np0005486759.ooo.test systemd[1]: Started libpod-conmon-f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.scope.
Oct 14 09:44:35 np0005486759.ooo.test podman[270153]: 2025-10-14 09:44:35.943675265 +0000 UTC m=+0.091578883 container exec f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:44:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:35 np0005486759.ooo.test podman[270153]: 2025-10-14 09:44:35.975336383 +0000 UTC m=+0.123240031 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:44:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:44:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:44:36 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:44:36 np0005486759.ooo.test sudo[270150]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:36 np0005486759.ooo.test sudo[270288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sslrqizehtnxudiawmrqmksnkoiwtjke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435076.3981848-949-140048554178210/AnsiballZ_file.py
Oct 14 09:44:36 np0005486759.ooo.test sudo[270288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:36 np0005486759.ooo.test python3.9[270290]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:44:36 np0005486759.ooo.test sudo[270288]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:44:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:44:37 np0005486759.ooo.test systemd[1]: libpod-conmon-f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.scope: Deactivated successfully.
Oct 14 09:44:37 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:37 np0005486759.ooo.test sudo[270398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyfacwdoilvjsqjijbzpaclbmxjmajqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435077.0957925-958-150833838962376/AnsiballZ_podman_container_info.py
Oct 14 09:44:37 np0005486759.ooo.test sudo[270398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:37 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2934 DF PROTO=TCP SPT=51232 DPT=9100 SEQ=195022179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F923EC10000000001030307) 
Oct 14 09:44:37 np0005486759.ooo.test python3.9[270400]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct 14 09:44:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-699db7de424016e8e9e98e1f1065f647b839bf0571fc867d8e43c26f1a3cbb5b-merged.mount: Deactivated successfully.
Oct 14 09:44:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:40.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:41 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2935 DF PROTO=TCP SPT=51232 DPT=9100 SEQ=195022179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F924E810000000001030307) 
Oct 14 09:44:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:44:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:44:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:44:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 09:44:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 09:44:41 np0005486759.ooo.test sudo[270398]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:41 np0005486759.ooo.test podman[270415]: 2025-10-14 09:44:41.786845793 +0000 UTC m=+0.412316904 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:44:41 np0005486759.ooo.test podman[270415]: 2025-10-14 09:44:41.81714908 +0000 UTC m=+0.442620191 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:44:41 np0005486759.ooo.test podman[270416]: 2025-10-14 09:44:41.831100746 +0000 UTC m=+0.451716348 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 09:44:41 np0005486759.ooo.test podman[270416]: 2025-10-14 09:44:41.841696341 +0000 UTC m=+0.462311913 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct 14 09:44:41 np0005486759.ooo.test podman[270414]: 2025-10-14 09:44:41.890239145 +0000 UTC m=+0.518229973 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 14 09:44:41 np0005486759.ooo.test podman[270414]: 2025-10-14 09:44:41.925641809 +0000 UTC m=+0.553632627 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:44:42 np0005486759.ooo.test sudo[270587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egfpfuzsitpmkjnyudtrejryazfhwcas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435081.9757156-966-85855301907485/AnsiballZ_podman_container_exec.py
Oct 14 09:44:42 np0005486759.ooo.test sudo[270587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:42 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54471 DF PROTO=TCP SPT=42992 DPT=9882 SEQ=227353523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9252280000000001030307) 
Oct 14 09:44:42 np0005486759.ooo.test python3.9[270589]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:44:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-7e3b7dbefcbb84080782963e37dee9e7b27d279c0d8fc921be0c707bdde182ef-merged.mount: Deactivated successfully.
Oct 14 09:44:43 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54472 DF PROTO=TCP SPT=42992 DPT=9882 SEQ=227353523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9256420000000001030307) 
Oct 14 09:44:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:44:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:44:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:44:43 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:43 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:43 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:44:43 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:44:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:45.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:44:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:45.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:45.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:44:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:45.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:45.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:44:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:45.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:44:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:44:46 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:44:46 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:46 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:46 np0005486759.ooo.test systemd[1]: Started libpod-conmon-347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.scope.
Oct 14 09:44:46 np0005486759.ooo.test podman[270590]: 2025-10-14 09:44:46.496079076 +0000 UTC m=+4.010948344 container exec 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:44:46 np0005486759.ooo.test podman[270601]: 2025-10-14 09:44:46.514035676 +0000 UTC m=+2.708695149 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 09:44:46 np0005486759.ooo.test podman[270601]: 2025-10-14 09:44:46.552720529 +0000 UTC m=+2.747379992 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:44:46 np0005486759.ooo.test podman[270590]: 2025-10-14 09:44:46.581786748 +0000 UTC m=+4.096656026 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:44:46 np0005486759.ooo.test podman[270602]: 2025-10-14 09:44:46.554903525 +0000 UTC m=+2.769172988 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 09:44:46 np0005486759.ooo.test podman[270602]: 2025-10-14 09:44:46.637450151 +0000 UTC m=+2.851719654 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:44:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:47 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31858 DF PROTO=TCP SPT=45598 DPT=9105 SEQ=4216612537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9265010000000001030307) 
Oct 14 09:44:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test sudo[270587]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:48 np0005486759.ooo.test podman[270655]: 2025-10-14 09:44:48.478781436 +0000 UTC m=+0.287919709 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 14 09:44:48 np0005486759.ooo.test podman[270655]: 2025-10-14 09:44:48.515288283 +0000 UTC m=+0.324426546 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 14 09:44:48 np0005486759.ooo.test sudo[270781]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayfrpmvafbtxncpasauqogamcmrxlnjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435088.5676918-974-183086302773805/AnsiballZ_podman_container_exec.py
Oct 14 09:44:48 np0005486759.ooo.test sudo[270781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:48 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:48 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test systemd[1]: libpod-conmon-347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.scope: Deactivated successfully.
Oct 14 09:44:48 np0005486759.ooo.test python3.9[270783]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:49 np0005486759.ooo.test systemd[1]: Started libpod-conmon-347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.scope.
Oct 14 09:44:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:49 np0005486759.ooo.test podman[270784]: 2025-10-14 09:44:49.073448679 +0000 UTC m=+0.077111638 container exec 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:44:49 np0005486759.ooo.test podman[270784]: 2025-10-14 09:44:49.106412798 +0000 UTC m=+0.110075757 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:44:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54474 DF PROTO=TCP SPT=42992 DPT=9882 SEQ=227353523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F926E010000000001030307) 
Oct 14 09:44:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:50.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:50.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6b7ccf96a28197636c7a5b8f45056e04db2357d7c2dc122633e916788515691d-merged.mount: Deactivated successfully.
Oct 14 09:44:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b917dec3e2eef93d51865cb5ed3798b1af37c60222c9bce06119f7a2c331ff68-merged.mount: Deactivated successfully.
Oct 14 09:44:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b917dec3e2eef93d51865cb5ed3798b1af37c60222c9bce06119f7a2c331ff68-merged.mount: Deactivated successfully.
Oct 14 09:44:50 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:50 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:50 np0005486759.ooo.test systemd[1]: libpod-conmon-347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.scope: Deactivated successfully.
Oct 14 09:44:50 np0005486759.ooo.test sudo[270781]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:51 np0005486759.ooo.test sudo[270918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnpicqpcyvvxsuuqcqalkukqkynwying ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435091.1289876-982-176531676615953/AnsiballZ_file.py
Oct 14 09:44:51 np0005486759.ooo.test sudo[270918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:51 np0005486759.ooo.test python3.9[270920]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:44:51 np0005486759.ooo.test sudo[270918]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:52 np0005486759.ooo.test sudo[271028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsdgmahthimduyycyzxtriaspnpkltxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435091.8276272-991-207016530594752/AnsiballZ_podman_container_info.py
Oct 14 09:44:52 np0005486759.ooo.test sudo[271028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:44:52 np0005486759.ooo.test python3.9[271030]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct 14 09:44:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48661 DF PROTO=TCP SPT=47164 DPT=9102 SEQ=3359132338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F927A410000000001030307) 
Oct 14 09:44:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:44:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:44:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:44:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:44:54.145 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:44:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:44:54.146 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:44:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:44:54.148 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:44:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:44:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25-merged.mount: Deactivated successfully.
Oct 14 09:44:54 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:54 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:54 np0005486759.ooo.test podman[271042]: 2025-10-14 09:44:54.665024928 +0000 UTC m=+1.292119784 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:44:54 np0005486759.ooo.test podman[271042]: 2025-10-14 09:44:54.669503318 +0000 UTC m=+1.296598174 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:44:54 np0005486759.ooo.test podman[271042]: unhealthy
Oct 14 09:44:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:44:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:55.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:44:55.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:44:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48662 DF PROTO=TCP SPT=47164 DPT=9102 SEQ=3359132338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F928A020000000001030307) 
Oct 14 09:44:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:44:57 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:57 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:44:57 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:44:57 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:44:57 np0005486759.ooo.test sudo[271028]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:57 np0005486759.ooo.test sudo[271171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojkosskvsfxzhxfvxlygxosglfkunzhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435097.311953-999-118395908513586/AnsiballZ_podman_container_exec.py
Oct 14 09:44:57 np0005486759.ooo.test sudo[271171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:44:57 np0005486759.ooo.test python3.9[271173]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:44:57 np0005486759.ooo.test systemd[1]: Started libpod-conmon-8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.scope.
Oct 14 09:44:57 np0005486759.ooo.test podman[271174]: 2025-10-14 09:44:57.960898247 +0000 UTC m=+0.099382953 container exec 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:44:57 np0005486759.ooo.test podman[271174]: 2025-10-14 09:44:57.989213881 +0000 UTC m=+0.127698577 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:44:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:44:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-7e3b7dbefcbb84080782963e37dee9e7b27d279c0d8fc921be0c707bdde182ef-merged.mount: Deactivated successfully.
Oct 14 09:44:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:44:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:44:59 np0005486759.ooo.test sudo[271171]: pam_unix(sudo:session): session closed for user root
Oct 14 09:44:59 np0005486759.ooo.test sudo[271310]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gweleswsuvyrmipqyhlhlwcldgvktpvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435099.3395789-1007-255037062683021/AnsiballZ_podman_container_exec.py
Oct 14 09:44:59 np0005486759.ooo.test sudo[271310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:44:59 np0005486759.ooo.test python3.9[271312]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:45:00 np0005486759.ooo.test rsyslogd[758]: imjournal: 2531 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Oct 14 09:45:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:00.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:00.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:01 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:01 np0005486759.ooo.test systemd[1]: libpod-conmon-8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.scope: Deactivated successfully.
Oct 14 09:45:01 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:01 np0005486759.ooo.test systemd[1]: Started libpod-conmon-8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.scope.
Oct 14 09:45:01 np0005486759.ooo.test podman[271313]: 2025-10-14 09:45:01.898679284 +0000 UTC m=+2.025989222 container exec 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:45:01 np0005486759.ooo.test podman[271313]: 2025-10-14 09:45:01.926784351 +0000 UTC m=+2.054094289 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:45:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:45:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:03 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:03 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:03 np0005486759.ooo.test systemd[1]: libpod-conmon-8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.scope: Deactivated successfully.
Oct 14 09:45:03 np0005486759.ooo.test sudo[271310]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:03 np0005486759.ooo.test podman[271343]: 2025-10-14 09:45:03.891356674 +0000 UTC m=+0.517014660 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Oct 14 09:45:03 np0005486759.ooo.test podman[271343]: 2025-10-14 09:45:03.927651967 +0000 UTC m=+0.553309963 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:45:04 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23160 DF PROTO=TCP SPT=40886 DPT=9100 SEQ=471976819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92A8080000000001030307) 
Oct 14 09:45:04 np0005486759.ooo.test sudo[271469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-navfhkkphjpyhdzzmksysporjpszdxcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435104.0226138-1015-119835059054838/AnsiballZ_file.py
Oct 14 09:45:04 np0005486759.ooo.test sudo[271469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:04 np0005486759.ooo.test python3.9[271471]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:04 np0005486759.ooo.test sudo[271469]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:04 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:04 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:04 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:45:04 np0005486759.ooo.test sudo[271579]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqkquwlrzwawwwyijmnnabjurkmegsus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435104.648514-1024-12611995153132/AnsiballZ_podman_container_info.py
Oct 14 09:45:04 np0005486759.ooo.test sudo[271579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:05 np0005486759.ooo.test python3.9[271581]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct 14 09:45:05 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23161 DF PROTO=TCP SPT=40886 DPT=9100 SEQ=471976819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92AC010000000001030307) 
Oct 14 09:45:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:05.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:05.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:45:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-2a1f8014015fac1f54a9bc082942615ea4555a297c6729790a93ba597cfe8e5d-merged.mount: Deactivated successfully.
Oct 14 09:45:07 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23162 DF PROTO=TCP SPT=40886 DPT=9100 SEQ=471976819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92B4010000000001030307) 
Oct 14 09:45:07 np0005486759.ooo.test sudo[271579]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:08 np0005486759.ooo.test sudo[271702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpeymqeyddasrblxwjausshbiaiczjqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435107.8197517-1032-273505365134788/AnsiballZ_podman_container_exec.py
Oct 14 09:45:08 np0005486759.ooo.test sudo[271702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:08 np0005486759.ooo.test python3.9[271704]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:45:08 np0005486759.ooo.test systemd[1]: Started libpod-conmon-60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.scope.
Oct 14 09:45:08 np0005486759.ooo.test podman[271705]: 2025-10-14 09:45:08.336143886 +0000 UTC m=+0.092067624 container exec 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, distribution-scope=public)
Oct 14 09:45:08 np0005486759.ooo.test podman[271705]: 2025-10-14 09:45:08.365177442 +0000 UTC m=+0.121101250 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Oct 14 09:45:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:45:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:45:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:10 np0005486759.ooo.test sudo[271702]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:10.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-4c5dce4b64ecba9518378f20b244143b3b2cf38f6212cc72879147a1b19f6a25-merged.mount: Deactivated successfully.
Oct 14 09:45:10 np0005486759.ooo.test sudo[271841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcemryvuubtinfxpzkjaxrcszwboehmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435110.4009674-1040-52181072996581/AnsiballZ_podman_container_exec.py
Oct 14 09:45:10 np0005486759.ooo.test sudo[271841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:10 np0005486759.ooo.test python3.9[271843]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 14 09:45:11 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23163 DF PROTO=TCP SPT=40886 DPT=9100 SEQ=471976819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92C3C10000000001030307) 
Oct 14 09:45:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:45:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:45:12 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36683 DF PROTO=TCP SPT=43576 DPT=9882 SEQ=1473861565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92C7580000000001030307) 
Oct 14 09:45:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:45:12 np0005486759.ooo.test systemd[1]: libpod-conmon-60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.scope: Deactivated successfully.
Oct 14 09:45:12 np0005486759.ooo.test systemd[1]: Started libpod-conmon-60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.scope.
Oct 14 09:45:12 np0005486759.ooo.test podman[271844]: 2025-10-14 09:45:12.746508324 +0000 UTC m=+1.802309679 container exec 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 14 09:45:12 np0005486759.ooo.test podman[271844]: 2025-10-14 09:45:12.777312315 +0000 UTC m=+1.833113710 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 09:45:13 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36684 DF PROTO=TCP SPT=43576 DPT=9882 SEQ=1473861565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92CB410000000001030307) 
Oct 14 09:45:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:45:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:45:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:45:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:45:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:14 np0005486759.ooo.test systemd[1]: libpod-conmon-60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.scope: Deactivated successfully.
Oct 14 09:45:14 np0005486759.ooo.test sudo[271841]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:14 np0005486759.ooo.test podman[271874]: 2025-10-14 09:45:14.687400468 +0000 UTC m=+0.343697749 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:45:14 np0005486759.ooo.test podman[271874]: 2025-10-14 09:45:14.694940423 +0000 UTC m=+0.351237684 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 09:45:14 np0005486759.ooo.test podman[271873]: 2025-10-14 09:45:14.733675342 +0000 UTC m=+0.389995084 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:45:14 np0005486759.ooo.test podman[271873]: 2025-10-14 09:45:14.741059243 +0000 UTC m=+0.397378985 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:45:15 np0005486759.ooo.test sudo[272021]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxjeytudffdhybdbnxnbwjwtcyyqzxtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435114.852282-1048-239173258959170/AnsiballZ_file.py
Oct 14 09:45:15 np0005486759.ooo.test sudo[272021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:15 np0005486759.ooo.test python3.9[272023]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:15 np0005486759.ooo.test sudo[272021]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:15.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:45:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:45:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:15 np0005486759.ooo.test sudo[272131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urfazduwmhuhtpguvqrxpaakrrhblcgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435115.584327-1057-22780470859901/AnsiballZ_file.py
Oct 14 09:45:15 np0005486759.ooo.test sudo[272131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:15 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:45:15 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:45:16 np0005486759.ooo.test python3.9[272133]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:16 np0005486759.ooo.test sudo[272131]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:16 np0005486759.ooo.test sudo[272241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqlxsivfxzukyebrmhepcdrgxueezoex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435116.2380505-1065-107132056690165/AnsiballZ_stat.py
Oct 14 09:45:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:45:16 np0005486759.ooo.test sudo[272241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:16 np0005486759.ooo.test podman[272243]: 2025-10-14 09:45:16.57568158 +0000 UTC m=+0.066367463 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:45:16 np0005486759.ooo.test podman[272243]: 2025-10-14 09:45:16.700218727 +0000 UTC m=+0.190904570 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, tcib_managed=true)
Oct 14 09:45:16 np0005486759.ooo.test python3.9[272244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:16 np0005486759.ooo.test sudo[272241]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:17 np0005486759.ooo.test sudo[272354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svrpacmuntulvaajnggtsoowfheshijy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435116.2380505-1065-107132056690165/AnsiballZ_copy.py
Oct 14 09:45:17 np0005486759.ooo.test sudo[272354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:17 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5489 DF PROTO=TCP SPT=36174 DPT=9105 SEQ=3786653329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92DA010000000001030307) 
Oct 14 09:45:17 np0005486759.ooo.test python3.9[272356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760435116.2380505-1065-107132056690165/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:17 np0005486759.ooo.test sudo[272354]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:17 np0005486759.ooo.test sudo[272464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usmokpkqwalegujoewutpmtqzdljcctl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435117.5742772-1081-202411065147930/AnsiballZ_file.py
Oct 14 09:45:17 np0005486759.ooo.test sudo[272464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:17 np0005486759.ooo.test python3.9[272466]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:18 np0005486759.ooo.test sudo[272464]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:45:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7-merged.mount: Deactivated successfully.
Oct 14 09:45:18 np0005486759.ooo.test sudo[272574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tauhqrkymuvapoygtsaabmsbrdpjcxnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435118.1163292-1089-248516224240599/AnsiballZ_stat.py
Oct 14 09:45:18 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:45:18 np0005486759.ooo.test sudo[272574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:18 np0005486759.ooo.test python3.9[272576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:18 np0005486759.ooo.test sudo[272574]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:18 np0005486759.ooo.test sudo[272631]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhlkttlewbidnnpayptezbwaqxvxozxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435118.1163292-1089-248516224240599/AnsiballZ_file.py
Oct 14 09:45:18 np0005486759.ooo.test sudo[272631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:45:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:45:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:45:18 np0005486759.ooo.test podman[272635]: 2025-10-14 09:45:18.9200961 +0000 UTC m=+0.075935561 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:45:18 np0005486759.ooo.test podman[272633]: 2025-10-14 09:45:18.930196535 +0000 UTC m=+0.085704986 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 09:45:18 np0005486759.ooo.test podman[272633]: 2025-10-14 09:45:18.965371193 +0000 UTC m=+0.120879664 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:45:19 np0005486759.ooo.test podman[272635]: 2025-10-14 09:45:19.001500681 +0000 UTC m=+0.157340162 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:45:19 np0005486759.ooo.test podman[272636]: 2025-10-14 09:45:18.970060639 +0000 UTC m=+0.120536944 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 14 09:45:19 np0005486759.ooo.test python3.9[272634]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:19 np0005486759.ooo.test podman[272636]: 2025-10-14 09:45:19.052341898 +0000 UTC m=+0.202818223 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, version=9.6)
Oct 14 09:45:19 np0005486759.ooo.test sudo[272631]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36686 DF PROTO=TCP SPT=43576 DPT=9882 SEQ=1473861565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92E3010000000001030307) 
Oct 14 09:45:19 np0005486759.ooo.test sudo[272793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqyjzdwychxuvhtfpyxnzjljyvcstnbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435119.2271388-1101-143651350875157/AnsiballZ_stat.py
Oct 14 09:45:19 np0005486759.ooo.test sudo[272793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:19 np0005486759.ooo.test python3.9[272795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 09:45:19 np0005486759.ooo.test sudo[272793]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 09:45:19 np0005486759.ooo.test sudo[272850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-novudnermsdqqgdeqeschvobcjyeyvlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435119.2271388-1101-143651350875157/AnsiballZ_file.py
Oct 14 09:45:19 np0005486759.ooo.test sudo[272850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:20 np0005486759.ooo.test python3.9[272852]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.edy0wf8n recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:20 np0005486759.ooo.test sudo[272850]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:20.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:45:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:20.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:20.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:45:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:20.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:45:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:20.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:45:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:20.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:20 np0005486759.ooo.test sudo[272960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veueyvuaukhesdxyjqhhnzqamulsfuqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435120.3076222-1113-233079987047886/AnsiballZ_stat.py
Oct 14 09:45:20 np0005486759.ooo.test sudo[272960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-182f4b56e6e8809f2ffde261aea7a82f597fbc875533d1efd7f59fe7c8a139ed-merged.mount: Deactivated successfully.
Oct 14 09:45:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-2a1f8014015fac1f54a9bc082942615ea4555a297c6729790a93ba597cfe8e5d-merged.mount: Deactivated successfully.
Oct 14 09:45:20 np0005486759.ooo.test python3.9[272962]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:20 np0005486759.ooo.test sudo[272960]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:20 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:45:20 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:45:20 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:45:21 np0005486759.ooo.test sudo[273017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eymvnochfkyievnnwpegcojnbnlxxshv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435120.3076222-1113-233079987047886/AnsiballZ_file.py
Oct 14 09:45:21 np0005486759.ooo.test sudo[273017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:21 np0005486759.ooo.test python3.9[273019]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:21 np0005486759.ooo.test sudo[273017]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:21 np0005486759.ooo.test sudo[273127]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfbghracmedjecxwdilnwgpfvilfhdci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435121.3713005-1126-226484803856058/AnsiballZ_command.py
Oct 14 09:45:21 np0005486759.ooo.test sudo[273127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:21 np0005486759.ooo.test python3.9[273129]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:45:21 np0005486759.ooo.test sudo[273127]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:22 np0005486759.ooo.test sudo[273238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzifqeiygmflvkfybguaoyzsopwvfysy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760435122.0199158-1134-69388631259345/AnsiballZ_edpm_nftables_from_files.py
Oct 14 09:45:22 np0005486759.ooo.test sudo[273238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=438 DF PROTO=TCP SPT=44444 DPT=9102 SEQ=3917219747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92EF810000000001030307) 
Oct 14 09:45:22 np0005486759.ooo.test python3[273240]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 14 09:45:22 np0005486759.ooo.test sudo[273238]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:45:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:45:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:23 np0005486759.ooo.test sudo[273348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzdfpvtzzimdspmrbonetpyvoaaknxwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435122.8462977-1142-215143928043504/AnsiballZ_stat.py
Oct 14 09:45:23 np0005486759.ooo.test sudo[273348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:23 np0005486759.ooo.test python3.9[273350]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:23 np0005486759.ooo.test sudo[273348]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:23 np0005486759.ooo.test sudo[273405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iacyjuitrmadraslgwzelmnuktaiopae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435122.8462977-1142-215143928043504/AnsiballZ_file.py
Oct 14 09:45:23 np0005486759.ooo.test sudo[273405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:45:23 np0005486759.ooo.test python3.9[273407]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:23 np0005486759.ooo.test sudo[273405]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 09:45:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a6039781ecbdbb8b912b1fabb353ef7c9e921a35bfb1c76a3538df9595de5862-merged.mount: Deactivated successfully.
Oct 14 09:45:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a6039781ecbdbb8b912b1fabb353ef7c9e921a35bfb1c76a3538df9595de5862-merged.mount: Deactivated successfully.
Oct 14 09:45:24 np0005486759.ooo.test sudo[273515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuztbslasmiacdyezqoawdadupayfppx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435124.0082011-1154-159168345600601/AnsiballZ_stat.py
Oct 14 09:45:24 np0005486759.ooo.test sudo[273515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:24 np0005486759.ooo.test python3.9[273517]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:24 np0005486759.ooo.test sudo[273515]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:45:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:45:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:45:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:25.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:25.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:25 np0005486759.ooo.test sudo[273572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjxocfysgxvarjiqrllhijrqvxtsggdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435124.0082011-1154-159168345600601/AnsiballZ_file.py
Oct 14 09:45:25 np0005486759.ooo.test sudo[273572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:45:25 np0005486759.ooo.test python3.9[273574]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:25 np0005486759.ooo.test sudo[273572]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:25.785 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:45:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:25.812 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:25.812 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:25.812 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:25.812 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:45:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:45:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:45:26 np0005486759.ooo.test sudo[273682]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beffaaircxadxmckwctihuogtlvdexjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435125.8074539-1166-6965703291587/AnsiballZ_stat.py
Oct 14 09:45:26 np0005486759.ooo.test sudo[273682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:26 np0005486759.ooo.test python3.9[273684]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:26 np0005486759.ooo.test sudo[273682]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:26.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:26.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:26.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:26.499 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:26 np0005486759.ooo.test sudo[273739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzygbnoerhjmdjhqpwkbgysaewfcbaqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435125.8074539-1166-6965703291587/AnsiballZ_file.py
Oct 14 09:45:26 np0005486759.ooo.test sudo[273739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=439 DF PROTO=TCP SPT=44444 DPT=9102 SEQ=3917219747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F92FF420000000001030307) 
Oct 14 09:45:26 np0005486759.ooo.test python3.9[273741]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:26 np0005486759.ooo.test sudo[273739]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:45:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:45:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:45:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:45:27 np0005486759.ooo.test podman[273775]: 2025-10-14 09:45:27.202357747 +0000 UTC m=+0.062288055 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:45:27 np0005486759.ooo.test podman[273775]: 2025-10-14 09:45:27.23708976 +0000 UTC m=+0.097020048 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:45:27 np0005486759.ooo.test podman[273775]: unhealthy
Oct 14 09:45:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:27.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:27.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:45:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:27.499 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:45:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:27.872 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:45:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:27.872 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:45:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:27.873 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:45:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:27.873 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:45:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:45:28 np0005486759.ooo.test sudo[273872]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhmpkjtptyhcmzaplwlhjlqhglvfvzka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435126.8876953-1178-125719418026442/AnsiballZ_stat.py
Oct 14 09:45:28 np0005486759.ooo.test sudo[273872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7-merged.mount: Deactivated successfully.
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.242 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:45:28 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:45:28 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:45:28 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.257 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.258 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:45:28 np0005486759.ooo.test python3.9[273874]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:28 np0005486759.ooo.test sudo[273872]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.527 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.527 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.528 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.528 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:45:28 np0005486759.ooo.test sudo[273929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dptlsibjdsnxcvnycyspkddvyxpvkyjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435126.8876953-1178-125719418026442/AnsiballZ_file.py
Oct 14 09:45:28 np0005486759.ooo.test sudo[273929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.586 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.659 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.661 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.704 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.705 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.758 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.760 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:45:28 np0005486759.ooo.test python3.9[273931]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.812 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:45:28 np0005486759.ooo.test sudo[273929]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.997 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.998 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12855MB free_disk=386.72341537475586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:45:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:28.999 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.000 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.073 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.075 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.075 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.119 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.135 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.137 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:45:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:29.138 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:45:29 np0005486759.ooo.test sudo[274051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thjskhdgkxwmtmyolryrombuvbbivmva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435128.9732118-1190-102927619992750/AnsiballZ_stat.py
Oct 14 09:45:29 np0005486759.ooo.test sudo[274051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 09:45:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:29 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:29 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:29 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:29 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:45:29Z" level=error msg="Getting root fs size for \"7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6\": getting diffsize of layer \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\" and its parent \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\": unmounting layer d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610: replacing mount point \"/var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/merged\": device or resource busy"
Oct 14 09:45:29 np0005486759.ooo.test python3.9[274053]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:29 np0005486759.ooo.test sudo[274051]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 09:45:29 np0005486759.ooo.test sudo[274141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfwqzfcjcvuzyvagohqnxfugvhygtmaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435128.9732118-1190-102927619992750/AnsiballZ_copy.py
Oct 14 09:45:29 np0005486759.ooo.test sudo[274141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:29 np0005486759.ooo.test python3.9[274143]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760435128.9732118-1190-102927619992750/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:29 np0005486759.ooo.test sudo[274141]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:30 np0005486759.ooo.test sudo[274251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgdsbklqiwlxinynmvwsozwbyjtpbepw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435130.1267602-1205-10276845120367/AnsiballZ_file.py
Oct 14 09:45:30 np0005486759.ooo.test sudo[274251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:30.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:30 np0005486759.ooo.test python3.9[274253]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:30 np0005486759.ooo.test sudo[274251]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:45:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d3e85029358dd8d56349cc32f50b1a52c41f0f3e6ed9b12e81f053986ea14f71-merged.mount: Deactivated successfully.
Oct 14 09:45:30 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:30 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:31 np0005486759.ooo.test sudo[274361]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffkcnhpxujvfwqxlxjvfaglcpiqvjaco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435130.812086-1213-99989105825854/AnsiballZ_command.py
Oct 14 09:45:31 np0005486759.ooo.test sudo[274361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:31 np0005486759.ooo.test python3.9[274363]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:45:31 np0005486759.ooo.test sudo[274361]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:31 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:31 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:31 np0005486759.ooo.test sudo[274474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnajuqmlfrkpuwgxdinbrhhrgcmotysd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435131.4507184-1221-37352909594329/AnsiballZ_blockinfile.py
Oct 14 09:45:31 np0005486759.ooo.test sudo[274474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:32 np0005486759.ooo.test python3.9[274476]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/edpm-chains.nft"
                                                         include "/etc/nftables/edpm-rules.nft"
                                                         include "/etc/nftables/edpm-jumps.nft"
                                                          path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:32 np0005486759.ooo.test sudo[274474]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:32 np0005486759.ooo.test sudo[274584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avfkujultrvgnusjvffcnozuistqwigk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435132.3625643-1230-234117191075526/AnsiballZ_command.py
Oct 14 09:45:32 np0005486759.ooo.test sudo[274584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46476cb54c317ede576986c939135db930b5a6eeb4db9b988aa8d7ddee484bf8-merged.mount: Deactivated successfully.
Oct 14 09:45:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a6039781ecbdbb8b912b1fabb353ef7c9e921a35bfb1c76a3538df9595de5862-merged.mount: Deactivated successfully.
Oct 14 09:45:32 np0005486759.ooo.test python3.9[274586]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:45:32 np0005486759.ooo.test sudo[274584]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:45:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-39a1665d335f40f9d6ad1b2f0d2d8e770e46fe1b712ba5efdccec54f84e9ea3f-merged.mount: Deactivated successfully.
Oct 14 09:45:33 np0005486759.ooo.test sudo[274695]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sozhmeuhcvqtzxchzylgzojwpkvkwdmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435133.0090115-1238-213266621021464/AnsiballZ_stat.py
Oct 14 09:45:33 np0005486759.ooo.test sudo[274695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:33 np0005486759.ooo.test python3.9[274697]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:45:33 np0005486759.ooo.test sudo[274695]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:33 np0005486759.ooo.test sudo[274807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqlpgfansecybexvzmrjpjijyeqicosb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435133.735289-1246-8291126807724/AnsiballZ_command.py
Oct 14 09:45:33 np0005486759.ooo.test sudo[274807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:34 np0005486759.ooo.test python3.9[274809]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:45:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:45:34 np0005486759.ooo.test sudo[274807]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:45:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:45:34 np0005486759.ooo.test sudo[274920]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clvifaducvvhetjiwwvnyaumfmimgska ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435134.342645-1254-25662543662700/AnsiballZ_file.py
Oct 14 09:45:34 np0005486759.ooo.test sudo[274920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:34 np0005486759.ooo.test python3.9[274922]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:34 np0005486759.ooo.test sudo[274920]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:35 np0005486759.ooo.test sshd[255797]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: session-39.scope: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: session-39.scope: Consumed 1min 22.757s CPU time.
Oct 14 09:45:35 np0005486759.ooo.test systemd-logind[759]: Session 39 logged out. Waiting for processes to exit.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:45:35 np0005486759.ooo.test systemd-logind[759]: Removed session 39.
Oct 14 09:45:35 np0005486759.ooo.test podman[274940]: 2025-10-14 09:45:35.286898732 +0000 UTC m=+0.081120493 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 09:45:35 np0005486759.ooo.test podman[274940]: 2025-10-14 09:45:35.328389897 +0000 UTC m=+0.122611658 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute)
Oct 14 09:45:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:35.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:45:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:35.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:45:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:35.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:45:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:35.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:45:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:35.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:35.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:45:35 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:35 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:36 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:45:36Z" level=error msg="Getting root fs size for \"7caf389deee67e572f6ee38450f66737455584b27c252100268d48b58aa62bf6\": getting diffsize of layer \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\" and its parent \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\": unmounting layer d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610: replacing mount point \"/var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/merged\": device or resource busy"
Oct 14 09:45:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:45:38 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d3e85029358dd8d56349cc32f50b1a52c41f0f3e6ed9b12e81f053986ea14f71-merged.mount: Deactivated successfully.
Oct 14 09:45:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:45:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:40 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:40.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:45:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:40.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:40.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:45:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:40.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:45:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:40.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:45:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:40.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:40 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:45:40 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-39a1665d335f40f9d6ad1b2f0d2d8e770e46fe1b712ba5efdccec54f84e9ea3f-merged.mount: Deactivated successfully.
Oct 14 09:45:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb-merged.mount: Deactivated successfully.
Oct 14 09:45:42 np0005486759.ooo.test sshd[274958]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:45:42 np0005486759.ooo.test sshd[274958]: Accepted publickey for zuul from 192.168.122.31 port 54544 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:45:42 np0005486759.ooo.test systemd-logind[759]: New session 40 of user zuul.
Oct 14 09:45:42 np0005486759.ooo.test systemd[1]: Started Session 40 of User zuul.
Oct 14 09:45:42 np0005486759.ooo.test sshd[274958]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:45:43 np0005486759.ooo.test sudo[275069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skrvzvnnubdmwiioyxpgionujzkiqoza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435142.8475802-22-127861987438312/AnsiballZ_file.py
Oct 14 09:45:43 np0005486759.ooo.test sudo[275069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:45:43 np0005486759.ooo.test python3.9[275071]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:43 np0005486759.ooo.test sudo[275069]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully.
Oct 14 09:45:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully.
Oct 14 09:45:43 np0005486759.ooo.test sudo[275179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shcnopnsgybfkgmdoestgnctmbqrhzes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435143.662885-22-184898543524703/AnsiballZ_file.py
Oct 14 09:45:43 np0005486759.ooo.test sudo[275179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:45:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:45:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:45:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:45:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:45:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:45:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:45:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:45:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:45:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:45:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:45:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:45:44 np0005486759.ooo.test python3.9[275181]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:44 np0005486759.ooo.test sudo[275179]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:44 np0005486759.ooo.test sudo[275294]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuuaijbdwahzgribxqziesnwmtlhzfsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435144.2309997-22-148514829981983/AnsiballZ_file.py
Oct 14 09:45:44 np0005486759.ooo.test sudo[275294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:45:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:45:44 np0005486759.ooo.test python3.9[275296]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:44 np0005486759.ooo.test sudo[275294]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:45:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:45:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:45.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:45 np0005486759.ooo.test python3.9[275404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:46 np0005486759.ooo.test python3.9[275490]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435144.859661-48-201284431914378/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully.
Oct 14 09:45:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:45:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:45:46 np0005486759.ooo.test podman[275523]: 2025-10-14 09:45:46.425114816 +0000 UTC m=+0.069240922 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:45:46 np0005486759.ooo.test podman[275521]: 2025-10-14 09:45:46.465783555 +0000 UTC m=+0.109080555 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:45:46 np0005486759.ooo.test podman[275523]: 2025-10-14 09:45:46.490362903 +0000 UTC m=+0.134489029 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:45:46 np0005486759.ooo.test podman[275521]: 2025-10-14 09:45:46.498181427 +0000 UTC m=+0.141478417 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:45:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Oct 14 09:45:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Oct 14 09:45:46 np0005486759.ooo.test python3.9[275640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:47 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:45:47 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:45:47 np0005486759.ooo.test python3.9[275726]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435146.3619256-63-210622468909271/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:47 np0005486759.ooo.test python3.9[275834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:45:48 np0005486759.ooo.test python3.9[275920]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435147.451559-63-66445543670639/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:45:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:45:48 np0005486759.ooo.test podman[275960]: 2025-10-14 09:45:48.648998944 +0000 UTC m=+0.082833127 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Oct 14 09:45:48 np0005486759.ooo.test podman[275960]: 2025-10-14 09:45:48.678544226 +0000 UTC m=+0.112378399 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:45:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:48 np0005486759.ooo.test python3.9[276053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:45:49 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:49 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:45:49 np0005486759.ooo.test python3.9[276139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435148.519893-63-189852439629108/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=248f5462dee325deb15bea2155b02608eaf726ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:49 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:49 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:49 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:49 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:45:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26743 DF PROTO=TCP SPT=59748 DPT=9102 SEQ=310825694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9358B80000000001030307) 
Oct 14 09:45:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:50.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26744 DF PROTO=TCP SPT=59748 DPT=9102 SEQ=310825694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F935CC10000000001030307) 
Oct 14 09:45:50 np0005486759.ooo.test python3.9[276247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:45:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:45:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:45:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:45:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f7b5852a5167f2f2df0b0a26053100da875d5bbadcb30af9335e17d51a040dc4-merged.mount: Deactivated successfully.
Oct 14 09:45:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f7b5852a5167f2f2df0b0a26053100da875d5bbadcb30af9335e17d51a040dc4-merged.mount: Deactivated successfully.
Oct 14 09:45:50 np0005486759.ooo.test podman[276308]: 2025-10-14 09:45:50.950185594 +0000 UTC m=+0.062733099 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Oct 14 09:45:50 np0005486759.ooo.test podman[276308]: 2025-10-14 09:45:50.962225761 +0000 UTC m=+0.074773306 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, name=ubi9-minimal, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 14 09:45:51 np0005486759.ooo.test python3.9[276364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435150.1759923-121-205257406226153/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=95b1899b47d8e4bacfcacfb8a591296c275f90be backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb-merged.mount: Deactivated successfully.
Oct 14 09:45:51 np0005486759.ooo.test python3.9[276479]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:45:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fcdcb3112f63ebf2e44f7c3ef5008c08b3f042acc83b4c694afd4ebe8b8aadbb-merged.mount: Deactivated successfully.
Oct 14 09:45:52 np0005486759.ooo.test sudo[276589]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgasgajquavedelahlitwawznporzdyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435151.9443498-145-133055632236764/AnsiballZ_file.py
Oct 14 09:45:52 np0005486759.ooo.test sudo[276589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:52 np0005486759.ooo.test python3.9[276591]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:52 np0005486759.ooo.test sudo[276589]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26745 DF PROTO=TCP SPT=59748 DPT=9102 SEQ=310825694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9364C10000000001030307) 
Oct 14 09:45:52 np0005486759.ooo.test sudo[276699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bazoqmuqopggqgyaxvduzrfuymzrhhyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435152.6182945-153-171938148797142/AnsiballZ_stat.py
Oct 14 09:45:52 np0005486759.ooo.test sudo[276699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:53 np0005486759.ooo.test python3.9[276701]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:53 np0005486759.ooo.test sudo[276699]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:45:53 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:45:53 np0005486759.ooo.test podman[276315]: 2025-10-14 09:45:53.335852942 +0000 UTC m=+2.445403783 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:45:53 np0005486759.ooo.test podman[276315]: 2025-10-14 09:45:53.364150005 +0000 UTC m=+2.473700866 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 09:45:53 np0005486759.ooo.test sudo[276763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvttaouufemlywthjwlkissivzzjtjnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435152.6182945-153-171938148797142/AnsiballZ_file.py
Oct 14 09:45:53 np0005486759.ooo.test sudo[276763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:53 np0005486759.ooo.test python3.9[276765]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:53 np0005486759.ooo.test sudo[276763]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:54 np0005486759.ooo.test sudo[276873]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhcovrsfbsmtlcmkjjlpyswronmtpydi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435153.7861452-153-39790435768615/AnsiballZ_stat.py
Oct 14 09:45:54 np0005486759.ooo.test sudo[276873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:45:54.147 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:45:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:45:54.147 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:45:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:45:54.149 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:45:54 np0005486759.ooo.test python3.9[276875]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:54 np0005486759.ooo.test sudo[276873]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:45:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully.
Oct 14 09:45:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully.
Oct 14 09:45:54 np0005486759.ooo.test sudo[276930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkgnushtblegcazejlyixtqizsxkdjno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435153.7861452-153-39790435768615/AnsiballZ_file.py
Oct 14 09:45:54 np0005486759.ooo.test sudo[276930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:54 np0005486759.ooo.test python3.9[276932]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:45:54 np0005486759.ooo.test sudo[276930]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:55 np0005486759.ooo.test sudo[277040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeaqndsegbbxmliqqomgpyevbhmhtfkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435154.878897-176-230555909344398/AnsiballZ_file.py
Oct 14 09:45:55 np0005486759.ooo.test sudo[277040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:55 np0005486759.ooo.test python3.9[277042]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:55 np0005486759.ooo.test sudo[277040]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:45:55 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:45:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:45:55.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:45:55 np0005486759.ooo.test sudo[277150]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyeziftmqbyfpmuuukgbkfqvtiqevslp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435155.4574265-184-49884105449192/AnsiballZ_stat.py
Oct 14 09:45:55 np0005486759.ooo.test sudo[277150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:55 np0005486759.ooo.test python3.9[277152]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:55 np0005486759.ooo.test sudo[277150]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:45:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:45:56 np0005486759.ooo.test sudo[277207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdzpqynacoosqgoatyobtyahtkzjdjdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435155.4574265-184-49884105449192/AnsiballZ_file.py
Oct 14 09:45:56 np0005486759.ooo.test sudo[277207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:45:56 np0005486759.ooo.test python3.9[277209]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:56 np0005486759.ooo.test sudo[277207]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26746 DF PROTO=TCP SPT=59748 DPT=9102 SEQ=310825694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9374810000000001030307) 
Oct 14 09:45:56 np0005486759.ooo.test sudo[277317]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkkejvfbiiiywpfzuhplvnlzopewafcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435156.538459-196-13481046887906/AnsiballZ_stat.py
Oct 14 09:45:56 np0005486759.ooo.test sudo[277317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:45:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:45:57 np0005486759.ooo.test python3.9[277319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:57 np0005486759.ooo.test sudo[277317]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:45:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:45:57 np0005486759.ooo.test sudo[277374]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijelowxweuldumkisobdecgbbfybzgza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435156.538459-196-13481046887906/AnsiballZ_file.py
Oct 14 09:45:57 np0005486759.ooo.test sudo[277374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:57 np0005486759.ooo.test python3.9[277376]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:45:57 np0005486759.ooo.test sudo[277374]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:57 np0005486759.ooo.test podman[276312]: 2025-10-14 09:45:57.949710613 +0000 UTC m=+7.060174472 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Oct 14 09:45:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997-merged.mount: Deactivated successfully.
Oct 14 09:45:57 np0005486759.ooo.test podman[276312]: 2025-10-14 09:45:57.983012653 +0000 UTC m=+7.093476442 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 09:45:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1e604deea57dbda554a168861cff1238f93b8c6c69c863c43aed37d9d99c5fed-merged.mount: Deactivated successfully.
Oct 14 09:45:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Oct 14 09:45:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Oct 14 09:45:58 np0005486759.ooo.test sudo[277490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djcfqzpcpgmtdkubjsohewatjiaxuobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435157.6181972-208-187055751325212/AnsiballZ_systemd.py
Oct 14 09:45:58 np0005486759.ooo.test sudo[277490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:45:58 np0005486759.ooo.test python3.9[277492]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:45:58 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:45:58 np0005486759.ooo.test systemd-sysv-generator[277528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:45:58 np0005486759.ooo.test systemd-rc-local-generator[277525]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:45:58 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:45:58 np0005486759.ooo.test sudo[277490]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:59 np0005486759.ooo.test sudo[277649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwbmaqmafqohvvgtyzctnicnmjysndve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435158.9596522-216-195350907704536/AnsiballZ_stat.py
Oct 14 09:45:59 np0005486759.ooo.test sudo[277649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:45:59 np0005486759.ooo.test python3.9[277651]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:45:59 np0005486759.ooo.test sudo[277649]: pam_unix(sudo:session): session closed for user root
Oct 14 09:45:59 np0005486759.ooo.test sshd[277654]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:45:59 np0005486759.ooo.test sshd[277654]: error: kex_exchange_identification: Connection closed by remote host
Oct 14 09:45:59 np0005486759.ooo.test sshd[277654]: Connection closed by 113.53.49.145 port 56096
Oct 14 09:45:59 np0005486759.ooo.test sshd[277655]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:45:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-cfd4f5857600cb26cb7d3ab11c417725f8adfe1c222d34600a4b303096399cf3-merged.mount: Deactivated successfully.
Oct 14 09:46:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-cfd4f5857600cb26cb7d3ab11c417725f8adfe1c222d34600a4b303096399cf3-merged.mount: Deactivated successfully.
Oct 14 09:46:00 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:46:00 np0005486759.ooo.test podman[277493]: 2025-10-14 09:46:00.287733036 +0000 UTC m=+1.911444006 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:46:00 np0005486759.ooo.test podman[277493]: 2025-10-14 09:46:00.295806283 +0000 UTC m=+1.919517273 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:46:00 np0005486759.ooo.test sudo[277709]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dekatxnxaqkvlnnmjveghqzkxxiqmkwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435158.9596522-216-195350907704536/AnsiballZ_file.py
Oct 14 09:46:00 np0005486759.ooo.test podman[277493]: unhealthy
Oct 14 09:46:00 np0005486759.ooo.test sudo[277709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:00.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:00 np0005486759.ooo.test python3.9[277721]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:46:00 np0005486759.ooo.test sudo[277709]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:00 np0005486759.ooo.test sudo[277829]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kybmxsbuyuhlmuznmlmzibqcesvcctby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435160.696428-228-78290200996395/AnsiballZ_stat.py
Oct 14 09:46:00 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1e604deea57dbda554a168861cff1238f93b8c6c69c863c43aed37d9d99c5fed-merged.mount: Deactivated successfully.
Oct 14 09:46:00 np0005486759.ooo.test sudo[277829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:01 np0005486759.ooo.test sshd[277655]: Invalid user a from 113.53.49.145 port 57880
Oct 14 09:46:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:46:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:46:01 np0005486759.ooo.test python3.9[277831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:46:01 np0005486759.ooo.test sudo[277829]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:01 np0005486759.ooo.test sshd[277655]: pam_unix(sshd:auth): check pass; user unknown
Oct 14 09:46:01 np0005486759.ooo.test sshd[277655]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.53.49.145
Oct 14 09:46:01 np0005486759.ooo.test sudo[277886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivxbsoebminevlganuuwqdjpqpoyaetw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435160.696428-228-78290200996395/AnsiballZ_file.py
Oct 14 09:46:01 np0005486759.ooo.test sudo[277886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:01 np0005486759.ooo.test python3.9[277888]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:46:01 np0005486759.ooo.test sudo[277886]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:02 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:46:02 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:46:02 np0005486759.ooo.test sudo[277996]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlwaseooqqztikvuqqlmrrputpcbsdqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435162.6398757-240-269901729870186/AnsiballZ_systemd.py
Oct 14 09:46:02 np0005486759.ooo.test sudo[277996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:03 np0005486759.ooo.test python3.9[277998]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:46:03 np0005486759.ooo.test systemd-sysv-generator[278026]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:46:03 np0005486759.ooo.test systemd-rc-local-generator[278023]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:46:03 np0005486759.ooo.test sudo[277996]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:03 np0005486759.ooo.test sshd[277655]: Failed password for invalid user a from 113.53.49.145 port 57880 ssh2
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5e0d5b365d1d4f2cbdec218bcecccb17a52487dea7c1e0a1ce7e4461f7c3a058-merged.mount: Deactivated successfully.
Oct 14 09:46:04 np0005486759.ooo.test sudo[278148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibkdivteusukkhumhrrscwnohvggluuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435163.984078-250-68202751940148/AnsiballZ_file.py
Oct 14 09:46:04 np0005486759.ooo.test sudo[278148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:04 np0005486759.ooo.test python3.9[278150]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:46:04 np0005486759.ooo.test sudo[278148]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:04 np0005486759.ooo.test sshd[278168]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:46:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:05 np0005486759.ooo.test sudo[278260]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkphdzishkhhivoxrarlggomoqlfergn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435164.742432-258-15020204078011/AnsiballZ_stat.py
Oct 14 09:46:05 np0005486759.ooo.test sudo[278260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:05 np0005486759.ooo.test python3.9[278262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:46:05 np0005486759.ooo.test sudo[278260]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:05.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5c6de20ee9f73151254b053a0024fcbdd9b55691492d339c494637f80bb81826-merged.mount: Deactivated successfully.
Oct 14 09:46:05 np0005486759.ooo.test sudo[278348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lartvnbqgmyyruhdjbdehaicunyizkta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435164.742432-258-15020204078011/AnsiballZ_copy.py
Oct 14 09:46:05 np0005486759.ooo.test sudo[278348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f7b5852a5167f2f2df0b0a26053100da875d5bbadcb30af9335e17d51a040dc4-merged.mount: Deactivated successfully.
Oct 14 09:46:05 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f7b5852a5167f2f2df0b0a26053100da875d5bbadcb30af9335e17d51a040dc4-merged.mount: Deactivated successfully.
Oct 14 09:46:05 np0005486759.ooo.test python3.9[278350]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760435164.742432-258-15020204078011/.source.json _original_basename=.01bwcem5 follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:46:05 np0005486759.ooo.test sudo[278348]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:46:06 np0005486759.ooo.test podman[278422]: 2025-10-14 09:46:06.224548368 +0000 UTC m=+0.075544046 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 09:46:06 np0005486759.ooo.test sudo[278477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edrgiqsmbeaghowvojpklsozembpmguf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435165.9480915-273-161841693737860/AnsiballZ_file.py
Oct 14 09:46:06 np0005486759.ooo.test sudo[278477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:06 np0005486759.ooo.test podman[278422]: 2025-10-14 09:46:06.260648487 +0000 UTC m=+0.111644175 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 09:46:06 np0005486759.ooo.test python3.9[278479]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:46:06 np0005486759.ooo.test sudo[278477]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:06 np0005486759.ooo.test sshd[278168]: Invalid user nil from 113.53.49.145 port 50672
Oct 14 09:46:06 np0005486759.ooo.test sudo[278587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igbpgsauhayerkeyqtbwqnqbrjrpajlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435166.6494253-281-247805345180674/AnsiballZ_stat.py
Oct 14 09:46:06 np0005486759.ooo.test sudo[278587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:06 np0005486759.ooo.test sshd[278168]: Failed none for invalid user nil from 113.53.49.145 port 50672 ssh2
Oct 14 09:46:07 np0005486759.ooo.test sudo[278587]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:07 np0005486759.ooo.test sshd[278168]: Connection closed by invalid user nil 113.53.49.145 port 50672 [preauth]
Oct 14 09:46:07 np0005486759.ooo.test sudo[278675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aenolwxtaabdvfjupyuovxqyofqocquu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435166.6494253-281-247805345180674/AnsiballZ_copy.py
Oct 14 09:46:07 np0005486759.ooo.test sudo[278675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:07 np0005486759.ooo.test sudo[278675]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:07 np0005486759.ooo.test sshd[278695]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:46:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:08 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:46:08 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:08 np0005486759.ooo.test sudo[278787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkbompiftcnrfxmezdossrothicgrvjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435167.8989658-298-254721253494346/AnsiballZ_container_config_data.py
Oct 14 09:46:08 np0005486759.ooo.test sudo[278787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:08 np0005486759.ooo.test python3.9[278789]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Oct 14 09:46:08 np0005486759.ooo.test sudo[278787]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:09 np0005486759.ooo.test sshd[278695]: Invalid user admin from 113.53.49.145 port 34862
Oct 14 09:46:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:09 np0005486759.ooo.test sudo[278897]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hljkgedbxhmqlrpjhiiotfnmnmrmvlgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435168.85231-307-136388167568008/AnsiballZ_container_config_hash.py
Oct 14 09:46:09 np0005486759.ooo.test sudo[278897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:09 np0005486759.ooo.test sshd[278695]: pam_unix(sshd:auth): check pass; user unknown
Oct 14 09:46:09 np0005486759.ooo.test sshd[278695]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.53.49.145
Oct 14 09:46:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:09 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:09 np0005486759.ooo.test python3.9[278899]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:46:09 np0005486759.ooo.test sudo[278897]: pam_unix(sudo:session): session closed for user root
Oct 14 09:46:10 np0005486759.ooo.test sudo[279007]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfunpiniahoozmccqbcnklypfegkklcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435169.6956203-316-1589800654866/AnsiballZ_podman_container_info.py
Oct 14 09:46:10 np0005486759.ooo.test sudo[279007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:46:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:10 np0005486759.ooo.test python3.9[279009]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:46:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:10.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:11 np0005486759.ooo.test sshd[278695]: Failed password for invalid user admin from 113.53.49.145 port 34862 ssh2
Oct 14 09:46:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-28829b7b07eca0acf5593dbdc913962aa4bb6cc59399d9fd0278f6a01f103c03-merged.mount: Deactivated successfully.
Oct 14 09:46:12 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:12 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:46:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:46:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:46:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:46:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:46:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:46:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6160dcf8522bce4a4dbb2d001d64131000991ae258c303cf32d4eb798c35afe5-merged.mount: Deactivated successfully.
Oct 14 09:46:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-cfd4f5857600cb26cb7d3ab11c417725f8adfe1c222d34600a4b303096399cf3-merged.mount: Deactivated successfully.
Oct 14 09:46:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:46:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully.
Oct 14 09:46:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:15.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully.
Oct 14 09:46:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:46:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:46:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:17 np0005486759.ooo.test podman[279022]: 2025-10-14 09:46:17.414027382 +0000 UTC m=+0.082797867 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct 14 09:46:17 np0005486759.ooo.test podman[279022]: 2025-10-14 09:46:17.448609223 +0000 UTC m=+0.117379698 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:46:17 np0005486759.ooo.test podman[279021]: 2025-10-14 09:46:17.45481198 +0000 UTC m=+0.126192538 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:46:17 np0005486759.ooo.test podman[279021]: 2025-10-14 09:46:17.537532184 +0000 UTC m=+0.208912742 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:46:17 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:46:17 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:46:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully.
Oct 14 09:46:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:46:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:46:19 np0005486759.ooo.test podman[279063]: 2025-10-14 09:46:19.345903948 +0000 UTC m=+0.074595205 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 14 09:46:19 np0005486759.ooo.test podman[279063]: 2025-10-14 09:46:19.37364189 +0000 UTC m=+0.102333087 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 09:46:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:46:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16774 DF PROTO=TCP SPT=50100 DPT=9102 SEQ=3188327378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F93CDE70000000001030307) 
Oct 14 09:46:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:20 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:46:20 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:20 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:20 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:46:20Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/merged: invalid argument"
Oct 14 09:46:20 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:46:20Z" level=error msg="Getting root fs size for \"a914f6ae47800bdf201741ae383e5c957111c119cff239bea1342a1bbbac16e2\": getting diffsize of layer \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\" and its parent \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\": creating overlay mount to /var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/R6BH2E3I6E3FR56LIDSQX5YQLH:/var/lib/containers/storage/overlay/l/TOVAZRZK2YBJ5TE4OXQVVN3ALH,upperdir=/var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/diff,workdir=/var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/work,nodev,metacopy=on\": no such file or directory"
Oct 14 09:46:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:20.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16775 DF PROTO=TCP SPT=50100 DPT=9102 SEQ=3188327378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F93D2020000000001030307) 
Oct 14 09:46:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16776 DF PROTO=TCP SPT=50100 DPT=9102 SEQ=3188327378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F93DA010000000001030307) 
Oct 14 09:46:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:46:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-28829b7b07eca0acf5593dbdc913962aa4bb6cc59399d9fd0278f6a01f103c03-merged.mount: Deactivated successfully.
Oct 14 09:46:23 np0005486759.ooo.test podman[279088]: 2025-10-14 09:46:23.427265706 +0000 UTC m=+0.063160801 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Oct 14 09:46:23 np0005486759.ooo.test podman[279088]: 2025-10-14 09:46:23.439285309 +0000 UTC m=+0.075180394 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal)
Oct 14 09:46:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:23.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:23.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 14 09:46:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:23.534 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 14 09:46:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:23.535 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:23.536 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 14 09:46:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:46:23 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:23.555 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:24 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:46:24 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:24 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:46:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully.
Oct 14 09:46:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fd45300f46d8608b2dbe48ea53730ffee2e8be48f3fb0ebbc0a31a116495a92c-merged.mount: Deactivated successfully.
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.444 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.445 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.462 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 52.17578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3c8f0e0-39f3-486a-8f1f-4bdd9035aad6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.17578125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:46:24.445696', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a5b473be-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.657558354, 'message_signature': '3f02b7521f9e1b25d09dcb8b9fe1a9f17c025bb4ba923b79a9c894e0f01eca67'}]}, 'timestamp': '2025-10-14 09:46:24.462891', '_unique_id': '0c2729fefe024e1e8ef52f848ff93aa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.463 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.464 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.465 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6f240fd-4be9-41df-bd0d-8f6d61ecea4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.464512', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5b4f708-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': 'c590503cb4be9402e02dd4e281c23ee0c66945be21066b167edad2c94b17cc20'}]}, 'timestamp': '2025-10-14 09:46:24.466217', '_unique_id': '98db5aa2ce56417bb5dfa54cacf1de97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.466 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.467 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.480 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31129600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.480 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f21568d6-3e8f-4eed-b162-a569b7708cfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31129600, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.467253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5b72208-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.662301566, 'message_signature': '34c1d1605db641545ca3005fa34e83c2ef44d531f7a3d54c79653b50ce87f926'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.467253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5b72ea6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.662301566, 'message_signature': '7d2ad0d4490862168231247b04301d5153e6adcf84f692ac72a28cb33178e54e'}]}, 'timestamp': '2025-10-14 09:46:24.480782', '_unique_id': '69bdc7447ccc4066b0a844431256c610'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.481 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.482 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.482 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.482 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 49140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de9574f2-cfd9-41f3-8d45-d0eb551aba85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49140000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:46:24.482601', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a5b781c6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.657558354, 'message_signature': '5e104a9f1f763328c6ac9df10d31097c51880ab339a10d869eb676d764afe87e'}]}, 'timestamp': '2025-10-14 09:46:24.482906', '_unique_id': '5b0f2a50be934376920b4f31b737bfa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.483 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.484 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.508 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.508 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0166236-2312-43f2-9caf-1fa29ee09e9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.484335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5bb7132-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': '10b1446de45f942b9be2236414746e62d8b80ffb3d4ae062b115f6f2042635ba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.484335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5bb7e34-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': '0ffb84f0f43b747d5f133690671a7ae6fcc1f767f178cb25308c399c8b3d9f12'}]}, 'timestamp': '2025-10-14 09:46:24.509056', '_unique_id': '106d42a632eb48f19c8137cb8f70d776'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.509 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.510 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04ecf3fc-a26b-43ed-84af-11741abab42d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.510875', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bbd3ac-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': '2a38bb5126bd394bea96cb07317926f20e5293995e2e84ebbe95cff054e15195'}]}, 'timestamp': '2025-10-14 09:46:24.511232', '_unique_id': 'f236a51a36ae45df834286191fb9ead2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.512 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29b14d37-8e0b-45ef-a816-79263d8a3f1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.512625', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bc1682-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': '8abb15c29189b0c390708bf7549abe4dec64d041f1e4ac741104d412c31d6097'}]}, 'timestamp': '2025-10-14 09:46:24.512942', '_unique_id': '890722022e5745f9a0291c59512c0837'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.514 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.514 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.514 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aac7d7fa-f800-4e7c-bace-9bcd203bb642', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.514390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5bc5b38-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': '054935a247d156ce7436b6ab324162285df2f4571bc447bf678c7ae7cbf4dd19'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.514390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5bc661e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': 'd59dbb8300379378204ca104a8850d5d1b1b08c8b69eb8c79212fa693010a850'}]}, 'timestamp': '2025-10-14 09:46:24.514980', '_unique_id': 'c58daefe719e4cb8a4e92058f68ce9fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.515 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.516 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.516 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '679f2d8f-1064-4065-a3a8-866c7e4da4a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.516407', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bcaa20-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': 'f29bf46e8d723e07cf011d9249dac6d606d8cbf380bd55432110453d156d83ea'}]}, 'timestamp': '2025-10-14 09:46:24.516716', '_unique_id': '8f5e62260c164c66acba478bbd807926'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.517 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.518 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.518 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76737469-9313-4db6-ba59-19854c22f079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.518164', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bceeb8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': '419bf47d6773c96479ff3bed822ccab4df552e6a8543a102a9c176947750a839'}]}, 'timestamp': '2025-10-14 09:46:24.518473', '_unique_id': '634dd6bdfac044b9beb446154d90ff65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.519 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 513177663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.520 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 75228955 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ad33500-b4c1-400f-8709-e7ce16319544', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513177663, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.519857', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5bd31b6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': '79e6113bc58a9702a9a5dfa6f44193372cd5865e037b550e2cfa603ce78a1277'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75228955, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.519857', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5bd3d0a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': '96585ed09cc79a819eaec3be0b49f2a2c4598f5e9826271e5c9a251b58ed8ade'}]}, 'timestamp': '2025-10-14 09:46:24.520468', '_unique_id': 'bbc70350e28b4a99bba959e935a56c89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.521 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.522 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b706095-9791-4738-9299-aa0c905d3105', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 591, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.521915', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5bd82d8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': 'aeffa72c967b60737a7f6d897c388f0f8eb68251d7d2fe2a826876300a67d234'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.521915', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5bd8e0e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': 'd10a8f8f38f0733ed741a768a13955f97ca9e6a62e6af423d463e22185722b21'}]}, 'timestamp': '2025-10-14 09:46:24.522540', '_unique_id': '0548627bbe604641b7720ae9033355a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.523 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.524 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0abd5d86-558f-496e-9e5b-c42b55fa32f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.524099', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bdd6d4-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': '4e0b8a9e0c23d4b81b31f81dfbc5fc25e80a14b697b19a077c03da6dfea7bd5f'}]}, 'timestamp': '2025-10-14 09:46:24.524418', '_unique_id': 'bd2e561180aa4b93a7424af40e2dc652'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.525 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.526 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0233cc65-4060-4d92-a259-4bd7d613146d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.525920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5be1f86-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.662301566, 'message_signature': 'f43cb0a22fe7087f39a00a84e3723d38b2d671b686b1a2db21e54028af230cb4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.525920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5be2aa8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.662301566, 'message_signature': 'd2f687ee02c36d52a76c112d486bb6808fe66b928924fda4e08cf231e6bb2679'}]}, 'timestamp': '2025-10-14 09:46:24.526551', '_unique_id': '8d3595190bce4036ac0e3400d9abaf37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.528 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.528 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9cf3834-bb18-45e8-a615-3e89ffba14bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.527986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5be6e6e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': 'b73c3285fad5f6a7411993d8e97bf48f49b629ebf99fd1d76e787b75d9740ea4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.527986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5be794a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': '48c217ea973200dd4d1b21c746fde4fc0b758b03081981c4b12de8740b0811d7'}]}, 'timestamp': '2025-10-14 09:46:24.528562', '_unique_id': '2a705f7f9f8446a4996ba19f9f7b85f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.530 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.530 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28f9af3f-62af-4051-906d-94511d3516bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.530038', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5bebd1a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.662301566, 'message_signature': '858fbf5f9f1bb857c92b7860521b93d6f033dd776fff57375412c7530f98d9f1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.530038', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5bec6d4-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.662301566, 'message_signature': '3c0a1725994e182fed266dffe57a9496122f69d5cd646220a17f6073cdf6de3e'}]}, 'timestamp': '2025-10-14 09:46:24.530536', '_unique_id': '2f68f3224008470184c7d3c0e4cc929e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dafeadfe-724a-4426-91fe-e54f27b73a16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.531985', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bf0ad6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': '5bf7e5647274ed97e2b87f430b57d0a081326eb192519bfdebbc52916725906f'}]}, 'timestamp': '2025-10-14 09:46:24.532306', '_unique_id': '61d77297d5d248d9b1de244ed3090bc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.533 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44811f7-ac53-4396-99af-b8d23717912b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.533693', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bf4d3e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': 'b9c7f0f795e7c216638b95eedc0c9eb95c6361b5b60f31ae0e7bbaffbe087371'}]}, 'timestamp': '2025-10-14 09:46:24.534048', '_unique_id': '0b4da6b0a8344617bc8dd429108e427f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.535 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.535 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 9773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3e84681-7ed7-4b60-93ba-65e4a25c9260', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9773, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.535684', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5bf9b4a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': '353906f625e3d52a90c55c1ea1da04f92489e1bd9f0e35eec5b8734e6230569d'}]}, 'timestamp': '2025-10-14 09:46:24.536018', '_unique_id': 'fd1b0c3e84044a63acc8e60fef828060'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.537 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 1288814026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.537 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 10812347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '698f09cd-a67c-4f1e-8a59-b5824ddf801c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1288814026, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:46:24.537401', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5bfde16-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': 'c022e7cae21640d451eaf9d62be0b30e6c6e92f06947c82ecb4de14edf25db6f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10812347, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:46:24.537401', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5bfe924-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.67940942, 'message_signature': '6815973964c2e76514d524674b3d9f592aa343115bd6e24c93760f4415be0dc0'}]}, 'timestamp': '2025-10-14 09:46:24.538015', '_unique_id': 'be20db0d1c3e4cac8d0775b3955d1a33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.539 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25d81996-af13-4281-8901-bba2f941058a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:46:24.539420', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'a5c02ce0-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11204.659561208, 'message_signature': '7a0dfb16bfdf4a1be5f7542e126191cf8644559187a5836fffc167fc257828fa'}]}, 'timestamp': '2025-10-14 09:46:24.539720', '_unique_id': '6be5956f133e4f40be14f032caac65d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:46:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:46:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:46:24 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:24.571 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:46:25 np0005486759.ooo.test podman[279106]: 2025-10-14 09:46:25.458823255 +0000 UTC m=+0.042917897 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 09:46:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:25.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:25 np0005486759.ooo.test podman[279106]: 2025-10-14 09:46:25.473843653 +0000 UTC m=+0.057938275 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 09:46:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:25.496 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:25.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:25.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:46:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8035b846284d335d9393ab62c801f2456eb25851b24c50a7b13196117676086c-merged.mount: Deactivated successfully.
Oct 14 09:46:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6160dcf8522bce4a4dbb2d001d64131000991ae258c303cf32d4eb798c35afe5-merged.mount: Deactivated successfully.
Oct 14 09:46:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16777 DF PROTO=TCP SPT=50100 DPT=9102 SEQ=3188327378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F93E9C10000000001030307) 
Oct 14 09:46:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:26 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:46:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully.
Oct 14 09:46:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully.
Oct 14 09:46:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:46:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:46:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.493 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:46:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:28 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815-merged.mount: Deactivated successfully.
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.878 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.878 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.878 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:46:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:28.878 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:46:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:29.277 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:46:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:29.299 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:46:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:29.300 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:46:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:29.301 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:29.302 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:29.302 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:30 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:30 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:30 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:46:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully.
Oct 14 09:46:30 np0005486759.ooo.test podman[279126]: 2025-10-14 09:46:30.450539791 +0000 UTC m=+0.078187009 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251009)
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:30 np0005486759.ooo.test podman[279126]: 2025-10-14 09:46:30.483908274 +0000 UTC m=+0.111555472 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.523 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.523 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.523 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.523 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.598 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:46:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.679 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.680 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.734 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.735 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.776 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.776 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.816 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.963 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.964 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12734MB free_disk=386.72294998168945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.964 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:46:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:30.965 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.306 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.307 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.307 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:46:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.534 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Refreshing inventories for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 14 09:46:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:46:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:46:31 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:46:31 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:31 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.788 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Updating ProviderTree inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.789 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.802 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Refreshing aggregate associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.833 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Refreshing trait associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, traits: HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.875 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.895 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.896 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:46:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:31.896 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:46:32 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:32 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:46:32 np0005486759.ooo.test podman[279157]: 2025-10-14 09:46:32.939663236 +0000 UTC m=+0.066367604 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:46:32 np0005486759.ooo.test podman[279157]: 2025-10-14 09:46:32.943394854 +0000 UTC m=+0.070099202 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:46:32 np0005486759.ooo.test podman[279157]: unhealthy
Oct 14 09:46:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-340db1f32c5b8494030b19fa09e01376495b270982b3deb650613f6581d7039b-merged.mount: Deactivated successfully.
Oct 14 09:46:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-340db1f32c5b8494030b19fa09e01376495b270982b3deb650613f6581d7039b-merged.mount: Deactivated successfully.
Oct 14 09:46:34 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:46:34 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:46:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:46:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:46:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:35.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:35.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f58f2b4f8f560729736f5941b846f416eb5c90f8a03f52e63e224ade26f2e564-merged.mount: Deactivated successfully.
Oct 14 09:46:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:46:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46bcb444d4e48307c55b4ae7ac02d3cb4b8dbda16b75b41e26f9c580726eafbe-merged.mount: Deactivated successfully.
Oct 14 09:46:37 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fd45300f46d8608b2dbe48ea53730ffee2e8be48f3fb0ebbc0a31a116495a92c-merged.mount: Deactivated successfully.
Oct 14 09:46:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:46:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully.
Oct 14 09:46:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:39 np0005486759.ooo.test podman[279180]: 2025-10-14 09:46:39.038182644 +0000 UTC m=+0.068971866 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 09:46:39 np0005486759.ooo.test podman[279180]: 2025-10-14 09:46:39.071619399 +0000 UTC m=+0.102408641 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:46:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully.
Oct 14 09:46:39 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:46:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:40.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:40.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:40.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:46:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:40.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:46:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:40.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:46:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:40.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully.
Oct 14 09:46:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully.
Oct 14 09:46:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:46:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:46:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully.
Oct 14 09:46:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:46:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:46:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:46:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:46:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:46:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:46:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:46:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:46:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:46:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:46:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:45.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:46:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:45.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:45.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:46:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:45.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:46:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:45.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:46:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:45.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:46:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-340db1f32c5b8494030b19fa09e01376495b270982b3deb650613f6581d7039b-merged.mount: Deactivated successfully.
Oct 14 09:46:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-340db1f32c5b8494030b19fa09e01376495b270982b3deb650613f6581d7039b-merged.mount: Deactivated successfully.
Oct 14 09:46:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully.
Oct 14 09:46:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:46:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:46:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:46:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:46:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:46:48 np0005486759.ooo.test podman[279202]: 2025-10-14 09:46:48.222058156 +0000 UTC m=+0.078940414 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 09:46:48 np0005486759.ooo.test podman[279202]: 2025-10-14 09:46:48.230321149 +0000 UTC m=+0.087203357 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, tcib_managed=true)
Oct 14 09:46:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f098c0017d0da3f1457e04ccb48f16a39779d6b090c6b44cae8dda4d8a38938b-merged.mount: Deactivated successfully.
Oct 14 09:46:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46bcb444d4e48307c55b4ae7ac02d3cb4b8dbda16b75b41e26f9c580726eafbe-merged.mount: Deactivated successfully.
Oct 14 09:46:49 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:46:49 np0005486759.ooo.test podman[279201]: 2025-10-14 09:46:49.094252961 +0000 UTC m=+0.969415680 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:46:49 np0005486759.ooo.test podman[279201]: 2025-10-14 09:46:49.126225988 +0000 UTC m=+1.001388697 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:46:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-46bcb444d4e48307c55b4ae7ac02d3cb4b8dbda16b75b41e26f9c580726eafbe-merged.mount: Deactivated successfully.
Oct 14 09:46:49 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:46:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11614 DF PROTO=TCP SPT=46172 DPT=9102 SEQ=946100579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9443170000000001030307) 
Oct 14 09:46:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:46:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:46:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:50.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11615 DF PROTO=TCP SPT=46172 DPT=9102 SEQ=946100579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9447010000000001030307) 
Oct 14 09:46:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:46:50 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully.
Oct 14 09:46:51 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:46:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:51 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:51 np0005486759.ooo.test podman[279244]: 2025-10-14 09:46:51.064122626 +0000 UTC m=+0.693446306 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 14 09:46:51 np0005486759.ooo.test podman[279244]: 2025-10-14 09:46:51.14844267 +0000 UTC m=+0.777766380 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:46:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully.
Oct 14 09:46:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:46:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11616 DF PROTO=TCP SPT=46172 DPT=9102 SEQ=946100579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F944F020000000001030307) 
Oct 14 09:46:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully.
Oct 14 09:46:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully.
Oct 14 09:46:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3d32571c90c517218e75b400153bfe2946f348989aeee2613f1e17f32183ce41-merged.mount: Deactivated successfully.
Oct 14 09:46:53 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:53 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:46:53 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:46:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:46:54.148 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:46:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:46:54.149 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:46:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:46:54.150 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:46:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:46:54 np0005486759.ooo.test podman[279268]: 2025-10-14 09:46:54.40493497 +0000 UTC m=+0.039905591 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 09:46:54 np0005486759.ooo.test podman[279268]: 2025-10-14 09:46:54.414221066 +0000 UTC m=+0.049191697 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc.)
Oct 14 09:46:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:54 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully.
Oct 14 09:46:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a10bb81cada1063fdd09337579a73ba5c07dabd1b81c2bfe70924b91722bf534-merged.mount: Deactivated successfully.
Oct 14 09:46:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:46:55.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:46:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:46:55 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:46:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11617 DF PROTO=TCP SPT=46172 DPT=9102 SEQ=946100579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F945EC10000000001030307) 
Oct 14 09:46:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:46:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:46:56 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:46:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:46:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:46:57 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully.
Oct 14 09:46:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:46:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully.
Oct 14 09:46:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully.
Oct 14 09:46:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9dce2160573984ba54f17e563b839daf8c243479b9d2f49c1195fe30690bd2c9-merged.mount: Deactivated successfully.
Oct 14 09:46:59 np0005486759.ooo.test podman[279286]: 2025-10-14 09:46:59.692602799 +0000 UTC m=+2.713576521 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 09:46:59 np0005486759.ooo.test podman[279286]: 2025-10-14 09:46:59.723384624 +0000 UTC m=+2.744358376 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 09:46:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:46:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:47:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:00.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:47:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:47:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:47:01 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:47:01 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:47:02 np0005486759.ooo.test podman[279302]: 2025-10-14 09:47:02.006023164 +0000 UTC m=+0.219485131 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 09:47:02 np0005486759.ooo.test podman[279302]: 2025-10-14 09:47:02.045414173 +0000 UTC m=+0.258876150 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:47:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:47:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:47:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:47:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:47:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36-merged.mount: Deactivated successfully.
Oct 14 09:47:03 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:47:03 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:03 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:03 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:47:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a-merged.mount: Deactivated successfully.
Oct 14 09:47:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:47:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:47:04 np0005486759.ooo.test systemd[1]: tmp-crun.SJ3On6.mount: Deactivated successfully.
Oct 14 09:47:04 np0005486759.ooo.test podman[279321]: 2025-10-14 09:47:04.779928779 +0000 UTC m=+0.095080669 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:47:04 np0005486759.ooo.test podman[279321]: 2025-10-14 09:47:04.813461415 +0000 UTC m=+0.128613305 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:47:04 np0005486759.ooo.test podman[279321]: unhealthy
Oct 14 09:47:05 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:47:05 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:47:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:05.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:05.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861-merged.mount: Deactivated successfully.
Oct 14 09:47:05 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully.
Oct 14 09:47:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:47:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:47:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe-merged.mount: Deactivated successfully.
Oct 14 09:47:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:47:09 np0005486759.ooo.test systemd[1]: tmp-crun.pUi18t.mount: Deactivated successfully.
Oct 14 09:47:09 np0005486759.ooo.test podman[279355]: 2025-10-14 09:47:09.389016426 +0000 UTC m=+0.076040648 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 09:47:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:47:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:10.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:10.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully.
Oct 14 09:47:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully.
Oct 14 09:47:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully.
Oct 14 09:47:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:47:11 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:47:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-56898ab6d39b47764ac69f563001cff1a6e38a16fd0080c65298dff54892d790-merged.mount: Deactivated successfully.
Oct 14 09:47:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:12 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c-merged.mount: Deactivated successfully.
Oct 14 09:47:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec-merged.mount: Deactivated successfully.
Oct 14 09:47:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:47:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:47:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:47:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:47:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:47:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:47:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:47:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:47:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424-merged.mount: Deactivated successfully.
Oct 14 09:47:14 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-3a5231add129a89d0adead7ab11bea3dfa286b532e456cc25a1ad81207e8880c-merged.mount: Deactivated successfully.
Oct 14 09:47:14 np0005486759.ooo.test podman[279355]: 2025-10-14 09:47:14.838178053 +0000 UTC m=+5.525202275 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute)
Oct 14 09:47:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:15.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:47:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c30bb5e31956af76c03d50d112f6bb34230c8fc11c69fcfa029a4de38d1c951d-merged.mount: Deactivated successfully.
Oct 14 09:47:16 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c30bb5e31956af76c03d50d112f6bb34230c8fc11c69fcfa029a4de38d1c951d-merged.mount: Deactivated successfully.
Oct 14 09:47:16 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:47:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:47:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:47:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:47:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:47:18 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:47:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:47:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:47:19 np0005486759.ooo.test podman[279375]: 2025-10-14 09:47:19.248183664 +0000 UTC m=+0.082001105 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:47:19 np0005486759.ooo.test podman[279375]: 2025-10-14 09:47:19.281223585 +0000 UTC m=+0.115040986 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:47:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60788 DF PROTO=TCP SPT=35144 DPT=9102 SEQ=3196706924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F94B8470000000001030307) 
Oct 14 09:47:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:47:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:20.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60789 DF PROTO=TCP SPT=35144 DPT=9102 SEQ=3196706924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F94BC410000000001030307) 
Oct 14 09:47:20 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:47:20 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:20 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-93c0822e715760ae283b5dfa3c054d7f162a497c51033e354a5256453c1ce67c-merged.mount: Deactivated successfully.
Oct 14 09:47:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:47:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:47:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:21 np0005486759.ooo.test podman[279395]: 2025-10-14 09:47:21.425233696 +0000 UTC m=+0.056568867 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:47:21 np0005486759.ooo.test podman[279395]: 2025-10-14 09:47:21.460893244 +0000 UTC m=+0.092228405 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:47:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:47:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60790 DF PROTO=TCP SPT=35144 DPT=9102 SEQ=3196706924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F94C4410000000001030307) 
Oct 14 09:47:22 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d123e2bf97cc7b3622c68162b04c29912e1822cdbe31a1ddf70016995925bac-merged.mount: Deactivated successfully.
Oct 14 09:47:22 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:47:22 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:22 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:47:23 np0005486759.ooo.test podman[279418]: 2025-10-14 09:47:23.434774268 +0000 UTC m=+0.066304743 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251009, managed_by=edpm_ansible)
Oct 14 09:47:23 np0005486759.ooo.test podman[279418]: 2025-10-14 09:47:23.494145285 +0000 UTC m=+0.125675790 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:47:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:47:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0e661e53b0e669d3de31c0b2a91724e8c1c9edc8f2ed2c51ea6a3d77739f7d57-merged.mount: Deactivated successfully.
Oct 14 09:47:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0e661e53b0e669d3de31c0b2a91724e8c1c9edc8f2ed2c51ea6a3d77739f7d57-merged.mount: Deactivated successfully.
Oct 14 09:47:24 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:24 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:47:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:47:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:47:25 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:25 np0005486759.ooo.test sudo[279007]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:25.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:25 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:25 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:25 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:25 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:47:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:47:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:47:26 np0005486759.ooo.test podman[279459]: 2025-10-14 09:47:26.210819946 +0000 UTC m=+0.085519448 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 09:47:26 np0005486759.ooo.test podman[279459]: 2025-10-14 09:47:26.231585487 +0000 UTC m=+0.106285049 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7)
Oct 14 09:47:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60791 DF PROTO=TCP SPT=35144 DPT=9102 SEQ=3196706924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F94D4020000000001030307) 
Oct 14 09:47:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:26.892 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:26.914 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:26.915 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:26 np0005486759.ooo.test systemd[1]: tmp-crun.YYNfpt.mount: Deactivated successfully.
Oct 14 09:47:26 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:47:27 np0005486759.ooo.test sudo[279568]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxqvbpionqfuvdzegwreebqjnmcubisd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760435246.8331578-329-237464268515686/AnsiballZ_edpm_container_manage.py
Oct 14 09:47:27 np0005486759.ooo.test sudo[279568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:27.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:27.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:47:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b6fff9c8e433cbfc969f016d7c00977424b6e0fe3f5e8a6774343b30e6ab0953-merged.mount: Deactivated successfully.
Oct 14 09:47:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c30bb5e31956af76c03d50d112f6bb34230c8fc11c69fcfa029a4de38d1c951d-merged.mount: Deactivated successfully.
Oct 14 09:47:27 np0005486759.ooo.test python3[279570]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:47:27 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c30bb5e31956af76c03d50d112f6bb34230c8fc11c69fcfa029a4de38d1c951d-merged.mount: Deactivated successfully.
Oct 14 09:47:27 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:47:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:28.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:47:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:47:29 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:47:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:47:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:47:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:47:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:30.523 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:30.523 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:30.523 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:47:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:30.524 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:47:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:30.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:47:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee47c660ea26d21ce84215704612469c43166e04b223dbf8f0a2a895de34e216-merged.mount: Deactivated successfully.
Oct 14 09:47:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:31.893 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:47:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:31.893 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:47:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:31.893 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:47:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:31.893 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:47:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:47:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:47:32 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:47:32Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/merged: invalid argument"
Oct 14 09:47:32 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:47:32Z" level=error msg="Getting root fs size for \"f87e0847bec5e01fff51532a48689c9f79b6f700e533f619db8db106569d512d\": getting diffsize of layer \"d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610\" and its parent \"19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8\": creating overlay mount to /var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/R6BH2E3I6E3FR56LIDSQX5YQLH:/var/lib/containers/storage/overlay/l/TOVAZRZK2YBJ5TE4OXQVVN3ALH,upperdir=/var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/diff,workdir=/var/lib/containers/storage/overlay/d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610/work,nodev,metacopy=on\": no such file or directory"
Oct 14 09:47:32 np0005486759.ooo.test kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:32 np0005486759.ooo.test kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:32 np0005486759.ooo.test podman[279585]: 2025-10-14 09:47:32.544003377 +0000 UTC m=+0.348927389 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 09:47:32 np0005486759.ooo.test podman[279585]: 2025-10-14 09:47:32.577355869 +0000 UTC m=+0.382279671 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.043 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.082 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.082 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.082 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.083 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.083 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.097 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.097 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.097 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.097 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.141 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.237 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.238 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.294 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.296 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.348 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.350 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.390 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:47:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.546 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.547 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12620MB free_disk=386.7228813171387GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.548 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.548 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.651 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.652 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.652 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:47:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.700 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.724 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.726 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:47:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:33.726 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:47:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-beb1941435aa71e3442bb0ecaccd1897b68b01e215767a88dee6f86d4122e113-merged.mount: Deactivated successfully.
Oct 14 09:47:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:47:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-0e661e53b0e669d3de31c0b2a91724e8c1c9edc8f2ed2c51ea6a3d77739f7d57-merged.mount: Deactivated successfully.
Oct 14 09:47:34 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:34 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:47:34 np0005486759.ooo.test podman[279624]: 2025-10-14 09:47:34.28335243 +0000 UTC m=+0.149076459 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009)
Oct 14 09:47:34 np0005486759.ooo.test podman[279624]: 2025-10-14 09:47:34.317335899 +0000 UTC m=+0.183059868 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid)
Oct 14 09:47:34 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:34 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:34 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:47:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:47:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:47:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:35.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:35.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:35 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:35 np0005486759.ooo.test kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Oct 14 09:47:35 np0005486759.ooo.test podman[279663]: 2025-10-14 09:47:35.614788128 +0000 UTC m=+0.241160357 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:47:35 np0005486759.ooo.test podman[279663]: 2025-10-14 09:47:35.624477263 +0000 UTC m=+0.250849452 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:47:35 np0005486759.ooo.test podman[279663]: unhealthy
Oct 14 09:47:35 np0005486759.ooo.test podman[279652]: 2025-10-14 09:47:34.974712213 +0000 UTC m=+0.668918844 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Oct 14 09:47:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:47:38 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5-merged.mount: Deactivated successfully.
Oct 14 09:47:38 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:47:38Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Oct 14 09:47:38 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:42:43 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Oct 14 09:47:38 np0005486759.ooo.test podman[279652]: 
Oct 14 09:47:38 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:47:38 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Failed with result 'exit-code'.
Oct 14 09:47:38 np0005486759.ooo.test podman[279652]: 2025-10-14 09:47:38.382901393 +0000 UTC m=+4.077108004 container create 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 09:47:39 np0005486759.ooo.test python3[279570]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Oct 14 09:47:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f479750c98f4a67ffae355a1e79b3c9a76d56699a79b842b4363e69f089cca49-merged.mount: Deactivated successfully.
Oct 14 09:47:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19103ee73e807f98da7458e81c363a7b60ad569c451bd8f50c0daf39d2101319-merged.mount: Deactivated successfully.
Oct 14 09:47:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:40.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:40.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:40.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:47:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:40.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:40.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:40.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:47:41 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:47:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-948d63d72c90238568600bb4ced3a347f3a772760aabfa54019ccce9078bd0ca-merged.mount: Deactivated successfully.
Oct 14 09:47:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:47:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:47:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:47:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:47:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:47:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:47:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:47:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:47:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:44 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d02971ddf65d005a908e4946d9530a2c20c528ccdcb222adb37714b18dbf1610-merged.mount: Deactivated successfully.
Oct 14 09:47:44 np0005486759.ooo.test sudo[279568]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:45 np0005486759.ooo.test sudo[279819]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efqynesgfkouanflllxmzldfhuzlmwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435265.0969963-337-166195477988155/AnsiballZ_stat.py
Oct 14 09:47:45 np0005486759.ooo.test sudo[279819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:45.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:45.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:45.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:47:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:45.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:45.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:45.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:45 np0005486759.ooo.test python3.9[279821]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:47:45 np0005486759.ooo.test sudo[279819]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-19b5df687512785465f13112d48e85c216168957a07bbef3f89b587f68ca7ca8-merged.mount: Deactivated successfully.
Oct 14 09:47:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e0f86229f02c4331620c9ec8e21be769ac9cff125fc1f01f8404fcae9b59e3df-merged.mount: Deactivated successfully.
Oct 14 09:47:46 np0005486759.ooo.test sudo[279931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcupxllnzviqmlsqkzeqwclcvwlhxcqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435265.9174464-346-28885034532486/AnsiballZ_file.py
Oct 14 09:47:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:47:46 np0005486759.ooo.test sudo[279931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:46 np0005486759.ooo.test podman[279933]: 2025-10-14 09:47:46.45623585 +0000 UTC m=+0.084640811 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:47:46 np0005486759.ooo.test podman[279933]: 2025-10-14 09:47:46.498437912 +0000 UTC m=+0.126842923 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2)
Oct 14 09:47:46 np0005486759.ooo.test python3.9[279934]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:47:46 np0005486759.ooo.test sudo[279931]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:46 np0005486759.ooo.test sudo[280004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqrzmgqqgmmroivmsnbljdnpqyklhxhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435265.9174464-346-28885034532486/AnsiballZ_stat.py
Oct 14 09:47:46 np0005486759.ooo.test sudo[280004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:46 np0005486759.ooo.test python3.9[280006]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:47:46 np0005486759.ooo.test sudo[280004]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:47 np0005486759.ooo.test sudo[280113]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhiubeevrouczhzwvxdwfcmgntrdhtjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435267.0274808-346-190673403130850/AnsiballZ_copy.py
Oct 14 09:47:47 np0005486759.ooo.test sudo[280113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:47 np0005486759.ooo.test python3.9[280115]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435267.0274808-346-190673403130850/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:47:47 np0005486759.ooo.test sudo[280113]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:48 np0005486759.ooo.test sudo[280168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myirlvbcahonzywvrbdrinhahndmalqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435267.0274808-346-190673403130850/AnsiballZ_systemd.py
Oct 14 09:47:48 np0005486759.ooo.test sudo[280168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-919c2496449756819846525fbfb351457636bf59d0964ccd47919cff1ec5dc94-merged.mount: Deactivated successfully.
Oct 14 09:47:48 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-dd2af32155cf3aeb6b7d17e8ddc6441e853b3e3b11e0ce4cf44671ff7ab02cc5-merged.mount: Deactivated successfully.
Oct 14 09:47:48 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:42:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 119943 "" "Go-http-client/1.1"
Oct 14 09:47:48 np0005486759.ooo.test podman_exporter[265707]: ts=2025-10-14T09:47:48.497Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 14 09:47:48 np0005486759.ooo.test podman_exporter[265707]: ts=2025-10-14T09:47:48.497Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 14 09:47:48 np0005486759.ooo.test podman_exporter[265707]: ts=2025-10-14T09:47:48.497Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Oct 14 09:47:48 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:47:48 np0005486759.ooo.test python3.9[280170]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:47:48 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:47:48 np0005486759.ooo.test systemd-rc-local-generator[280193]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:47:48 np0005486759.ooo.test systemd-sysv-generator[280196]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:47:48 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:47:48 np0005486759.ooo.test sudo[280168]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53433 DF PROTO=TCP SPT=48088 DPT=9102 SEQ=2597629901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F952D770000000001030307) 
Oct 14 09:47:49 np0005486759.ooo.test sudo[280259]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnhpfikqkcuqgftxsvhofwdiwhjxjeoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435267.0274808-346-190673403130850/AnsiballZ_systemd.py
Oct 14 09:47:49 np0005486759.ooo.test sudo[280259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:50 np0005486759.ooo.test python3.9[280261]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:47:50 np0005486759.ooo.test systemd-rc-local-generator[280287]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:47:50 np0005486759.ooo.test systemd-sysv-generator[280292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:47:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53434 DF PROTO=TCP SPT=48088 DPT=9102 SEQ=2597629901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9531820000000001030307) 
Oct 14 09:47:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:50.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:50.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:50.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:47:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:50.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:50.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:50.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: Starting neutron_sriov_agent container...
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:47:50 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afcd0bdba9d279d8716c426d32d4e836d35f92987983cb6c156b42fabf4d7527/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 14 09:47:50 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afcd0bdba9d279d8716c426d32d4e836d35f92987983cb6c156b42fabf4d7527/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 09:47:50 np0005486759.ooo.test podman[280301]: 2025-10-14 09:47:50.829428586 +0000 UTC m=+0.133442558 container init 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3)
Oct 14 09:47:50 np0005486759.ooo.test podman[280301]: 2025-10-14 09:47:50.836849405 +0000 UTC m=+0.140863407 container start 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:47:50 np0005486759.ooo.test podman[280301]: neutron_sriov_agent
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: tmp-crun.b7fEVT.mount: Deactivated successfully.
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + sudo -E kolla_set_configs
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: Started neutron_sriov_agent container.
Oct 14 09:47:50 np0005486759.ooo.test sudo[280259]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:50 np0005486759.ooo.test podman[280313]: 2025-10-14 09:47:50.909614036 +0000 UTC m=+0.123342110 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Validating config file
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Copying service configuration files
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Writing out command to execute
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf
Oct 14 09:47:50 np0005486759.ooo.test podman[280313]: 2025-10-14 09:47:50.922292849 +0000 UTC m=+0.136020923 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: ++ cat /run_command
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + CMD=/usr/bin/neutron-sriov-nic-agent
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + ARGS=
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + sudo kolla_copy_cacerts
Oct 14 09:47:50 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + [[ ! -n '' ]]
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + . kolla_extend_start
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + umask 0022
Oct 14 09:47:50 np0005486759.ooo.test neutron_sriov_agent[280320]: + exec /usr/bin/neutron-sriov-nic-agent
Oct 14 09:47:52 np0005486759.ooo.test sudo[280455]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxftsarnwntdvslaojukjnbtuursxicf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435271.2022445-374-49208628091737/AnsiballZ_systemd.py
Oct 14 09:47:52 np0005486759.ooo.test sudo[280455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.549 2 INFO neutron.common.config [-] Logging enabled!
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.549 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.550 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.550 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.550 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.550 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.550 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005486759.ooo.test'}
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.551 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-4b4b8320-db3d-4c9d-82ac-ca1bc8127995 - - - - - -] RPC agent_id: nic-switch-agent.np0005486759.ooo.test
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.556 2 INFO neutron.agent.agent_extensions_manager [None req-4b4b8320-db3d-4c9d-82ac-ca1bc8127995 - - - - - -] Loaded agent extensions: ['qos']
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280320]: 2025-10-14 09:47:52.556 2 INFO neutron.agent.agent_extensions_manager [None req-4b4b8320-db3d-4c9d-82ac-ca1bc8127995 - - - - - -] Initializing agent extension 'qos'
Oct 14 09:47:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53435 DF PROTO=TCP SPT=48088 DPT=9102 SEQ=2597629901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9539810000000001030307) 
Oct 14 09:47:52 np0005486759.ooo.test python3.9[280457]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: Stopping neutron_sriov_agent container...
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: libpod-8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e.scope: Deactivated successfully.
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: libpod-8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e.scope: Consumed 1.741s CPU time.
Oct 14 09:47:52 np0005486759.ooo.test podman[280462]: 2025-10-14 09:47:52.709605692 +0000 UTC m=+0.073494613 container died 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=neutron_sriov_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e-userdata-shm.mount: Deactivated successfully.
Oct 14 09:47:52 np0005486759.ooo.test podman[280462]: 2025-10-14 09:47:52.743939492 +0000 UTC m=+0.107828413 container cleanup 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:47:52 np0005486759.ooo.test podman[280462]: neutron_sriov_agent
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-afcd0bdba9d279d8716c426d32d4e836d35f92987983cb6c156b42fabf4d7527-merged.mount: Deactivated successfully.
Oct 14 09:47:52 np0005486759.ooo.test podman[280488]: 2025-10-14 09:47:52.803208337 +0000 UTC m=+0.037602978 container cleanup 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 09:47:52 np0005486759.ooo.test podman[280488]: neutron_sriov_agent
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: Stopped neutron_sriov_agent container.
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: Starting neutron_sriov_agent container...
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:47:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afcd0bdba9d279d8716c426d32d4e836d35f92987983cb6c156b42fabf4d7527/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 14 09:47:52 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afcd0bdba9d279d8716c426d32d4e836d35f92987983cb6c156b42fabf4d7527/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 09:47:52 np0005486759.ooo.test podman[280501]: 2025-10-14 09:47:52.957518278 +0000 UTC m=+0.123289119 container init 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 09:47:52 np0005486759.ooo.test podman[280500]: 2025-10-14 09:47:52.959416953 +0000 UTC m=+0.125470063 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:47:52 np0005486759.ooo.test podman[280501]: 2025-10-14 09:47:52.965787841 +0000 UTC m=+0.131558682 container start 8f58b18ea7240472c1f223e1f4ea08e74db4b4e4ea73a28a9ecdd1dded76361e (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'abb2c277836587c529ef17a5ea5019c03a8bb946148041071ba529d87c2159fb'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:47:52 np0005486759.ooo.test neutron_sriov_agent[280524]: + sudo -E kolla_set_configs
Oct 14 09:47:52 np0005486759.ooo.test podman[280501]: neutron_sriov_agent
Oct 14 09:47:52 np0005486759.ooo.test systemd[1]: Started neutron_sriov_agent container.
Oct 14 09:47:52 np0005486759.ooo.test podman[280500]: 2025-10-14 09:47:52.991075175 +0000 UTC m=+0.157128225 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:47:53 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:47:53 np0005486759.ooo.test sudo[280455]: pam_unix(sudo:session): session closed for user root
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Validating config file
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Copying service configuration files
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Writing out command to execute
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/cd0de74397aa76b626744172300028943e2372ca220b3e27b1c7d2b66ff2832c
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: ++ cat /run_command
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + CMD=/usr/bin/neutron-sriov-nic-agent
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + ARGS=
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + sudo kolla_copy_cacerts
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + [[ ! -n '' ]]
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + . kolla_extend_start
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + umask 0022
Oct 14 09:47:53 np0005486759.ooo.test neutron_sriov_agent[280524]: + exec /usr/bin/neutron-sriov-nic-agent
Oct 14 09:47:53 np0005486759.ooo.test sshd[274958]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:47:53 np0005486759.ooo.test systemd[1]: session-40.scope: Deactivated successfully.
Oct 14 09:47:53 np0005486759.ooo.test systemd[1]: session-40.scope: Consumed 21.068s CPU time.
Oct 14 09:47:53 np0005486759.ooo.test systemd-logind[759]: Session 40 logged out. Waiting for processes to exit.
Oct 14 09:47:53 np0005486759.ooo.test systemd-logind[759]: Removed session 40.
Oct 14 09:47:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:47:54.150 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:47:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:47:54.151 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:47:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:47:54.152 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.568 2 INFO neutron.common.config [-] Logging enabled!
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.568 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.568 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.569 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.569 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.569 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.569 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005486759.ooo.test'}
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.569 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f1da914e-39a5-4f2b-9888-4e5acb66d673 - - - - - -] RPC agent_id: nic-switch-agent.np0005486759.ooo.test
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.574 2 INFO neutron.agent.agent_extensions_manager [None req-f1da914e-39a5-4f2b-9888-4e5acb66d673 - - - - - -] Loaded agent extensions: ['qos']
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.574 2 INFO neutron.agent.agent_extensions_manager [None req-f1da914e-39a5-4f2b-9888-4e5acb66d673 - - - - - -] Initializing agent extension 'qos'
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.943 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f1da914e-39a5-4f2b-9888-4e5acb66d673 - - - - - -] Agent initialized successfully, now running... 
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.943 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f1da914e-39a5-4f2b-9888-4e5acb66d673 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Oct 14 09:47:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 09:47:54.944 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f1da914e-39a5-4f2b-9888-4e5acb66d673 - - - - - -] Agent out of sync with plugin!
Oct 14 09:47:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:47:55 np0005486759.ooo.test podman[280568]: 2025-10-14 09:47:55.443507361 +0000 UTC m=+0.072799023 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 09:47:55 np0005486759.ooo.test podman[280568]: 2025-10-14 09:47:55.477752078 +0000 UTC m=+0.107043760 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 14 09:47:55 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:47:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:55.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:55.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:47:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:55.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:47:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:55.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:55.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:47:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:47:55.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:47:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53436 DF PROTO=TCP SPT=48088 DPT=9102 SEQ=2597629901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9549410000000001030307) 
Oct 14 09:47:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:47:58 np0005486759.ooo.test podman[280593]: 2025-10-14 09:47:58.431921147 +0000 UTC m=+0.062591633 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9)
Oct 14 09:47:58 np0005486759.ooo.test podman[280593]: 2025-10-14 09:47:58.443254951 +0000 UTC m=+0.073925427 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible)
Oct 14 09:47:58 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:47:58 np0005486759.ooo.test sshd[280611]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:47:59 np0005486759.ooo.test sshd[280611]: Accepted publickey for zuul from 192.168.122.31 port 56920 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:47:59 np0005486759.ooo.test systemd-logind[759]: New session 41 of user zuul.
Oct 14 09:47:59 np0005486759.ooo.test systemd[1]: Started Session 41 of User zuul.
Oct 14 09:47:59 np0005486759.ooo.test sshd[280611]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:47:59 np0005486759.ooo.test sshd[277655]: fatal: Timeout before authentication for 113.53.49.145 port 57880
Oct 14 09:48:00 np0005486759.ooo.test python3.9[280722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:48:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:00.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:00.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:00.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:48:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:00.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:00.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:00.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:01 np0005486759.ooo.test sudo[280834]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plsukzdrpkwrscoclwumoglnsegjncof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435280.7571855-35-132467124877400/AnsiballZ_setup.py
Oct 14 09:48:01 np0005486759.ooo.test sudo[280834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:02 np0005486759.ooo.test python3.9[280836]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:48:02 np0005486759.ooo.test sudo[280834]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:02 np0005486759.ooo.test sudo[280897]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrribduoqdvjxdcorujinoftlozwubnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435280.7571855-35-132467124877400/AnsiballZ_dnf.py
Oct 14 09:48:02 np0005486759.ooo.test sudo[280897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:03 np0005486759.ooo.test python3.9[280899]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:48:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:48:04 np0005486759.ooo.test podman[280902]: 2025-10-14 09:48:04.401317913 +0000 UTC m=+0.077539063 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:48:04 np0005486759.ooo.test podman[280902]: 2025-10-14 09:48:04.407945717 +0000 UTC m=+0.084166797 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:48:04 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:48:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:48:05 np0005486759.ooo.test systemd[1]: tmp-crun.69X8WH.mount: Deactivated successfully.
Oct 14 09:48:05 np0005486759.ooo.test podman[280920]: 2025-10-14 09:48:05.44419768 +0000 UTC m=+0.078937353 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, config_id=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 14 09:48:05 np0005486759.ooo.test podman[280920]: 2025-10-14 09:48:05.456299296 +0000 UTC m=+0.091038999 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:48:05 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:48:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:05.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:05.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:05.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:48:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:05.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:05.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:05.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:06 np0005486759.ooo.test sudo[280897]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:06 np0005486759.ooo.test sudo[281046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfkwyuesrqmxeqwnopzrnywwyyqwsoxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435286.3457787-47-258182119952147/AnsiballZ_systemd.py
Oct 14 09:48:06 np0005486759.ooo.test sudo[281046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:07 np0005486759.ooo.test python3.9[281048]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 14 09:48:07 np0005486759.ooo.test sshd[278695]: fatal: Timeout before authentication for 113.53.49.145 port 34862
Oct 14 09:48:07 np0005486759.ooo.test sudo[281046]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:08 np0005486759.ooo.test sudo[281159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anbrdnqcliivpgbvlvidblrazjybiksu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435287.586656-56-10857563956708/AnsiballZ_file.py
Oct 14 09:48:08 np0005486759.ooo.test sudo[281159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:08 np0005486759.ooo.test python3.9[281161]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:08 np0005486759.ooo.test sudo[281159]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:08 np0005486759.ooo.test sudo[281269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfsdktezkzlljqrqpwnnhsjmnxkehwot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435288.4108846-56-128248667141993/AnsiballZ_file.py
Oct 14 09:48:08 np0005486759.ooo.test sudo[281269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:48:08 np0005486759.ooo.test podman[281272]: 2025-10-14 09:48:08.79155491 +0000 UTC m=+0.083972322 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:48:08 np0005486759.ooo.test podman[281272]: 2025-10-14 09:48:08.802398409 +0000 UTC m=+0.094815861 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:48:08 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:48:08 np0005486759.ooo.test python3.9[281271]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:08 np0005486759.ooo.test sudo[281269]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:09 np0005486759.ooo.test sudo[281402]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xspjwwylyzyuiizauyrgdvwbcwdvfkid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435289.02091-56-46913530331679/AnsiballZ_file.py
Oct 14 09:48:09 np0005486759.ooo.test sudo[281402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:09 np0005486759.ooo.test python3.9[281404]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:09 np0005486759.ooo.test sudo[281402]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:09 np0005486759.ooo.test sudo[281512]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyyyijicvvbxgwsgmkuybdqiovjqkzfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435289.6753058-56-254938895217163/AnsiballZ_file.py
Oct 14 09:48:09 np0005486759.ooo.test sudo[281512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:10 np0005486759.ooo.test python3.9[281514]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:10 np0005486759.ooo.test sudo[281512]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:10 np0005486759.ooo.test sudo[281622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulyswbtuwnbqfrzprubunbiwvuqisvrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435290.2663105-56-87088149487292/AnsiballZ_file.py
Oct 14 09:48:10 np0005486759.ooo.test sudo[281622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:10.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:10 np0005486759.ooo.test python3.9[281624]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:10 np0005486759.ooo.test sudo[281622]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:11 np0005486759.ooo.test sudo[281732]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwifmujzjgciidkdwjgyppvycapxbxon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435290.9182196-56-250688781288747/AnsiballZ_file.py
Oct 14 09:48:11 np0005486759.ooo.test sudo[281732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:11 np0005486759.ooo.test python3.9[281734]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:11 np0005486759.ooo.test sudo[281732]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:11 np0005486759.ooo.test sudo[281842]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhdkdrkpvfwgwbiodxkodkcorjaagend ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435291.595694-56-7177563309228/AnsiballZ_file.py
Oct 14 09:48:11 np0005486759.ooo.test sudo[281842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:12 np0005486759.ooo.test python3.9[281844]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:12 np0005486759.ooo.test sudo[281842]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:48:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:48:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:48:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 123700 "" "Go-http-client/1.1"
Oct 14 09:48:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:48:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 14788 "" "Go-http-client/1.1"
Oct 14 09:48:12 np0005486759.ooo.test sudo[281954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoijhztiwxsmqunekwegvolehtcrduiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435292.2550902-106-229623400684092/AnsiballZ_stat.py
Oct 14 09:48:12 np0005486759.ooo.test sudo[281954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:12 np0005486759.ooo.test python3.9[281956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:13 np0005486759.ooo.test sudo[281954]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:13 np0005486759.ooo.test sudo[282042]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llbrvxvvtjljkiqxdfdqqiigqtsimefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435292.2550902-106-229623400684092/AnsiballZ_copy.py
Oct 14 09:48:13 np0005486759.ooo.test sudo[282042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:13 np0005486759.ooo.test python3.9[282044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435292.2550902-106-229623400684092/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:13 np0005486759.ooo.test sudo[282042]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:48:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:48:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:48:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:48:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:48:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:48:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:48:14 np0005486759.ooo.test python3.9[282152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:15.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:15 np0005486759.ooo.test python3.9[282238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435293.9561918-121-107295571803695/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:16 np0005486759.ooo.test python3.9[282346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:17 np0005486759.ooo.test python3.9[282432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435296.1239064-121-170049472136506/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:18 np0005486759.ooo.test python3.9[282540]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:18 np0005486759.ooo.test python3.9[282626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435297.298652-121-156869688764004/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=62a00b1f6b35bfe68ffb3c580fcd2e62ee493c4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:48:19 np0005486759.ooo.test podman[282644]: 2025-10-14 09:48:19.444331019 +0000 UTC m=+0.067551168 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:48:19 np0005486759.ooo.test podman[282644]: 2025-10-14 09:48:19.462429163 +0000 UTC m=+0.085649312 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:48:19 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:48:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30308 DF PROTO=TCP SPT=39256 DPT=9102 SEQ=2250804102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F95A2A70000000001030307) 
Oct 14 09:48:20 np0005486759.ooo.test python3.9[282753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30309 DF PROTO=TCP SPT=39256 DPT=9102 SEQ=2250804102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F95A6C20000000001030307) 
Oct 14 09:48:20 np0005486759.ooo.test python3.9[282839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435299.6538815-179-227612575494085/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=95b1899b47d8e4bacfcacfb8a591296c275f90be backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:20.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:21 np0005486759.ooo.test python3.9[282947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:48:21 np0005486759.ooo.test systemd[1]: tmp-crun.nu1pVo.mount: Deactivated successfully.
Oct 14 09:48:21 np0005486759.ooo.test podman[282948]: 2025-10-14 09:48:21.454282605 +0000 UTC m=+0.077394499 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:48:21 np0005486759.ooo.test podman[282948]: 2025-10-14 09:48:21.468181894 +0000 UTC m=+0.091293758 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 09:48:21 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:48:21 np0005486759.ooo.test python3.9[283053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435300.8352616-194-137368988221511/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:22 np0005486759.ooo.test python3.9[283161]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30310 DF PROTO=TCP SPT=39256 DPT=9102 SEQ=2250804102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F95AEC10000000001030307) 
Oct 14 09:48:23 np0005486759.ooo.test python3.9[283247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435302.076246-194-56811384778371/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:48:23 np0005486759.ooo.test podman[283278]: 2025-10-14 09:48:23.435904756 +0000 UTC m=+0.063557571 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:48:23 np0005486759.ooo.test podman[283278]: 2025-10-14 09:48:23.440970895 +0000 UTC m=+0.068623730 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:48:23 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:48:23 np0005486759.ooo.test python3.9[283378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:24 np0005486759.ooo.test python3.9[283433]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.445 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.445 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.449 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74c42ea4-a8be-45f1-a33b-8b7f12c44a43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.445995', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed38fb42-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '2033ee36a75164f161e44bf5640916bedb7a660d83944ad2dbc4375d2513ab4c'}]}, 'timestamp': '2025-10-14 09:48:24.449679', '_unique_id': '3b148902e3b84b10b9ad239ad1e17cef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.450 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0034ecda-6457-4e3a-9830-8149ac1f9923', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.451295', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed39452a-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '45b5c2bfafa5b64d4519a3fd99234b7f1d8f57ee017af2da0418787dae1b63da'}]}, 'timestamp': '2025-10-14 09:48:24.451523', '_unique_id': '22facc9eb4134f7a87dee867b0060868'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.451 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.452 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.484 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 1288814026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.485 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 10812347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05558bb4-31e0-4839-b9e3-e5e79bfdaca3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1288814026, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.452578', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed3e6064-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': '992db0a5b98dccf94d52844b44a5b10e4977d602745102d17fbe3908490203ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10812347, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.452578', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed3e71c6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': '8715111d098f91d676189ab1e3be36eae04b529a2209970efa637b942b067a93'}]}, 'timestamp': '2025-10-14 09:48:24.485481', '_unique_id': '891801a228c14aa5aab8f2c7917e54a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.486 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.487 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.487 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.487 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 9773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4670da9f-b3b3-4b29-b8b3-62e0b2454957', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9773, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.487862', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed3edf62-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '5c6169c135f68fd51a0245ac93819bf93431a9627d4d69477b1a9e00e2ad1ec6'}]}, 'timestamp': '2025-10-14 09:48:24.488298', '_unique_id': 'c07a53cd69dd482090eb5db0e14b9b40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.490 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab829a40-0d74-4121-903d-5031a79cac32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.489942', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed3f30fc-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '3aa00ab5c590105d918b1fb4a3e121a0135f28fd15009c28a3cbac9cc0abf264'}]}, 'timestamp': '2025-10-14 09:48:24.490393', '_unique_id': '901bfb1322ce4f91902c923fc6dc57ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.492 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6365d89-9401-4c3f-b8f3-0ae1ebee40b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.492005', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed3f7fee-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '55c0538b04a4ba254e61d372684501bf0c6fd3d4c967881a9532dede62842b6c'}]}, 'timestamp': '2025-10-14 09:48:24.492430', '_unique_id': '4a0e56e38a234e838c35482cda8b34c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.494 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3de17d5-2166-4d07-bddb-e606d9544532', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.494044', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed3fd098-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '4de5c6bbc74baa04cdb35445b167e7aa0df8951149570b3a24b0cf7469747e02'}]}, 'timestamp': '2025-10-14 09:48:24.494473', '_unique_id': '4981e9c462ab4ef9a6a50cb62d956b7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.496 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.496 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '187a9bc0-6ad5-40fd-9eec-254a846e3915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.496017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed401c10-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': '3500a6c427d575d4095fb23470f30804da679b68b41e43fbc05c072f80e66297'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.496017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed402958-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': 'f618574bcd495ce766b8db6206bb4208d5a4b1c90c7267849d1e45ef13a760ce'}]}, 'timestamp': '2025-10-14 09:48:24.496718', '_unique_id': 'ea7440951273453f83aa0e257d882bd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.498 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.498 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55a6d3a6-07f9-42d2-b524-08b1b00374fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.498460', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed4079bc-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': 'fc9afa405f11bc5da6e950204889f0ad8c52323333c5dcf84ee8884e996298dc'}]}, 'timestamp': '2025-10-14 09:48:24.498796', '_unique_id': 'e4556f547829489bac3b4c1342c7f5ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.499 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.500 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b676e9eb-109a-4abf-89a3-3ad750c71f13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.500376', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed40c430-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '9b15936a99dc4f128a93c873e588056f63db6c6142a0bb680bd6f51dd55dd0f5'}]}, 'timestamp': '2025-10-14 09:48:24.500697', '_unique_id': '7c47bfed329f40a086490dc4809d336e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.502 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.517 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 52.17578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ae3174b-eba5-40f8-a8a2-341bfcd6884b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.17578125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:48:24.502191', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ed436686-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.712482975, 'message_signature': '8f61eec26022cb51c4ddc6bc10e73836dca27f59de638122e065fb112f4e46a8'}]}, 'timestamp': '2025-10-14 09:48:24.518016', '_unique_id': '4c439150e87e4aba84da7d543830bcff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.519 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.532 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31129600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.533 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e7200cd-eb0f-4900-a0e8-7dda921bc7f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31129600, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.519942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed45bed6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.715056571, 'message_signature': 'a679cb1d81591a0102e40ad642365fba4fdc67e2a17e80c00499b21c8ce6d763'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 
'timestamp': '2025-10-14T09:48:24.519942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed45cea8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.715056571, 'message_signature': '46dcd59375c306475e2fa6f89168068a349982712db816b7f88f339e2df41771'}]}, 'timestamp': '2025-10-14 09:48:24.533761', '_unique_id': '16a6e8fd56744c5d8ea824ac8c795038'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.535 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.536 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.536 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1206d17a-3201-4769-846b-3f9f9d250c08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.535945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed463636-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': 'b8c75ecb024e4a9b75771fe9ca956ab619496a40bbe708f10338f85b2d98aeb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.535945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed4646bc-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': 'a9a3765a4b19653d6f952a53dc103a3a2f4ebf3d005674b299843137ea9c5d33'}]}, 'timestamp': '2025-10-14 09:48:24.536860', '_unique_id': '167776307eb848d6b46e1c49dc2728c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.538 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.539 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99ea5d84-ee94-4616-a696-cdba00557f95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.539100', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed46b872-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': '39e4bc5655cc3cd1169bd665069d5096c4b3460a254922c0354517cddf528521'}]}, 'timestamp': '2025-10-14 09:48:24.539792', '_unique_id': '4ec40145ea774553a0f51b6fedfc5187'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.541 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.542 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.542 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c550b9fb-a21d-4de9-901a-bacd1e1ef27c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.542072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed4724b0-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': 'cd5493e4197246d115d5e8603d3b2f87382c8de17689cf1f11567ffa89234454'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.542072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed4734dc-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': '421899fee053e42bc6a129aac459c71b20ebc6c91c5a0668a1aa450c9182ed5c'}]}, 'timestamp': '2025-10-14 09:48:24.542974', '_unique_id': '4676249042de4134b7ed776b57fddd08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.545 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.545 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd5b5371-f7b8-45d6-ba9f-48c18d534e97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.545101', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed479a26-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.715056571, 'message_signature': 'c26c9815f382948a924b5403ceb95c0f698c894a2ae2c747776e17ed21ddab9b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.545101', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed47aa3e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.715056571, 'message_signature': 'e5e4337e2316d0c4ff52ca490bfb394caab43ffd56cc3e816dc20b4505ac06ba'}]}, 'timestamp': '2025-10-14 09:48:24.545987', '_unique_id': '8597d35b98684f529b3aab3757da9676'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.548 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.548 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f246c50-814e-4190-99a7-9f6b317ff86b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:48:24.548134', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'ed4810c8-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.641051334, 'message_signature': 'ee8bd92d4fdfdb8791941962a963616b71a6ad758375ec5fdc4694641a4883cf'}]}, 'timestamp': '2025-10-14 09:48:24.548605', '_unique_id': 'a63fd22f62a0468e948811631ec3e3df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.550 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.550 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.551 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0019078-45af-4081-a2da-abe2046568cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.550709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed487518-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.715056571, 'message_signature': '3b18029d1016e12bf647592208abd0b49dc295129f4b1f9117758c3753ecf46e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.550709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed488648-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.715056571, 'message_signature': '918fe30e7f54d69077be883512f910540e8b89a8e3947ed517f83a0f76759a09'}]}, 'timestamp': '2025-10-14 09:48:24.551590', '_unique_id': 'ba642335b76d4d2eb4ce11e0ad7599cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.554 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.554 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 513177663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.554 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 75228955 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94417631-9a89-4638-bbfd-1a44662ed48f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513177663, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.554176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed48fdc6-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': '88bb7098e92a85b9c536ee40eee995c95767dd17651d0ef7ac08aedfdad6cd26'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75228955, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.554176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed490e38-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': '4234c36e54002ed606bcaa9bab88c4c6af552acd99d2694e8409143ba298239b'}]}, 'timestamp': '2025-10-14 09:48:24.555104', '_unique_id': '05e25918ff224f8bbc8e9d49350c3334'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.557 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.557 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6160b40-be04-40a7-b873-94b0253ad510', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 591, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:48:24.557373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed497940-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': '59f7a6e37f045499a1aae443a508a46342b6fd30a50b8e1c010e66d77e291602'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:48:24.557373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed498a70-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.647646927, 'message_signature': 'c9f5501ac4156461d7f9e70015e303ed87ce4a630ebc21561c876d8ab52a4585'}]}, 'timestamp': '2025-10-14 09:48:24.558332', '_unique_id': '4f724441cd364738a79f050e29e95bdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.560 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.560 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 50150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '762a995c-a206-43a5-b4fe-7f04c25b85a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50150000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:48:24.560470', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ed49f24e-a8e2-11f0-b515-fa163eba5220', 'monotonic_time': 11324.712482975, 'message_signature': 'ead44c9913853fd2fe780d5e73b9bcf588251260c79bf69f09c6de2647e1b6ec'}]}, 'timestamp': '2025-10-14 09:48:24.560916', '_unique_id': '532c02a4d0a844ceb58ddb33733cb458'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:48:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:48:24.562 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:48:24 np0005486759.ooo.test python3.9[283541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:25 np0005486759.ooo.test python3.9[283627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435304.4852264-223-201463368720884/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:25.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:25.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:25.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:48:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:25.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:25.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:25.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:26 np0005486759.ooo.test python3.9[283735]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:48:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:48:26 np0005486759.ooo.test podman[283807]: 2025-10-14 09:48:26.452105582 +0000 UTC m=+0.079543622 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 09:48:26 np0005486759.ooo.test podman[283807]: 2025-10-14 09:48:26.518395433 +0000 UTC m=+0.145833463 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 14 09:48:26 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:48:26 np0005486759.ooo.test sudo[283871]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cifythdegqrwbpsokbkvpwyooxsqqfyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435306.2395937-258-261280230552838/AnsiballZ_file.py
Oct 14 09:48:26 np0005486759.ooo.test sudo[283871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30311 DF PROTO=TCP SPT=39256 DPT=9102 SEQ=2250804102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F95BE820000000001030307) 
Oct 14 09:48:26 np0005486759.ooo.test python3.9[283873]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:26 np0005486759.ooo.test sudo[283871]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:27 np0005486759.ooo.test sudo[283981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnbgtvkdqseouqokiellkkqbvjtazozp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435306.9580553-266-96990509342875/AnsiballZ_stat.py
Oct 14 09:48:27 np0005486759.ooo.test sudo[283981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:27 np0005486759.ooo.test python3.9[283983]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:27 np0005486759.ooo.test sudo[283981]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:27 np0005486759.ooo.test sudo[284038]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnhdarecqmhdkehykxbnlhpjiwssldfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435306.9580553-266-96990509342875/AnsiballZ_file.py
Oct 14 09:48:27 np0005486759.ooo.test sudo[284038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:27 np0005486759.ooo.test python3.9[284040]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:27 np0005486759.ooo.test sudo[284038]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:28.141 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:28 np0005486759.ooo.test sudo[284148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khxwxinkqzkhwvejhyxktdhvgdhobfrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435308.0733428-266-165862885434724/AnsiballZ_stat.py
Oct 14 09:48:28 np0005486759.ooo.test sudo[284148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:28.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:28.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:28 np0005486759.ooo.test python3.9[284150]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:28 np0005486759.ooo.test sudo[284148]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:28 np0005486759.ooo.test sudo[284205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unhxwmlgagqtzzofkucuqzdgesetneed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435308.0733428-266-165862885434724/AnsiballZ_file.py
Oct 14 09:48:28 np0005486759.ooo.test sudo[284205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:48:28 np0005486759.ooo.test systemd[1]: tmp-crun.RpKZIQ.mount: Deactivated successfully.
Oct 14 09:48:28 np0005486759.ooo.test podman[284208]: 2025-10-14 09:48:28.888895497 +0000 UTC m=+0.093327717 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Oct 14 09:48:28 np0005486759.ooo.test podman[284208]: 2025-10-14 09:48:28.898558541 +0000 UTC m=+0.102990781 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter)
Oct 14 09:48:28 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:48:29 np0005486759.ooo.test python3.9[284207]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:29 np0005486759.ooo.test sudo[284205]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:29.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:29.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:48:30 np0005486759.ooo.test sudo[284335]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-budtgygisarvzdgukuiaotxcferljmhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435309.1654787-289-48859741661933/AnsiballZ_file.py
Oct 14 09:48:30 np0005486759.ooo.test sudo[284335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:30 np0005486759.ooo.test python3.9[284337]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:30 np0005486759.ooo.test sudo[284335]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.499 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.499 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.499 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:30 np0005486759.ooo.test sudo[284445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uinifjuyvbwhmpqcdvdhchbfyojovjhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435310.5371253-297-178922295844835/AnsiballZ_stat.py
Oct 14 09:48:30 np0005486759.ooo.test sudo[284445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.898 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.899 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.899 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:48:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:30.899 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:48:31 np0005486759.ooo.test python3.9[284447]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:31 np0005486759.ooo.test sudo[284445]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:31.289 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:48:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:31.304 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:48:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:31.304 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:48:31 np0005486759.ooo.test sudo[284502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unraaspvxagkleaaitqbhoelrjbzrojd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435310.5371253-297-178922295844835/AnsiballZ_file.py
Oct 14 09:48:31 np0005486759.ooo.test sudo[284502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:31 np0005486759.ooo.test python3.9[284504]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:31 np0005486759.ooo.test sudo[284502]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.299 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.532 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.532 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.532 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.532 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.588 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:48:32 np0005486759.ooo.test sudo[284613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stljkqxzovrwqwmzcjhwltyiuecshlio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435311.7117546-309-144059635016426/AnsiballZ_stat.py
Oct 14 09:48:32 np0005486759.ooo.test sudo[284613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.649 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.651 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.719 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.720 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.772 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.773 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:48:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:32.825 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:48:32 np0005486759.ooo.test python3.9[284615]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:32 np0005486759.ooo.test sudo[284613]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.005 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.006 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12548MB free_disk=386.72137451171875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.006 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.007 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.094 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.095 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.095 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:48:33 np0005486759.ooo.test sudo[284681]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnabhrqtwhybmopybuadyssmsqfgtifu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435311.7117546-309-144059635016426/AnsiballZ_file.py
Oct 14 09:48:33 np0005486759.ooo.test sudo[284681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.144 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.158 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.161 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:48:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:33.161 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:48:33 np0005486759.ooo.test python3.9[284683]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:33 np0005486759.ooo.test sudo[284681]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:33 np0005486759.ooo.test sudo[284791]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwxevzcpwpckcjkdzqjphplwgucdjxoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435313.4670033-321-220569859456304/AnsiballZ_systemd.py
Oct 14 09:48:33 np0005486759.ooo.test sudo[284791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:34 np0005486759.ooo.test python3.9[284793]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:48:34 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:48:34 np0005486759.ooo.test systemd-rc-local-generator[284811]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:48:34 np0005486759.ooo.test systemd-sysv-generator[284817]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:48:34 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:48:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:48:34 np0005486759.ooo.test sudo[284791]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:34 np0005486759.ooo.test podman[284830]: 2025-10-14 09:48:34.530049003 +0000 UTC m=+0.081035486 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:48:34 np0005486759.ooo.test podman[284830]: 2025-10-14 09:48:34.563391003 +0000 UTC m=+0.114377436 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:48:34 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:48:34 np0005486759.ooo.test sudo[284956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sruthsibblkawfqwdskhyqyswptsxppm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435314.6472015-329-14297157465784/AnsiballZ_stat.py
Oct 14 09:48:35 np0005486759.ooo.test sudo[284956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:35 np0005486759.ooo.test python3.9[284958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:35 np0005486759.ooo.test sudo[284956]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:35 np0005486759.ooo.test sudo[285013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lplrzzxwuiziptimueosertuewibitby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435314.6472015-329-14297157465784/AnsiballZ_file.py
Oct 14 09:48:35 np0005486759.ooo.test sudo[285013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:35 np0005486759.ooo.test python3.9[285015]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:35 np0005486759.ooo.test sudo[285013]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:35.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:36 np0005486759.ooo.test sudo[285123]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swgcosohjnzzrwggcjjddnrzaeauqekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435315.7954404-341-281177125016036/AnsiballZ_stat.py
Oct 14 09:48:36 np0005486759.ooo.test sudo[285123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:48:36 np0005486759.ooo.test podman[285126]: 2025-10-14 09:48:36.141137411 +0000 UTC m=+0.051424375 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:48:36 np0005486759.ooo.test podman[285126]: 2025-10-14 09:48:36.150168666 +0000 UTC m=+0.060455640 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009)
Oct 14 09:48:36 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:48:36 np0005486759.ooo.test python3.9[285125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:36 np0005486759.ooo.test sudo[285123]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:36 np0005486759.ooo.test sudo[285199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myubfujirzzlrprlexncqzbdbjwbsokz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435315.7954404-341-281177125016036/AnsiballZ_file.py
Oct 14 09:48:36 np0005486759.ooo.test sudo[285199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:36 np0005486759.ooo.test python3.9[285201]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:36 np0005486759.ooo.test sudo[285199]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:37 np0005486759.ooo.test sudo[285309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-achkfyywmcogsbqyzwdkfpojkeieivdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435316.8461664-353-236928869857511/AnsiballZ_systemd.py
Oct 14 09:48:37 np0005486759.ooo.test sudo[285309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:37 np0005486759.ooo.test python3.9[285311]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:48:37 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:48:37 np0005486759.ooo.test systemd-rc-local-generator[285338]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:48:37 np0005486759.ooo.test systemd-sysv-generator[285341]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:48:37 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:48:37 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:48:37 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:48:37 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:48:37 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:48:37 np0005486759.ooo.test sudo[285309]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:38 np0005486759.ooo.test sudo[285462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kixzofdwmggphwjcjmjwebezkuzsiqec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435318.243186-363-236620201591544/AnsiballZ_file.py
Oct 14 09:48:38 np0005486759.ooo.test sudo[285462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:38 np0005486759.ooo.test python3.9[285464]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:48:38 np0005486759.ooo.test sudo[285462]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:39 np0005486759.ooo.test sudo[285572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orsxbwoplkbfwgvidgovmdpmaiaboufq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435319.020227-371-111697036673589/AnsiballZ_stat.py
Oct 14 09:48:39 np0005486759.ooo.test sudo[285572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:48:39 np0005486759.ooo.test systemd[1]: tmp-crun.DUsCJq.mount: Deactivated successfully.
Oct 14 09:48:39 np0005486759.ooo.test podman[285574]: 2025-10-14 09:48:39.432565824 +0000 UTC m=+0.089677620 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:48:39 np0005486759.ooo.test podman[285574]: 2025-10-14 09:48:39.443277169 +0000 UTC m=+0.100388945 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:48:39 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:48:39 np0005486759.ooo.test python3.9[285575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:48:39 np0005486759.ooo.test sudo[285572]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:39 np0005486759.ooo.test sudo[285682]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhqkhsusxckqvbbpmdyxdszccyqiyhdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435319.020227-371-111697036673589/AnsiballZ_copy.py
Oct 14 09:48:39 np0005486759.ooo.test sudo[285682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:40 np0005486759.ooo.test python3.9[285684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760435319.020227-371-111697036673589/.source.json _original_basename=.3exk4utt follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:40 np0005486759.ooo.test sudo[285682]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:40 np0005486759.ooo.test sudo[285792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrpddxsxkrtwcvxrfcfdrcyvnasdcrqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435320.2840796-386-171365619892534/AnsiballZ_file.py
Oct 14 09:48:40 np0005486759.ooo.test sudo[285792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:40.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:40.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:40 np0005486759.ooo.test python3.9[285794]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:40 np0005486759.ooo.test sudo[285792]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:41 np0005486759.ooo.test sudo[285902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlhelnxqyoqnkqczltnxuxlkhgepluvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435321.0018702-394-93554642492326/AnsiballZ_stat.py
Oct 14 09:48:41 np0005486759.ooo.test sudo[285902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:41 np0005486759.ooo.test sudo[285902]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:48:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:48:42 np0005486759.ooo.test sudo[285990]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbfvoehwxkhfvevvpxwwrgzjwjbveruv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435321.0018702-394-93554642492326/AnsiballZ_copy.py
Oct 14 09:48:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:48:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 123700 "" "Go-http-client/1.1"
Oct 14 09:48:42 np0005486759.ooo.test sudo[285990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:48:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 14790 "" "Go-http-client/1.1"
Oct 14 09:48:42 np0005486759.ooo.test sudo[285990]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:43 np0005486759.ooo.test sudo[286100]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwkvoytzfvuyocpsafobvjfewwmgzcyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435322.7746487-411-33340062810340/AnsiballZ_container_config_data.py
Oct 14 09:48:43 np0005486759.ooo.test sudo[286100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:43 np0005486759.ooo.test python3.9[286102]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Oct 14 09:48:43 np0005486759.ooo.test sudo[286100]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:48:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:48:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:48:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:48:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:48:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:48:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:48:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:48:44 np0005486759.ooo.test sudo[286210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxxbjdfhcmtenjpolmrqmfldtmqnbcfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435323.6000478-420-117863645595510/AnsiballZ_container_config_hash.py
Oct 14 09:48:44 np0005486759.ooo.test sudo[286210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:45 np0005486759.ooo.test python3.9[286212]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:48:45 np0005486759.ooo.test sudo[286210]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:45.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:45 np0005486759.ooo.test sudo[286320]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aexfkqmuinutllzixpqllzgvvzdceubs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435325.323023-429-124952356401768/AnsiballZ_podman_container_info.py
Oct 14 09:48:45 np0005486759.ooo.test sudo[286320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:45 np0005486759.ooo.test python3.9[286322]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:48:46 np0005486759.ooo.test sudo[286320]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:48 np0005486759.ooo.test sudo[286458]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzygmywxypnmaespwnwctgpeoniaydyt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760435327.9555879-442-60277157249995/AnsiballZ_edpm_container_manage.py
Oct 14 09:48:48 np0005486759.ooo.test sudo[286458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:48 np0005486759.ooo.test python3[286460]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:48:49 np0005486759.ooo.test podman[286498]: 
Oct 14 09:48:49 np0005486759.ooo.test podman[286498]: 2025-10-14 09:48:49.092052534 +0000 UTC m=+0.083366114 container create c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Oct 14 09:48:49 np0005486759.ooo.test podman[286498]: 2025-10-14 09:48:49.045259567 +0000 UTC m=+0.036573167 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 09:48:49 np0005486759.ooo.test python3[286460]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume 
/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 09:48:49 np0005486759.ooo.test sudo[286458]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2661 DF PROTO=TCP SPT=37036 DPT=9102 SEQ=872325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9617D80000000001030307) 
Oct 14 09:48:49 np0005486759.ooo.test sudo[286641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edxqzndzysyabvndcdqqwvtlpqnmewya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435329.4266608-450-16336275154275/AnsiballZ_stat.py
Oct 14 09:48:49 np0005486759.ooo.test sudo[286641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:48:49 np0005486759.ooo.test podman[286644]: 2025-10-14 09:48:49.850027117 +0000 UTC m=+0.085434504 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:48:49 np0005486759.ooo.test podman[286644]: 2025-10-14 09:48:49.891371414 +0000 UTC m=+0.126778811 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Oct 14 09:48:49 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:48:49 np0005486759.ooo.test python3.9[286643]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:48:49 np0005486759.ooo.test sudo[286641]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2662 DF PROTO=TCP SPT=37036 DPT=9102 SEQ=872325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F961BC10000000001030307) 
Oct 14 09:48:50 np0005486759.ooo.test sudo[286770]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxqdjjacplzevrnekslkdpumlligtfnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435330.2751927-459-48995705436352/AnsiballZ_file.py
Oct 14 09:48:50 np0005486759.ooo.test sudo[286770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:50.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:48:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:50.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:50.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:48:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:50.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:50.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:48:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:50.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:50 np0005486759.ooo.test python3.9[286772]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:50 np0005486759.ooo.test sudo[286770]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:51 np0005486759.ooo.test sudo[286825]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyajmurdmxxqtupvqubynboabueuyvlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435330.2751927-459-48995705436352/AnsiballZ_stat.py
Oct 14 09:48:51 np0005486759.ooo.test sudo[286825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:51 np0005486759.ooo.test python3.9[286827]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:48:51 np0005486759.ooo.test sudo[286825]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:51 np0005486759.ooo.test sudo[286934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcnetdsjvtjymubhiccbnlzfjbptjhjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435331.3510487-459-193690328137310/AnsiballZ_copy.py
Oct 14 09:48:51 np0005486759.ooo.test sudo[286934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:48:51 np0005486759.ooo.test podman[286937]: 2025-10-14 09:48:51.967773185 +0000 UTC m=+0.074277597 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.build-date=20251009, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 09:48:51 np0005486759.ooo.test podman[286937]: 2025-10-14 09:48:51.97953205 +0000 UTC m=+0.086036482 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:48:51 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:48:52 np0005486759.ooo.test python3.9[286936]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435331.3510487-459-193690328137310/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:48:52 np0005486759.ooo.test sudo[286934]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:52 np0005486759.ooo.test sudo[287009]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gevvofozrqnhlpymnziwxcujuxsxtonr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435331.3510487-459-193690328137310/AnsiballZ_systemd.py
Oct 14 09:48:52 np0005486759.ooo.test sudo[287009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2663 DF PROTO=TCP SPT=37036 DPT=9102 SEQ=872325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9623C10000000001030307) 
Oct 14 09:48:52 np0005486759.ooo.test python3.9[287011]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:48:52 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:48:52 np0005486759.ooo.test systemd-rc-local-generator[287038]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:48:52 np0005486759.ooo.test systemd-sysv-generator[287041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:48:52 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:48:53 np0005486759.ooo.test sudo[287009]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:53 np0005486759.ooo.test sudo[287099]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbsegzwrarwnkrxlfbwjmqodgiqsyhfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435331.3510487-459-193690328137310/AnsiballZ_systemd.py
Oct 14 09:48:53 np0005486759.ooo.test sudo[287099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:53 np0005486759.ooo.test python3.9[287101]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:48:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:48:53 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:48:53 np0005486759.ooo.test podman[287103]: 2025-10-14 09:48:53.855210845 +0000 UTC m=+0.060176223 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:48:53 np0005486759.ooo.test podman[287103]: 2025-10-14 09:48:53.865210959 +0000 UTC m=+0.070176327 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:48:53 np0005486759.ooo.test systemd-rc-local-generator[287143]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:48:53 np0005486759.ooo.test systemd-sysv-generator[287150]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:48:53 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:48:54 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:48:54 np0005486759.ooo.test systemd[1]: Starting neutron_dhcp_agent container...
Oct 14 09:48:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:48:54.152 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:48:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:48:54.153 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:48:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:48:54.154 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:48:54 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:48:54 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cd18aac39515c6f47e78dbb90133681b8abf3115daaca39ff2325194c26cdda/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 14 09:48:54 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cd18aac39515c6f47e78dbb90133681b8abf3115daaca39ff2325194c26cdda/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 09:48:54 np0005486759.ooo.test podman[287164]: 2025-10-14 09:48:54.270853365 +0000 UTC m=+0.125204994 container init c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:48:54 np0005486759.ooo.test podman[287164]: 2025-10-14 09:48:54.282025964 +0000 UTC m=+0.136377593 container start c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:48:54 np0005486759.ooo.test podman[287164]: neutron_dhcp_agent
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + sudo -E kolla_set_configs
Oct 14 09:48:54 np0005486759.ooo.test systemd[1]: Started neutron_dhcp_agent container.
Oct 14 09:48:54 np0005486759.ooo.test sudo[287099]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Validating config file
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Copying service configuration files
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Writing out command to execute
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/cd0de74397aa76b626744172300028943e2372ca220b3e27b1c7d2b66ff2832c
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: ++ cat /run_command
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + CMD=/usr/bin/neutron-dhcp-agent
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + ARGS=
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + sudo kolla_copy_cacerts
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + [[ ! -n '' ]]
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + . kolla_extend_start
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: Running command: '/usr/bin/neutron-dhcp-agent'
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + umask 0022
Oct 14 09:48:54 np0005486759.ooo.test neutron_dhcp_agent[287179]: + exec /usr/bin/neutron-dhcp-agent
Oct 14 09:48:54 np0005486759.ooo.test sudo[287300]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-doohpdbrfvtkzdzuzkrjlttexvylopsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435334.505306-487-260838569676094/AnsiballZ_systemd.py
Oct 14 09:48:54 np0005486759.ooo.test sudo[287300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:48:55 np0005486759.ooo.test python3.9[287302]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: Stopping neutron_dhcp_agent container...
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: libpod-c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1.scope: Deactivated successfully.
Oct 14 09:48:55 np0005486759.ooo.test podman[287307]: 2025-10-14 09:48:55.272736057 +0000 UTC m=+0.067944410 container died c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, config_id=neutron_dhcp, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent)
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: tmp-crun.e54HG2.mount: Deactivated successfully.
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1-userdata-shm.mount: Deactivated successfully.
Oct 14 09:48:55 np0005486759.ooo.test podman[287307]: 2025-10-14 09:48:55.319218425 +0000 UTC m=+0.114426778 container cleanup c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:48:55 np0005486759.ooo.test podman[287307]: neutron_dhcp_agent
Oct 14 09:48:55 np0005486759.ooo.test podman[287344]: error opening file `/run/crun/c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1/status`: No such file or directory
Oct 14 09:48:55 np0005486759.ooo.test podman[287333]: 2025-10-14 09:48:55.413840559 +0000 UTC m=+0.065849418 container cleanup c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS)
Oct 14 09:48:55 np0005486759.ooo.test podman[287333]: neutron_dhcp_agent
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: Stopped neutron_dhcp_agent container.
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: Starting neutron_dhcp_agent container...
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:48:55 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cd18aac39515c6f47e78dbb90133681b8abf3115daaca39ff2325194c26cdda/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 14 09:48:55 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cd18aac39515c6f47e78dbb90133681b8abf3115daaca39ff2325194c26cdda/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 09:48:55 np0005486759.ooo.test podman[287346]: 2025-10-14 09:48:55.527147713 +0000 UTC m=+0.089178876 container init c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251009)
Oct 14 09:48:55 np0005486759.ooo.test podman[287346]: 2025-10-14 09:48:55.533564422 +0000 UTC m=+0.095595564 container start c6914a84be08a887e892a1f1391c96aa01523f2ef15631e30d7b15367381c3d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bf62bbb0f49999b25cafec3c4cbc1584df0d0f0264f7836b9b04d016809e97ce'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0)
Oct 14 09:48:55 np0005486759.ooo.test podman[287346]: neutron_dhcp_agent
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + sudo -E kolla_set_configs
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: Started neutron_dhcp_agent container.
Oct 14 09:48:55 np0005486759.ooo.test sudo[287300]: pam_unix(sudo:session): session closed for user root
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Validating config file
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Copying service configuration files
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Writing out command to execute
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/333254bb87316156e96cebc0941f89c4b6bf7d0c72b62f2bd2e3f232ec27cb23
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/cd0de74397aa76b626744172300028943e2372ca220b3e27b1c7d2b66ff2832c
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: ++ cat /run_command
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + CMD=/usr/bin/neutron-dhcp-agent
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + ARGS=
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + sudo kolla_copy_cacerts
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + [[ ! -n '' ]]
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + . kolla_extend_start
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: Running command: '/usr/bin/neutron-dhcp-agent'
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + umask 0022
Oct 14 09:48:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: + exec /usr/bin/neutron-dhcp-agent
Oct 14 09:48:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:48:55.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:48:55 np0005486759.ooo.test sshd[280611]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: session-41.scope: Deactivated successfully.
Oct 14 09:48:55 np0005486759.ooo.test systemd[1]: session-41.scope: Consumed 33.658s CPU time.
Oct 14 09:48:55 np0005486759.ooo.test systemd-logind[759]: Session 41 logged out. Waiting for processes to exit.
Oct 14 09:48:55 np0005486759.ooo.test systemd-logind[759]: Removed session 41.
Oct 14 09:48:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2664 DF PROTO=TCP SPT=37036 DPT=9102 SEQ=872325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9633810000000001030307) 
Oct 14 09:48:56 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:48:56.819 287366 INFO neutron.common.config [-] Logging enabled!
Oct 14 09:48:56 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:48:56.819 287366 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Oct 14 09:48:57 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:48:57.200 287366 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Oct 14 09:48:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:48:57 np0005486759.ooo.test podman[287395]: 2025-10-14 09:48:57.446528252 +0000 UTC m=+0.075548347 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible)
Oct 14 09:48:57 np0005486759.ooo.test podman[287395]: 2025-10-14 09:48:57.513505138 +0000 UTC m=+0.142525193 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:48:57 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:48:57 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:48:57.972 287366 INFO neutron.agent.dhcp.agent [None req-cb854f98-0a83-407f-8140-c08000f3cace - - - - - -] All active networks have been fetched through RPC.
Oct 14 09:48:57 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:48:57.973 287366 INFO neutron.agent.dhcp.agent [None req-cb854f98-0a83-407f-8140-c08000f3cace - - - - - -] Synchronizing state complete
Oct 14 09:48:58 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:48:58.065 287366 INFO neutron.agent.dhcp.agent [None req-cb854f98-0a83-407f-8140-c08000f3cace - - - - - -] DHCP agent started
Oct 14 09:48:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:48:59 np0005486759.ooo.test systemd[1]: tmp-crun.YJRLmA.mount: Deactivated successfully.
Oct 14 09:48:59 np0005486759.ooo.test podman[287420]: 2025-10-14 09:48:59.455386113 +0000 UTC m=+0.084316405 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Oct 14 09:48:59 np0005486759.ooo.test podman[287420]: 2025-10-14 09:48:59.472467542 +0000 UTC m=+0.101397814 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-type=git)
Oct 14 09:48:59 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:49:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:00.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:49:05 np0005486759.ooo.test podman[287440]: 2025-10-14 09:49:05.429170867 +0000 UTC m=+0.060851393 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct 14 09:49:05 np0005486759.ooo.test podman[287440]: 2025-10-14 09:49:05.463401779 +0000 UTC m=+0.095082305 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true)
Oct 14 09:49:05 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:49:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:05.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:49:06 np0005486759.ooo.test systemd[1]: tmp-crun.6fCFhe.mount: Deactivated successfully.
Oct 14 09:49:06 np0005486759.ooo.test podman[287458]: 2025-10-14 09:49:06.423890062 +0000 UTC m=+0.058370405 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:49:06 np0005486759.ooo.test podman[287458]: 2025-10-14 09:49:06.430004915 +0000 UTC m=+0.064485268 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:49:06 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:49:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:49:10 np0005486759.ooo.test systemd[1]: tmp-crun.OpxSwb.mount: Deactivated successfully.
Oct 14 09:49:10 np0005486759.ooo.test podman[287477]: 2025-10-14 09:49:10.419691783 +0000 UTC m=+0.049027477 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:49:10 np0005486759.ooo.test podman[287477]: 2025-10-14 09:49:10.425436786 +0000 UTC m=+0.054772570 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:49:10 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:49:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:10.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:49:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:49:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:49:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:49:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:49:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15232 "" "Go-http-client/1.1"
Oct 14 09:49:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:49:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:49:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:49:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:49:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:49:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:49:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:49:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:15.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:15.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:15.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:49:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:15.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:49:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:15.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:15.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:49:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36697 DF PROTO=TCP SPT=57982 DPT=9102 SEQ=1295848467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F968D070000000001030307) 
Oct 14 09:49:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:49:20 np0005486759.ooo.test podman[287501]: 2025-10-14 09:49:20.45039431 +0000 UTC m=+0.074757470 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, config_id=edpm, org.label-schema.vendor=CentOS)
Oct 14 09:49:20 np0005486759.ooo.test podman[287501]: 2025-10-14 09:49:20.48630376 +0000 UTC m=+0.110666910 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 09:49:20 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:49:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36698 DF PROTO=TCP SPT=57982 DPT=9102 SEQ=1295848467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9691020000000001030307) 
Oct 14 09:49:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:20.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:49:22 np0005486759.ooo.test podman[287520]: 2025-10-14 09:49:22.415927169 +0000 UTC m=+0.051745087 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:49:22 np0005486759.ooo.test podman[287520]: 2025-10-14 09:49:22.426699104 +0000 UTC m=+0.062516962 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 14 09:49:22 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:49:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36699 DF PROTO=TCP SPT=57982 DPT=9102 SEQ=1295848467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9699010000000001030307) 
Oct 14 09:49:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:49:24 np0005486759.ooo.test podman[287539]: 2025-10-14 09:49:24.41907001 +0000 UTC m=+0.054146680 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:49:24 np0005486759.ooo.test podman[287539]: 2025-10-14 09:49:24.428419542 +0000 UTC m=+0.063496232 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:49:24 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:49:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:25.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36700 DF PROTO=TCP SPT=57982 DPT=9102 SEQ=1295848467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F96A8C10000000001030307) 
Oct 14 09:49:27 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:27.158 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:49:28 np0005486759.ooo.test podman[287562]: 2025-10-14 09:49:28.42859809 +0000 UTC m=+0.063694467 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 09:49:28 np0005486759.ooo.test podman[287562]: 2025-10-14 09:49:28.468393288 +0000 UTC m=+0.103489675 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:49:28 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:49:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:28.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:49:30 np0005486759.ooo.test podman[287587]: 2025-10-14 09:49:30.430775273 +0000 UTC m=+0.066564674 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, version=9.6, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Oct 14 09:49:30 np0005486759.ooo.test podman[287587]: 2025-10-14 09:49:30.440534417 +0000 UTC m=+0.076323818 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc.)
Oct 14 09:49:30 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:49:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:30.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:30.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:30.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:31.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:31.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:49:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:31.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:49:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:31.935 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:49:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:31.935 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:49:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:31.935 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:49:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:31.936 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:49:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:32.278 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:49:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:32.304 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:49:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:32.305 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:49:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:32.306 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:32.306 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:49:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:32.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:33.493 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.522 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.522 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.523 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.523 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.609 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.660 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.662 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.712 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.713 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.753 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.754 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:49:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:34.823 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.009 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.011 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12440MB free_disk=386.72095489501953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.011 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.011 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.117 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.117 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.118 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.173 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.195 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.198 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.198 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:49:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:35.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:49:36 np0005486759.ooo.test podman[287619]: 2025-10-14 09:49:36.451410571 +0000 UTC m=+0.078369149 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 09:49:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:49:36 np0005486759.ooo.test podman[287619]: 2025-10-14 09:49:36.501408575 +0000 UTC m=+0.128367113 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:49:36 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:49:36 np0005486759.ooo.test podman[287638]: 2025-10-14 09:49:36.552210314 +0000 UTC m=+0.070087310 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:49:36 np0005486759.ooo.test podman[287638]: 2025-10-14 09:49:36.566296268 +0000 UTC m=+0.084173264 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 09:49:36 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:49:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:40.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:49:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:40.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:40.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:49:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:40.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:49:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:40.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:49:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:40.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:49:41 np0005486759.ooo.test podman[287656]: 2025-10-14 09:49:41.429008828 +0000 UTC m=+0.063110399 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:49:41 np0005486759.ooo.test podman[287656]: 2025-10-14 09:49:41.438475104 +0000 UTC m=+0.072576625 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:49:41 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:49:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:49:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:49:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:49:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:49:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:49:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15232 "" "Go-http-client/1.1"
Oct 14 09:49:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:49:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:49:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:49:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:49:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:49:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:49:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:45.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58431 DF PROTO=TCP SPT=38642 DPT=9102 SEQ=2010497927 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9702370000000001030307) 
Oct 14 09:49:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58432 DF PROTO=TCP SPT=38642 DPT=9102 SEQ=2010497927 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9706410000000001030307) 
Oct 14 09:49:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:50.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:49:51 np0005486759.ooo.test podman[287679]: 2025-10-14 09:49:51.460742517 +0000 UTC m=+0.084005509 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:49:51 np0005486759.ooo.test podman[287679]: 2025-10-14 09:49:51.469603663 +0000 UTC m=+0.092866575 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 09:49:51 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:49:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58433 DF PROTO=TCP SPT=38642 DPT=9102 SEQ=2010497927 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F970E410000000001030307) 
Oct 14 09:49:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:49:53 np0005486759.ooo.test podman[287699]: 2025-10-14 09:49:53.447226687 +0000 UTC m=+0.074037908 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 09:49:53 np0005486759.ooo.test podman[287699]: 2025-10-14 09:49:53.461432074 +0000 UTC m=+0.088243345 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:49:53 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:49:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:49:54.153 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:49:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:49:54.154 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:49:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:49:54.155 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:49:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:49:55 np0005486759.ooo.test podman[287719]: 2025-10-14 09:49:55.456234955 +0000 UTC m=+0.077420380 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:49:55 np0005486759.ooo.test podman[287719]: 2025-10-14 09:49:55.466397801 +0000 UTC m=+0.087583186 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:49:55 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:49:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:55.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:49:55.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:49:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58434 DF PROTO=TCP SPT=38642 DPT=9102 SEQ=2010497927 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F971E010000000001030307) 
Oct 14 09:49:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:49:59 np0005486759.ooo.test podman[287740]: 2025-10-14 09:49:59.446690181 +0000 UTC m=+0.079277536 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 09:49:59 np0005486759.ooo.test podman[287740]: 2025-10-14 09:49:59.519414929 +0000 UTC m=+0.152002334 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 14 09:49:59 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:50:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:00.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:50:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:50:01 np0005486759.ooo.test podman[287764]: 2025-10-14 09:50:01.419978503 +0000 UTC m=+0.056747069 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 09:50:01 np0005486759.ooo.test podman[287764]: 2025-10-14 09:50:01.430310384 +0000 UTC m=+0.067078970 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 14 09:50:01 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:50:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:05.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:50:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:50:07 np0005486759.ooo.test podman[287786]: 2025-10-14 09:50:07.440067136 +0000 UTC m=+0.071808601 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 09:50:07 np0005486759.ooo.test podman[287786]: 2025-10-14 09:50:07.446426017 +0000 UTC m=+0.078167462 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20251009)
Oct 14 09:50:07 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:50:07 np0005486759.ooo.test podman[287787]: 2025-10-14 09:50:07.495397411 +0000 UTC m=+0.124868718 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:50:07 np0005486759.ooo.test podman[287787]: 2025-10-14 09:50:07.525295041 +0000 UTC m=+0.154766338 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 09:50:07 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:50:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:10.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:50:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:50:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:50:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:50:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:50:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15233 "" "Go-http-client/1.1"
Oct 14 09:50:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:50:12 np0005486759.ooo.test podman[287824]: 2025-10-14 09:50:12.435131842 +0000 UTC m=+0.066684028 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:50:12 np0005486759.ooo.test podman[287824]: 2025-10-14 09:50:12.469271679 +0000 UTC m=+0.100823855 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:50:12 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:50:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:50:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:50:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:50:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:50:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:50:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:50:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:50:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:15.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:50:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17893 DF PROTO=TCP SPT=42222 DPT=9102 SEQ=558765899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9777670000000001030307) 
Oct 14 09:50:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17894 DF PROTO=TCP SPT=42222 DPT=9102 SEQ=558765899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F977B820000000001030307) 
Oct 14 09:50:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:20.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:50:22 np0005486759.ooo.test podman[287848]: 2025-10-14 09:50:22.434388971 +0000 UTC m=+0.066288486 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:50:22 np0005486759.ooo.test podman[287848]: 2025-10-14 09:50:22.468288261 +0000 UTC m=+0.100187746 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 09:50:22 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:50:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17895 DF PROTO=TCP SPT=42222 DPT=9102 SEQ=558765899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9783810000000001030307) 
Oct 14 09:50:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:50:24 np0005486759.ooo.test systemd[1]: tmp-crun.6PyBkl.mount: Deactivated successfully.
Oct 14 09:50:24 np0005486759.ooo.test podman[287867]: 2025-10-14 09:50:24.434607624 +0000 UTC m=+0.063003997 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251009, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.446 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test podman[287867]: 2025-10-14 09:50:24.47237741 +0000 UTC m=+0.100773763 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.479 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.480 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2580075f-9e0f-4eb2-90e9-161b5e7a1e5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.447551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34c42590-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': 'd5d579e38183fa877eaa8591c9d3feed4b5c1b2e20d97f7d30c5cf7f2176c38a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.447551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34c43576-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': '489ff76a7cb66fd99b204a6264f56d869b74788c7df85554ef8a76fe368fcd62'}]}, 'timestamp': '2025-10-14 09:50:24.480351', '_unique_id': 'cbf4baffca7c4773b294bde2f903a4fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.481 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.482 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.500 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31129600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.500 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5804f46c-7f23-4fe4-813c-06e01b8bf6b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31129600, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.482466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34c74c8e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.677552979, 'message_signature': 'b243de7a5984a620c4eb464b6af600a7ca1a5c39594c606a3e0edba0f47b0d8c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 
'timestamp': '2025-10-14T09:50:24.482466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34c755bc-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.677552979, 'message_signature': '48eb3c2b9cbdc6be3e149ea40b417040ba3598a2e5601babb395c6e724510f3d'}]}, 'timestamp': '2025-10-14 09:50:24.500780', '_unique_id': '26b74b7d865e4252b17cffd6151db38c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '998e9506-4629-48b9-a213-cc528593b48a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.502123', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34c79252-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': '1388dcba5b953d161b2e4a5b980b220bcc2553660baa25dfb1d764f1f3c87742'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.502123', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34c79982-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': 'e9ca323a877e79749c20bfeeea4b88f2822d824fc5fc0327b3114fd9869f8411'}]}, 'timestamp': '2025-10-14 09:50:24.502505', '_unique_id': 'a8de0af9cea7455484231a839d63ee67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.502 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.503 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.506 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '376785cf-c716-46cb-9a36-7fc4e7658b66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.503580', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34c84fa8-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '25e839f1ce7fcc000c5493f4b1fd293a3f52c43d9bc87133417b41376b64326a'}]}, 'timestamp': '2025-10-14 09:50:24.507199', '_unique_id': 'd4e2714e3ee24a629345a74400dceaff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.507 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc2e885d-3551-4091-be63-5a6b9095ebb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.508296', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34c8837e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '4969ee005936399bc9e68a85cbf9c871697e5b47590ebbb2100858d3db640cd2'}]}, 'timestamp': '2025-10-14 09:50:24.508529', '_unique_id': '04cae09612c94236adef9287aebcf12a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.508 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.509 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb4de3f2-f41f-441f-8781-75dfb0d3a50f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.509499', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34c8b268-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '9ace759063c16971b4d20a6f65a5607bf2accbbbe39caab332bea7153c033683'}]}, 'timestamp': '2025-10-14 09:50:24.509708', '_unique_id': '359b61f9cd6a46cf89c7ef1cf9be5214'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 513177663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.510 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 75228955 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9407c096-1f56-4844-8989-38156bd02c8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513177663, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.510769', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34c8e3f0-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': 'ec7f88ebf891ee7998d7748fee42c614ccb547714df1205a56b5556bd6b2441d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75228955, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.510769', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34c8ebd4-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': '0cb3c2597ae03970cab70c4c44efa365205f288e7dcf7a22d076030c8250c7a1'}]}, 'timestamp': '2025-10-14 09:50:24.511166', '_unique_id': 'b0818f6d2e084c3aa2db668a06895c2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.512 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.512 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65c11dd1-cb8e-4638-989b-e22fbfbfca13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.512152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34c91a0a-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.677552979, 'message_signature': '80aa59e8102be7db88d361a1bb4d650a834b5d94ae0885129c87d7c4b29b26ee'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.512152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34c92112-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.677552979, 'message_signature': 'a246c9eed2960af4c0798121c0047a2e9d789784af8d9f1ba8fa3795f2c26d70'}]}, 'timestamp': '2025-10-14 09:50:24.512545', '_unique_id': 'bf7c68b2ed5843028bf73c6982702179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.513 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.514 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6618d054-7150-4589-be41-9c8cdfdf6ecc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.514088', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34c96960-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '1288862083c20c105c94e359410716cf61a58dd057b305c0460b096447597c06'}]}, 'timestamp': '2025-10-14 09:50:24.514481', '_unique_id': '0c4672246ad64e95b9e1648f594aac92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.515 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.516 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.536 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 52.17578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db20593b-638c-4da3-9cea-c015bb172793', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.17578125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:50:24.516356', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '34ccd780-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.731392269, 'message_signature': '470c609dac59b938d9240bfc52ef40f6efb7735249fe77812c5b71bacd1be898'}]}, 'timestamp': '2025-10-14 09:50:24.537029', '_unique_id': 'e614b6eae2b14b2abee38f7ee9e742f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.538 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed23351c-d443-4587-b16b-e0add0c485f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.538880', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34cd3130-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': 'e3087a02f5498cab79bdc6b31646c536580ba8c60820d061d2def264e864b10f'}]}, 'timestamp': '2025-10-14 09:50:24.539222', '_unique_id': 'bffd9f4dda4444a2bb41bf94eb481187'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.539 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.540 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.540 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd83d19d2-8acd-41ab-a96c-90f0cabaea0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.540692', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34cd76a4-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '3640f013dc6626fe1c21ce1775b56f588f2ce1cc6205e87482e016c148fe556d'}]}, 'timestamp': '2025-10-14 09:50:24.541081', '_unique_id': '376e07a581da48ebbef7b573b345e890'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.542 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bee1467-27be-435d-9acb-ca2d88eacdc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.542569', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34cdc096-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '3f8b895f658201caeb0e03e551a25ce0064e64e1071e561295a1ba962230e223'}]}, 'timestamp': '2025-10-14 09:50:24.542891', '_unique_id': 'aa52c517f5224a7abb26d64ebf054b87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.544 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.544 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1177101b-e117-4649-ac21-8430e1ca9bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.544424', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34ce089e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': 'b4240bf46b9cb8a0760b9351d34c7e57d79f915d89f835f428797b6961d37149'}]}, 'timestamp': '2025-10-14 09:50:24.544766', '_unique_id': '8b8558f5ae6845d19e23f5c2c49c1c71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.545 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.546 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.546 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.546 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c92e493b-f707-47aa-9726-453ddda9767e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.546266', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34ce5074-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': '95711868ff176c821441026766fe13ad0f7e8e3fa9a82a909df3800ce77da8d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.546266', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34ce5b6e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': '2684b6bd6d62709a9597c437cdda7019150514528c943e7f0fe212c9d131afee'}]}, 'timestamp': '2025-10-14 09:50:24.546837', '_unique_id': '5882d21b552d4216bff180ea9880b8d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.547 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.548 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.548 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.548 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 9773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2f2b987-de67-4a25-b182-fce1379e31f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9773, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.548602', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34ceabd2-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '02bc5ce512b8a78ebc469647df4c4ad8da5349d7c66283cb9840ed4411ca80a6'}]}, 'timestamp': '2025-10-14 09:50:24.548918', '_unique_id': '1aa2527f4b0b430790813d2324902ab9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.550 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.550 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 51190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e627457c-c1ad-4646-a135-35b2aa413427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51190000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:50:24.550554', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '34cef970-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.731392269, 'message_signature': 'a49a55dfa748dcd3a30ea423c29914bdf3aec4ef5c658bd70bb9d0265f57bf89'}]}, 'timestamp': '2025-10-14 09:50:24.550892', '_unique_id': 'f517a84d03db4ad1b80e3207a8a5cd45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.552 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.552 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.552 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4357cc39-3355-44e7-8bac-a41e21933cca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.552602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34cf4844-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.677552979, 'message_signature': 'ce88fd37ccd98f1af18e66c936d6baea4280a4af7b5aa10cb99541055265ce2b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.552602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34cf5686-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.677552979, 'message_signature': 'f53188ae1cbf0403d2540cdb5a841e9a7a04b1559896832b7413cd55e8c1ced3'}]}, 'timestamp': '2025-10-14 09:50:24.553325', '_unique_id': 'abeae986b58949d5adabfb98b3d0c676'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.555 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.555 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.555 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 1288814026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.555 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 10812347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bd997cd-81d8-4379-b295-627c1c1258d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1288814026, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.555214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34cfadfc-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': 'bcda6a24cde761c15af86fb1c0165e4adebd7fe96956bcc7f7d65b4eaecd79a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10812347, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.555214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34cfb8c4-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': '269873e2eef3dda9c3a2a980a655d373d145d6f6de0481b37eb08b0418a80010'}]}, 'timestamp': '2025-10-14 09:50:24.555780', '_unique_id': '805e724e09bf482f94710d53dbb37468'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.557 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.557 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '293301ff-12bb-45d7-b914-59a12b31c16a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:50:24.557365', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '34d001f8-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.698644203, 'message_signature': '5f98ff2fe2f39234c97f485f7b294f2dd79b609aedcabcda2a7023335c66e8c8'}]}, 'timestamp': '2025-10-14 09:50:24.557670', '_unique_id': '5f4da6bcbcb34ff4a4e67b99fc1482e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.559 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.559 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f070d32e-2bfe-43eb-a0ca-988c8b92430b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 591, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:50:24.559167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '34d0497e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': '6205f6a63e72d3c1d0322b1ece4a3ba968848b1f5ebdc13ef1e5338fb89c6fea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:50:24.559167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '34d05586-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11444.642636148, 'message_signature': 'ed8211be73d2e7c84accb9d6757f8f6491c37bbc4223b0f7900bf2155afc0f1d'}]}, 'timestamp': '2025-10-14 09:50:24.559791', '_unique_id': 'a30363e659dd46f3bc9333f1293eb9d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:50:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:50:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:50:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:25.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:25.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:50:26 np0005486759.ooo.test systemd[1]: tmp-crun.PVoEh8.mount: Deactivated successfully.
Oct 14 09:50:26 np0005486759.ooo.test podman[287887]: 2025-10-14 09:50:26.436518798 +0000 UTC m=+0.070849313 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:50:26 np0005486759.ooo.test podman[287887]: 2025-10-14 09:50:26.443340914 +0000 UTC m=+0.077671429 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:50:26 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:50:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17896 DF PROTO=TCP SPT=42222 DPT=9102 SEQ=558765899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9793410000000001030307) 
Oct 14 09:50:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:29.199 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:50:30 np0005486759.ooo.test systemd[1]: tmp-crun.Tk5Z3j.mount: Deactivated successfully.
Oct 14 09:50:30 np0005486759.ooo.test podman[287910]: 2025-10-14 09:50:30.451524344 +0000 UTC m=+0.080070879 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:50:30 np0005486759.ooo.test podman[287910]: 2025-10-14 09:50:30.488516408 +0000 UTC m=+0.117062903 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:50:30 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:50:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:30.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:31.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:50:32 np0005486759.ooo.test podman[287936]: 2025-10-14 09:50:32.447533822 +0000 UTC m=+0.076195933 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Oct 14 09:50:32 np0005486759.ooo.test podman[287936]: 2025-10-14 09:50:32.488432963 +0000 UTC m=+0.117095074 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 09:50:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:32.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:32 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:50:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:33.496 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:33.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:50:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:33.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:50:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:33.819 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:50:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:33.820 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:50:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:33.820 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:50:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:33.821 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:50:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:34.314 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:50:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:34.336 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:50:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:34.337 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:50:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:34.337 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:34.337 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:34.338 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:50:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:35.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.334 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.496 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.518 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.518 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.519 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.519 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.572 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.642 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.644 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.715 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.716 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.783 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.785 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:50:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:36.835 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.021 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.022 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12485MB free_disk=386.72095489501953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.023 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.023 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.114 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.115 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.115 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.166 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.184 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.187 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:50:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:37.187 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:50:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:50:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:50:38 np0005486759.ooo.test podman[287968]: 2025-10-14 09:50:38.449428127 +0000 UTC m=+0.076382079 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 09:50:38 np0005486759.ooo.test podman[287968]: 2025-10-14 09:50:38.460410237 +0000 UTC m=+0.087364189 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid)
Oct 14 09:50:38 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:50:38 np0005486759.ooo.test podman[287969]: 2025-10-14 09:50:38.494187813 +0000 UTC m=+0.118749643 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 09:50:38 np0005486759.ooo.test podman[287969]: 2025-10-14 09:50:38.502460092 +0000 UTC m=+0.127021952 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009)
Oct 14 09:50:38 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:50:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:40.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:40.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:50:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:50:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:50:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:50:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:50:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15233 "" "Go-http-client/1.1"
Oct 14 09:50:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:50:43 np0005486759.ooo.test podman[288005]: 2025-10-14 09:50:43.443727357 +0000 UTC m=+0.073255065 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:50:43 np0005486759.ooo.test podman[288005]: 2025-10-14 09:50:43.47638325 +0000 UTC m=+0.105910938 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:50:43 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:50:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:50:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:50:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:50:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:50:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:50:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:50:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:50:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:50:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:45.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59735 DF PROTO=TCP SPT=49270 DPT=9102 SEQ=276019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F97EC980000000001030307) 
Oct 14 09:50:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59736 DF PROTO=TCP SPT=49270 DPT=9102 SEQ=276019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F97F0810000000001030307) 
Oct 14 09:50:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:50.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:50:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59737 DF PROTO=TCP SPT=49270 DPT=9102 SEQ=276019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F97F8810000000001030307) 
Oct 14 09:50:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:50:53 np0005486759.ooo.test podman[288028]: 2025-10-14 09:50:53.490532318 +0000 UTC m=+0.118627810 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 09:50:53 np0005486759.ooo.test podman[288028]: 2025-10-14 09:50:53.522479569 +0000 UTC m=+0.150575051 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:50:53 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:50:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:50:54.154 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:50:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:50:54.155 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:50:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:50:54.156 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:50:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:50:55 np0005486759.ooo.test podman[288047]: 2025-10-14 09:50:55.443904253 +0000 UTC m=+0.076025458 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:50:55 np0005486759.ooo.test podman[288047]: 2025-10-14 09:50:55.456698697 +0000 UTC m=+0.088819952 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 09:50:55 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:50:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:55.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:50:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:55.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:50:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:55.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:50:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:55.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:50:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:55.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:50:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:50:55.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:50:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59738 DF PROTO=TCP SPT=49270 DPT=9102 SEQ=276019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9808410000000001030307) 
Oct 14 09:50:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:50:57 np0005486759.ooo.test podman[288067]: 2025-10-14 09:50:57.43892835 +0000 UTC m=+0.070289266 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:50:57 np0005486759.ooo.test podman[288067]: 2025-10-14 09:50:57.442654862 +0000 UTC m=+0.074015788 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:50:57 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:51:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:00.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:00.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:51:01 np0005486759.ooo.test podman[288090]: 2025-10-14 09:51:01.462663269 +0000 UTC m=+0.078720750 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller)
Oct 14 09:51:01 np0005486759.ooo.test podman[288090]: 2025-10-14 09:51:01.646344076 +0000 UTC m=+0.262401487 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=ovn_controller, container_name=ovn_controller)
Oct 14 09:51:01 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:51:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:51:03 np0005486759.ooo.test podman[288115]: 2025-10-14 09:51:03.42889989 +0000 UTC m=+0.059498151 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, version=9.6)
Oct 14 09:51:03 np0005486759.ooo.test podman[288115]: 2025-10-14 09:51:03.439583572 +0000 UTC m=+0.070181893 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 14 09:51:03 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:51:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:05.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:51:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:05.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:05.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:51:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:05.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:05.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:05.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:51:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:51:09 np0005486759.ooo.test systemd[1]: tmp-crun.hl61Ip.mount: Deactivated successfully.
Oct 14 09:51:09 np0005486759.ooo.test podman[288135]: 2025-10-14 09:51:09.453057207 +0000 UTC m=+0.084273527 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 09:51:09 np0005486759.ooo.test podman[288135]: 2025-10-14 09:51:09.463199012 +0000 UTC m=+0.094415302 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3)
Oct 14 09:51:09 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:51:09 np0005486759.ooo.test podman[288136]: 2025-10-14 09:51:09.431811597 +0000 UTC m=+0.060879093 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:51:09 np0005486759.ooo.test podman[288136]: 2025-10-14 09:51:09.515289099 +0000 UTC m=+0.144356565 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:51:09 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:51:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:10.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:10.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:51:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:51:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:51:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:51:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:51:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15232 "" "Go-http-client/1.1"
Oct 14 09:51:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:51:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:51:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:51:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:51:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:51:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:51:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:51:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:51:14 np0005486759.ooo.test systemd[1]: tmp-crun.5XVvB2.mount: Deactivated successfully.
Oct 14 09:51:14 np0005486759.ooo.test podman[288172]: 2025-10-14 09:51:14.428822449 +0000 UTC m=+0.060091090 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:51:14 np0005486759.ooo.test podman[288172]: 2025-10-14 09:51:14.461403308 +0000 UTC m=+0.092671919 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:51:14 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:51:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:15.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:51:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:15.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:15.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5025 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:51:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:15.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:15.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:15.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54579 DF PROTO=TCP SPT=58012 DPT=9102 SEQ=219612157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9861C80000000001030307) 
Oct 14 09:51:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54580 DF PROTO=TCP SPT=58012 DPT=9102 SEQ=219612157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9865C20000000001030307) 
Oct 14 09:51:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:20.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:51:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54581 DF PROTO=TCP SPT=58012 DPT=9102 SEQ=219612157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F986DC10000000001030307) 
Oct 14 09:51:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:51:24 np0005486759.ooo.test podman[288195]: 2025-10-14 09:51:24.448506729 +0000 UTC m=+0.071077901 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct 14 09:51:24 np0005486759.ooo.test podman[288195]: 2025-10-14 09:51:24.458270719 +0000 UTC m=+0.080841901 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct 14 09:51:24 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:51:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:25.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:51:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:51:26 np0005486759.ooo.test systemd[1]: tmp-crun.EN6jRF.mount: Deactivated successfully.
Oct 14 09:51:26 np0005486759.ooo.test podman[288214]: 2025-10-14 09:51:26.431055512 +0000 UTC m=+0.062122254 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd)
Oct 14 09:51:26 np0005486759.ooo.test podman[288214]: 2025-10-14 09:51:26.441513332 +0000 UTC m=+0.072580124 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:51:26 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:51:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54582 DF PROTO=TCP SPT=58012 DPT=9102 SEQ=219612157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F987D810000000001030307) 
Oct 14 09:51:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:28.183 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:51:28 np0005486759.ooo.test podman[288234]: 2025-10-14 09:51:28.427667029 +0000 UTC m=+0.053601950 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:51:28 np0005486759.ooo.test podman[288234]: 2025-10-14 09:51:28.432262481 +0000 UTC m=+0.058197432 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:51:28 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:51:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:28.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:29.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.514 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.515 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:30.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:51:32 np0005486759.ooo.test podman[288259]: 2025-10-14 09:51:32.437254041 +0000 UTC m=+0.070180275 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Oct 14 09:51:32 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:32.531 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:32 np0005486759.ooo.test podman[288259]: 2025-10-14 09:51:32.539386182 +0000 UTC m=+0.172312426 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:51:32 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:51:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:33.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:33.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:33.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:51:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:51:34 np0005486759.ooo.test podman[288284]: 2025-10-14 09:51:34.403086905 +0000 UTC m=+0.072233125 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct 14 09:51:34 np0005486759.ooo.test podman[288284]: 2025-10-14 09:51:34.413232056 +0000 UTC m=+0.082378306 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Oct 14 09:51:34 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:51:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:34.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:35.493 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:35.496 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:35.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:51:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:35.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:51:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:35.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.046 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.047 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.048 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.048 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.556 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.594 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.595 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.596 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.596 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.596 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 14 09:51:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:36.635 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 14 09:51:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:38.537 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:51:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:51:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:51:40 np0005486759.ooo.test podman[288304]: 2025-10-14 09:51:40.414761991 +0000 UTC m=+0.050073988 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 09:51:40 np0005486759.ooo.test podman[288305]: 2025-10-14 09:51:40.479254962 +0000 UTC m=+0.114099696 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 14 09:51:40 np0005486759.ooo.test podman[288304]: 2025-10-14 09:51:40.50290661 +0000 UTC m=+0.138218617 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 09:51:40 np0005486759.ooo.test podman[288305]: 2025-10-14 09:51:40.512353492 +0000 UTC m=+0.147198146 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:51:40 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:51:40 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:51:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:40.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:51:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:40.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:40.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:51:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:40.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:40.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:51:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:40.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:51:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:51:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:51:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:51:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:51:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15238 "" "Go-http-client/1.1"
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.578 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.579 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.579 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.579 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.668 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.744 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.746 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.805 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.807 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.861 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.862 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:51:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:43.909 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:51:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:51:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:51:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:51:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:51:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:51:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:51:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:51:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.101 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.102 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12485MB free_disk=386.72095489501953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.102 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.103 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.627 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.628 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.628 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:51:44 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:44.951 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Refreshing inventories for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.274 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Updating ProviderTree inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.275 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.297 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Refreshing aggregate associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.330 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Refreshing trait associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, traits: HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 14 09:51:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.384 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.410 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.412 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.412 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:51:45 np0005486759.ooo.test podman[288352]: 2025-10-14 09:51:45.439336303 +0000 UTC m=+0.072928034 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:51:45 np0005486759.ooo.test podman[288352]: 2025-10-14 09:51:45.445470709 +0000 UTC m=+0.079062420 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:51:45 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:45.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35659 DF PROTO=TCP SPT=53594 DPT=9102 SEQ=1169737255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F98D6F80000000001030307) 
Oct 14 09:51:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35660 DF PROTO=TCP SPT=53594 DPT=9102 SEQ=1169737255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F98DB020000000001030307) 
Oct 14 09:51:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:50.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35661 DF PROTO=TCP SPT=53594 DPT=9102 SEQ=1169737255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F98E3020000000001030307) 
Oct 14 09:51:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:51:54.156 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:51:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:51:54.157 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:51:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:51:54.157 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:51:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:51:55 np0005486759.ooo.test podman[288375]: 2025-10-14 09:51:55.432086775 +0000 UTC m=+0.067016574 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:51:55 np0005486759.ooo.test podman[288375]: 2025-10-14 09:51:55.466911284 +0000 UTC m=+0.101841093 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Oct 14 09:51:55 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:51:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:51:55.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:51:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35662 DF PROTO=TCP SPT=53594 DPT=9102 SEQ=1169737255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F98F2C10000000001030307) 
Oct 14 09:51:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:51:57 np0005486759.ooo.test podman[288394]: 2025-10-14 09:51:57.445125872 +0000 UTC m=+0.077569487 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 14 09:51:57 np0005486759.ooo.test podman[288394]: 2025-10-14 09:51:57.456314973 +0000 UTC m=+0.088758598 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Oct 14 09:51:57 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:51:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:51:59 np0005486759.ooo.test systemd[1]: tmp-crun.F6jSnY.mount: Deactivated successfully.
Oct 14 09:51:59 np0005486759.ooo.test podman[288413]: 2025-10-14 09:51:59.453077745 +0000 UTC m=+0.085518646 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:51:59 np0005486759.ooo.test podman[288413]: 2025-10-14 09:51:59.458365027 +0000 UTC m=+0.090805958 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:51:59 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:52:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:00.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:52:03 np0005486759.ooo.test podman[288436]: 2025-10-14 09:52:03.443041154 +0000 UTC m=+0.074233421 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 09:52:03 np0005486759.ooo.test podman[288436]: 2025-10-14 09:52:03.479529931 +0000 UTC m=+0.110722148 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 09:52:03 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:52:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:52:05 np0005486759.ooo.test podman[288461]: 2025-10-14 09:52:05.453890368 +0000 UTC m=+0.082067895 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 09:52:05 np0005486759.ooo.test podman[288461]: 2025-10-14 09:52:05.470369981 +0000 UTC m=+0.098547558 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 14 09:52:05 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:52:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:05.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:09 np0005486759.ooo.test sshd[288481]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:52:09 np0005486759.ooo.test sshd[288481]: Accepted publickey for zuul from 192.168.122.30 port 43206 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:52:09 np0005486759.ooo.test systemd-logind[759]: New session 42 of user zuul.
Oct 14 09:52:09 np0005486759.ooo.test systemd[1]: Started Session 42 of User zuul.
Oct 14 09:52:09 np0005486759.ooo.test sshd[288481]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:52:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:10.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:52:11 np0005486759.ooo.test python3.9[288592]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:52:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:52:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:52:11 np0005486759.ooo.test podman[288598]: 2025-10-14 09:52:11.482757648 +0000 UTC m=+0.069911837 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:52:11 np0005486759.ooo.test podman[288597]: 2025-10-14 09:52:11.541070452 +0000 UTC m=+0.127331826 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid)
Oct 14 09:52:11 np0005486759.ooo.test podman[288597]: 2025-10-14 09:52:11.550163963 +0000 UTC m=+0.136425337 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_managed=true, org.label-schema.license=GPLv2, config_id=iscsid)
Oct 14 09:52:11 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:52:11 np0005486759.ooo.test podman[288598]: 2025-10-14 09:52:11.568556261 +0000 UTC m=+0.155710470 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 09:52:11 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:52:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:52:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:52:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:52:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:52:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:52:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15242 "" "Go-http-client/1.1"
Oct 14 09:52:12 np0005486759.ooo.test sudo[288741]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myvklcrseozshajsvszmeymkyehtdfzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435532.0525846-34-219491706464793/AnsiballZ_file.py
Oct 14 09:52:12 np0005486759.ooo.test sudo[288741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:12 np0005486759.ooo.test python3.9[288743]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:12 np0005486759.ooo.test sudo[288741]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:13 np0005486759.ooo.test sudo[288851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzatbdndgbcqmiouwaxjklziwqomvmxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435533.2357712-34-210452037720505/AnsiballZ_file.py
Oct 14 09:52:13 np0005486759.ooo.test sudo[288851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:13 np0005486759.ooo.test python3.9[288853]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:13 np0005486759.ooo.test sudo[288851]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:52:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:52:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:52:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:52:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:52:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:52:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:52:14 np0005486759.ooo.test sudo[288961]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkaqhkhfikzurnswwpvxyutjitkkvrek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435533.9508526-34-86997495072637/AnsiballZ_file.py
Oct 14 09:52:14 np0005486759.ooo.test sudo[288961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:14 np0005486759.ooo.test python3.9[288963]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:14 np0005486759.ooo.test sudo[288961]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:14 np0005486759.ooo.test sudo[289071]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdktkhpesdnkiicnpgpxtizyfzbfqkuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435534.6019242-34-267722934198678/AnsiballZ_file.py
Oct 14 09:52:14 np0005486759.ooo.test sudo[289071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:15 np0005486759.ooo.test python3.9[289073]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 09:52:15 np0005486759.ooo.test sudo[289071]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:15 np0005486759.ooo.test sudo[289181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-saqhzzkioocwbiucceqwgeehpcfhhhju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435535.2631347-34-127225803953727/AnsiballZ_file.py
Oct 14 09:52:15 np0005486759.ooo.test sudo[289181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:52:15 np0005486759.ooo.test podman[289184]: 2025-10-14 09:52:15.651841918 +0000 UTC m=+0.074344924 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:52:15 np0005486759.ooo.test podman[289184]: 2025-10-14 09:52:15.659400146 +0000 UTC m=+0.081903192 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:52:15 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:52:15 np0005486759.ooo.test python3.9[289183]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:15 np0005486759.ooo.test sudo[289181]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:15.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:52:16 np0005486759.ooo.test sudo[289314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbcuqcsgpcrjtjlkvdsnboqjhouwocqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435535.9388976-70-113682679930861/AnsiballZ_stat.py
Oct 14 09:52:16 np0005486759.ooo.test sudo[289314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:16 np0005486759.ooo.test python3.9[289316]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:52:16 np0005486759.ooo.test sudo[289314]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:17 np0005486759.ooo.test sudo[289426]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovkputwfbyyukjhsagnrejabjeebestk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435536.6861715-78-195421178641043/AnsiballZ_systemd.py
Oct 14 09:52:17 np0005486759.ooo.test sudo[289426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:17 np0005486759.ooo.test python3.9[289428]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:52:17 np0005486759.ooo.test sudo[289426]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:18 np0005486759.ooo.test sudo[289538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeavaoyivlggzfazhaxrmrliadqhpikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435537.8903823-86-228819207289847/AnsiballZ_service_facts.py
Oct 14 09:52:18 np0005486759.ooo.test sudo[289538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:18 np0005486759.ooo.test python3.9[289540]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:52:18 np0005486759.ooo.test network[289557]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:52:18 np0005486759.ooo.test network[289558]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:52:18 np0005486759.ooo.test network[289559]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:52:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5822 DF PROTO=TCP SPT=45882 DPT=9102 SEQ=3377405946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F994C280000000001030307) 
Oct 14 09:52:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5823 DF PROTO=TCP SPT=45882 DPT=9102 SEQ=3377405946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9950410000000001030307) 
Oct 14 09:52:20 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:20.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:52:20 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:52:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5824 DF PROTO=TCP SPT=45882 DPT=9102 SEQ=3377405946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9958410000000001030307) 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.448 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.455 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54bd20ec-84a3-40cf-b390-b5811b008c69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.450066', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c4712d8-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': '0c2fc51d6145a5338c4dc01ddf7b9881f617691930fe1748a81eb73d961aaac4'}]}, 'timestamp': '2025-10-14 09:52:24.456382', '_unique_id': 'fcbe4186516f42179ebcaaa22542d8ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.458 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.459 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40d45d0b-8ec0-4609-b268-0b46f7bbc08c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.459766', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c47b42c-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': 'f3d661314c772bef43b870046d42b4044cfdc208f43ce79397dfe5c9905a3696'}]}, 'timestamp': '2025-10-14 09:52:24.460554', '_unique_id': '933ddc91a6f7496cbc1d595e2dca73d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.461 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.464 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.485 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.485 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61cee429-50c5-475d-8043-f3d4709eee14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.464721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c4b90f6-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.659848566, 'message_signature': 'd7497d222197e6c30b770528982859c52070bbb6720a5cdd9eb6cd4a5361b9ff'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.464721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c4ba87a-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.659848566, 'message_signature': '48704de91add9b3ac832ea8e57e4e82cae097ca6d46e93a83599e59f64758b42'}]}, 'timestamp': '2025-10-14 09:52:24.486368', '_unique_id': '56250cecab3c4f649dce9e0e30bf6e4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.487 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.488 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.489 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41ab8e82-b29e-4487-8677-1a0f2d2e8aec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.489126', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c4c28ae-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': '2ec3c271dfb8d060db72c5686c834f7c94fa062f9ae033a4a39ebdb48458009e'}]}, 'timestamp': '2025-10-14 09:52:24.489630', '_unique_id': '6231c34512554df2be09dfa50f0681aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.490 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.492 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09c6238d-2559-4f8a-8aea-8c0916ebbe3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.491993', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c4c9794-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': '6fc56fc272795b98d528cd6e7424a55b5400c9272bd90a34103f5fef2421bdfd'}]}, 'timestamp': '2025-10-14 09:52:24.492547', '_unique_id': '214f17c841b1441fa3b7fb72d92ef1f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.494 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.494 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31129600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.495 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff297289-9b5e-4b19-8a14-5b3dd2e613a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31129600, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.494820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c4d0b66-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.659848566, 'message_signature': '3190cbb66cf723362889d70ed6662a158649bd560f52538eb1358c22ac07ad55'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.494820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c4d1ce6-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.659848566, 'message_signature': '7c06ccd9b60ff1a8da4de00a274ebc1d26f773a6e8ed44279d65c9ba2c3a4ec5'}]}, 'timestamp': '2025-10-14 09:52:24.495847', '_unique_id': '3c4c1dd94d9d403fb7c648d7ae01e179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.496 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.498 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.521 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 52290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0730a955-de4a-4ba7-9b5a-4f4a49d72526', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52290000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:52:24.498196', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7c512aac-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.716684528, 'message_signature': 'a0d0f7d7add1705c34253c626dcfe090e3976bf8a53c5f3ca8d6ce90c39153cf'}]}, 'timestamp': '2025-10-14 09:52:24.522564', '_unique_id': 'b577a28586c843f0b70e4786a3c9d4f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.524 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.525 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.525 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2cc1be1-c37b-4743-98e5-9503084b87ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.525587', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c51b90e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': '68d0d1395c52bc6da42d70d0813c5b2d7e376144be88c68d39850d49d872230b'}]}, 'timestamp': '2025-10-14 09:52:24.526127', '_unique_id': '5dd0a6f9472e4a1b8d285ee78d7a87ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.528 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.528 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 52.17578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6ade9fe-8c7b-44d8-8e18-0a311462bf86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.17578125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:52:24.528285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7c5220ec-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.716684528, 'message_signature': '9b4a00fbfcda9d34787b290a14718a56d1ae8054e2ce675dec7952a5453e09c4'}]}, 'timestamp': '2025-10-14 09:52:24.528728', '_unique_id': '92b7079bb0ee44d1b87b2ab48b510e74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.530 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.531 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 9773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9648e80-fdbe-48fd-8b51-399895fb5353', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9773, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.531076', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c528e10-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': 'e9f47260b4124ea5a6175d208568f77dfcb2889524e3cd90c73e759b7219f984'}]}, 'timestamp': '2025-10-14 09:52:24.531536', '_unique_id': 'd2dfd89decc94ddb8658639392db48d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.568 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.568 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d396727-77cb-49c5-9c65-052fb397e55b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 591, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.533658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c5834be-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': 'a2ab3ec6923ede2dae421105f007d9ce395da4d6c3742dfab30f02f27f765b74'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.533658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c584562-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': 'a83a82212946795e37c4ada917c89a2908fc7738c1f70f43019c7423710ce38f'}]}, 'timestamp': '2025-10-14 09:52:24.569001', '_unique_id': 'b76fec3d80f945ba8e982c6c208315a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.570 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.571 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.571 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 513177663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.571 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 75228955 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7637331-f222-46c5-9469-367e827dcff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513177663, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.571235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c58ae94-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': '24b63f4bbd33ec5dc46bdbe0fb55cc158a66381529b3effdacbd1ec5f7484ad8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75228955, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.571235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c58bdee-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': '9380217ad36aca733d05a8899ca490dcccdd02f52197df2141bba89cc1ed25d8'}]}, 'timestamp': '2025-10-14 09:52:24.572087', '_unique_id': 'ec76f0dc5f954a3485a7382ff23daf36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.573 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.574 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.574 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e200a88-bf78-4665-a2fb-93ed65e5f4af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.574235', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c5923c4-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': 'ce54ef0b696158d5527e3c2fc092559cfa4e5e822a64cfba868a73e8194a5634'}]}, 'timestamp': '2025-10-14 09:52:24.574686', '_unique_id': 'bf5122d97ec64b339e31dc0b34c4a287'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.575 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.576 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.577 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.577 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '812d24f8-80ef-4cfc-9e95-516408090731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.577076', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c59932c-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': '15bf0ca949fd53f3128131afdd34bed6d590d5411abe7014d68e487df373110a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.577076', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c59a43e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': '0ae6b82fcb9d25aae36fc3e0b1f7aef6e34ccc9be6ae5ad780063471ed25f9bc'}]}, 'timestamp': '2025-10-14 09:52:24.577979', '_unique_id': '1b87274c4fdf4870989ee8916347a9dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.578 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.580 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '475aa7ff-6850-4950-b2e8-abffa53bf8e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.580185', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c5a0c30-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': 'f439ed8842a6d39a7299df489008ad2d4279f6e4bcfbc63e6bc96a6692a3e34b'}]}, 'timestamp': '2025-10-14 09:52:24.580637', '_unique_id': 'd34aea50ce834f0e9cec7af49636a9a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.581 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.582 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.582 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.583 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fe456a1-dc99-4f12-96e5-a41779f92427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.582739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c5a709e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': '3eeeb53cf09d46ca7404edde333042b8c1130e02a07123547581831db0ec18ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.582739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c5a80ac-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': '33e9d059353e97e1a54ac3deb8aaac5d2ef8d8c976002bf7024f450e2416ae9c'}]}, 'timestamp': '2025-10-14 09:52:24.583589', '_unique_id': '6a5971fa3a2d4d08906cab8d636e532c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.584 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.585 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7926e48e-dfc1-40a4-af7c-948973518730', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.585715', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c5ae420-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': '857a301730265b57c7ff610966020adf25fef97849e8ada596a8fd041310e51b'}]}, 'timestamp': '2025-10-14 09:52:24.586194', '_unique_id': 'b150c990118e4c47990ba582bd62105c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.587 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.588 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bf34beb-ea9d-4f4e-b8a1-129f70edd4e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:52:24.588262', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7c5b4794-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.645182796, 'message_signature': '51a3c7fd048e06c8d7d947ff29813ac60e742c7fab4e8f071168561e4a778125'}]}, 'timestamp': '2025-10-14 09:52:24.588708', '_unique_id': '3b5d3e8ea8794493b74afb9ad581a8c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.590 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.591 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 1288814026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.591 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 10812347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0eff20b-638e-4b66-be66-36c9efbacda6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1288814026, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.591107', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c5bb6b6-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': 'def504a582c77595a43e8e6a9a2ea8cf59b4400d4c2bf41657667a8ad6e25b1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10812347, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.591107', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c5bc6ba-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': '910ad1b7052e4dedf032284b190e40137ca3beb935565e3ce987bfccc8def47a'}]}, 'timestamp': '2025-10-14 09:52:24.591942', '_unique_id': '061566a7ffaa4495a43a8b3066446c34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.592 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.594 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.594 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.594 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ba74b19-500f-4880-906a-46cc1ec13a8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.594244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c5c315e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.659848566, 'message_signature': '3781384479c11dcf1dd4a4767937ec65d1f7d77c19da704dbe231e43e7ce82e8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.594244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c5c4130-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.659848566, 'message_signature': 'f4a32d9b00d260b65748aea797f782dbcbffcc10ceed314f68c2bf3bffd60289'}]}, 'timestamp': '2025-10-14 09:52:24.595101', '_unique_id': '19ab2aaf2eb64cf78e8f4f5b6bbe8616'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.596 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.597 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.597 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.597 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67a4d835-7ea3-40bf-9ebc-dc57a3dc8f41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:52:24.597190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c5ca40e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': 'ffc5b394f43e5c3f8e0ed10e716847956717dbf3c43b06f3ec2ceb602d50c006'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:52:24.597190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c5cb52a-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11564.728751644, 'message_signature': 'f166948caa4c6de28b0686fd12112d50761b7ffd1fac2276e125fae9bc9bc656'}]}, 'timestamp': '2025-10-14 09:52:24.598069', '_unique_id': '281b3e0d72774fbebeecc3128e05b5b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:52:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:52:24.598 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:52:24 np0005486759.ooo.test sudo[289538]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:25.450 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:25.474 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Triggering sync for uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 14 09:52:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:25.475 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:52:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:25.476 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:52:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:25.527 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:52:25 np0005486759.ooo.test python3.9[289791]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:52:25 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:25.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:52:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:52:26 np0005486759.ooo.test python3.9[289901]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:52:26 np0005486759.ooo.test podman[289902]: 2025-10-14 09:52:26.464006184 +0000 UTC m=+0.086529494 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Oct 14 09:52:26 np0005486759.ooo.test podman[289902]: 2025-10-14 09:52:26.471097927 +0000 UTC m=+0.093621247 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:52:26 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:52:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5825 DF PROTO=TCP SPT=45882 DPT=9102 SEQ=3377405946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9968020000000001030307) 
Oct 14 09:52:27 np0005486759.ooo.test sudo[290031]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqkrneyscoddyvmodhncncxybofqryxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435546.6304767-116-216089983115389/AnsiballZ_lineinfile.py
Oct 14 09:52:27 np0005486759.ooo.test sudo[290031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:27 np0005486759.ooo.test python3.9[290033]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:27 np0005486759.ooo.test sudo[290031]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:27 np0005486759.ooo.test sudo[290141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kctetlttugbyyrneqpxkwnekfwxqmwip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435547.589529-125-19855161622875/AnsiballZ_file.py
Oct 14 09:52:27 np0005486759.ooo.test sudo[290141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:52:27 np0005486759.ooo.test podman[290144]: 2025-10-14 09:52:27.978370812 +0000 UTC m=+0.087037870 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251009, config_id=multipathd, tcib_managed=true)
Oct 14 09:52:27 np0005486759.ooo.test podman[290144]: 2025-10-14 09:52:27.99191699 +0000 UTC m=+0.100584068 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:52:28 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:52:28 np0005486759.ooo.test python3.9[290143]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:28 np0005486759.ooo.test sudo[290141]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:28 np0005486759.ooo.test sudo[290270]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puyvcpwplbuiemjsvklfiptaaixghuwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435548.2638-133-168819457752169/AnsiballZ_stat.py
Oct 14 09:52:28 np0005486759.ooo.test sudo[290270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:28 np0005486759.ooo.test python3.9[290272]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:28 np0005486759.ooo.test sudo[290270]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:28 np0005486759.ooo.test sudo[290327]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tamyyunjntvrqgildchkqmwcxofnznsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435548.2638-133-168819457752169/AnsiballZ_file.py
Oct 14 09:52:28 np0005486759.ooo.test sudo[290327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:29 np0005486759.ooo.test python3.9[290329]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:29 np0005486759.ooo.test sudo[290327]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:29 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:29.524 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:29 np0005486759.ooo.test sudo[290437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cypmqwnbyihgqpnszbgtxvpppjitqqxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435549.3102436-133-194637427215504/AnsiballZ_stat.py
Oct 14 09:52:29 np0005486759.ooo.test sudo[290437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:52:29 np0005486759.ooo.test podman[290439]: 2025-10-14 09:52:29.646375749 +0000 UTC m=+0.064991166 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:52:29 np0005486759.ooo.test podman[290439]: 2025-10-14 09:52:29.655170711 +0000 UTC m=+0.073786118 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:52:29 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:52:29 np0005486759.ooo.test python3.9[290440]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:29 np0005486759.ooo.test sudo[290437]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:29 np0005486759.ooo.test sudo[290517]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdisxwumvarzigvyfemtkpehtaipihhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435549.3102436-133-194637427215504/AnsiballZ_file.py
Oct 14 09:52:29 np0005486759.ooo.test sudo[290517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:30 np0005486759.ooo.test python3.9[290519]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:30 np0005486759.ooo.test sudo[290517]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:30 np0005486759.ooo.test sudo[290627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mksirlivbybeoupjamafkukvonikfakb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435550.3799841-156-69194274119388/AnsiballZ_file.py
Oct 14 09:52:30 np0005486759.ooo.test sudo[290627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:30 np0005486759.ooo.test python3.9[290629]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:30 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:30.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:30 np0005486759.ooo.test sudo[290627]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:31 np0005486759.ooo.test auditd[725]: Audit daemon rotating log files
Oct 14 09:52:32 np0005486759.ooo.test sudo[290737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wplkiszdysqqkehleqsooiwcdllkrbty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435551.0467093-164-134224533882317/AnsiballZ_stat.py
Oct 14 09:52:32 np0005486759.ooo.test sudo[290737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:32 np0005486759.ooo.test python3.9[290739]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:32 np0005486759.ooo.test sudo[290737]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:32 np0005486759.ooo.test sudo[290794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohymqmbosckyehxdrutnqezzcvioineg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435551.0467093-164-134224533882317/AnsiballZ_file.py
Oct 14 09:52:32 np0005486759.ooo.test sudo[290794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:32 np0005486759.ooo.test python3.9[290796]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:32 np0005486759.ooo.test sudo[290794]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:33 np0005486759.ooo.test sudo[290904]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjpdlhlvtickrtnnlwtnkhponjvqvydu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435553.0893042-176-231716683389717/AnsiballZ_stat.py
Oct 14 09:52:33 np0005486759.ooo.test sudo[290904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:33.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:33.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:52:33 np0005486759.ooo.test python3.9[290906]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:33 np0005486759.ooo.test sudo[290904]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:52:34 np0005486759.ooo.test sudo[290973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfaearebksonygqbltxnqbdxzvmonkxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435553.0893042-176-231716683389717/AnsiballZ_file.py
Oct 14 09:52:34 np0005486759.ooo.test sudo[290973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:34 np0005486759.ooo.test podman[290942]: 2025-10-14 09:52:34.429415127 +0000 UTC m=+0.091571488 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:52:34 np0005486759.ooo.test podman[290942]: 2025-10-14 09:52:34.465458422 +0000 UTC m=+0.127614783 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:52:34 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:52:34 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:34.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:34 np0005486759.ooo.test python3.9[290976]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:34 np0005486759.ooo.test sudo[290973]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:35 np0005486759.ooo.test sudo[291096]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhuwgzoejwwmsmsqerxhvzyylejgylus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435554.7833967-188-59656095185774/AnsiballZ_systemd.py
Oct 14 09:52:35 np0005486759.ooo.test sudo[291096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:35 np0005486759.ooo.test python3.9[291098]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:52:35 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.493 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:35 np0005486759.ooo.test systemd-rc-local-generator[291121]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:52:35 np0005486759.ooo.test systemd-sysv-generator[291126]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:52:35 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:52:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:52:35 np0005486759.ooo.test systemd[1]: tmp-crun.pVUP9H.mount: Deactivated successfully.
Oct 14 09:52:35 np0005486759.ooo.test podman[291136]: 2025-10-14 09:52:35.821763333 +0000 UTC m=+0.070451023 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Oct 14 09:52:35 np0005486759.ooo.test podman[291136]: 2025-10-14 09:52:35.835551068 +0000 UTC m=+0.084238748 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6)
Oct 14 09:52:35 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:35.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:36.499 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:36 np0005486759.ooo.test sudo[291096]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:37 np0005486759.ooo.test sudo[291265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kueazgoftywgsaulrgnfgxsftbbwcqyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435557.0256276-196-120394152868253/AnsiballZ_stat.py
Oct 14 09:52:37 np0005486759.ooo.test sudo[291265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:37.499 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:37.501 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:52:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:37.501 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:52:37 np0005486759.ooo.test python3.9[291267]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:37 np0005486759.ooo.test sudo[291265]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:37 np0005486759.ooo.test sudo[291322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejngpsgogflgwgvlgpsvpgswnqydkxos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435557.0256276-196-120394152868253/AnsiballZ_file.py
Oct 14 09:52:37 np0005486759.ooo.test sudo[291322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:37 np0005486759.ooo.test python3.9[291324]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:38 np0005486759.ooo.test sudo[291322]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.050 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.050 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.051 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.051 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:52:38 np0005486759.ooo.test sudo[291432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aajqgvbzurthlaxychcbcgfodlqlngyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435558.1737041-208-255806235216757/AnsiballZ_stat.py
Oct 14 09:52:38 np0005486759.ooo.test sudo[291432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.532 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.563 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.564 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:52:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:38.564 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:38 np0005486759.ooo.test python3.9[291434]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:38 np0005486759.ooo.test sudo[291432]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:38 np0005486759.ooo.test sudo[291489]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enzddjxpggqipowoxylyzzwujzibjuot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435558.1737041-208-255806235216757/AnsiballZ_file.py
Oct 14 09:52:38 np0005486759.ooo.test sudo[291489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:39 np0005486759.ooo.test python3.9[291491]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:39 np0005486759.ooo.test sudo[291489]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:39 np0005486759.ooo.test sudo[291599]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otqyalqchpzjnryifmokwtqmthlenkew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435559.3109071-220-268453420531970/AnsiballZ_systemd.py
Oct 14 09:52:39 np0005486759.ooo.test sudo[291599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:39 np0005486759.ooo.test python3.9[291601]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:52:39 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:52:40 np0005486759.ooo.test systemd-rc-local-generator[291624]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:52:40 np0005486759.ooo.test systemd-sysv-generator[291628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:52:40 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:52:40 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:52:40 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:52:40 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:52:40 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:52:40 np0005486759.ooo.test sudo[291599]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.529 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.529 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.530 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.530 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.604 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.679 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.680 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.741 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.743 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.810 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.811 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.865 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:52:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:40.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:40 np0005486759.ooo.test sudo[291764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tderyxzklvmfdgerfirofudlqldajaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435560.7045696-230-270359557635762/AnsiballZ_file.py
Oct 14 09:52:40 np0005486759.ooo.test sudo[291764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.084 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.086 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12475MB free_disk=386.71887588500977GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.087 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.087 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:52:41 np0005486759.ooo.test python3.9[291766]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.168 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.169 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.169 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:52:41 np0005486759.ooo.test sudo[291764]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.227 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.241 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.243 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:52:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:41.244 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:52:41 np0005486759.ooo.test sudo[291874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkhrzttnokdpnwislxofclxpeobivapj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435561.3597388-238-32764216718998/AnsiballZ_stat.py
Oct 14 09:52:41 np0005486759.ooo.test sudo[291874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:52:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:52:41 np0005486759.ooo.test podman[291878]: 2025-10-14 09:52:41.813647933 +0000 UTC m=+0.078935216 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 09:52:41 np0005486759.ooo.test podman[291877]: 2025-10-14 09:52:41.865538753 +0000 UTC m=+0.130833916 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 09:52:41 np0005486759.ooo.test podman[291878]: 2025-10-14 09:52:41.898867919 +0000 UTC m=+0.164155262 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 09:52:41 np0005486759.ooo.test python3.9[291876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:41 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:52:41 np0005486759.ooo.test sudo[291874]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:41 np0005486759.ooo.test podman[291877]: 2025-10-14 09:52:41.957650316 +0000 UTC m=+0.222945449 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 14 09:52:41 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:52:42 np0005486759.ooo.test sudo[291967]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tugmlnznxlkfbjizxhcvnoavtqdprolh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435561.3597388-238-32764216718998/AnsiballZ_file.py
Oct 14 09:52:42 np0005486759.ooo.test sudo[291967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:52:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:52:42 np0005486759.ooo.test python3.9[291969]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/iscsid/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/iscsid/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:52:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126005 "" "Go-http-client/1.1"
Oct 14 09:52:42 np0005486759.ooo.test sudo[291967]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:52:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15240 "" "Go-http-client/1.1"
Oct 14 09:52:42 np0005486759.ooo.test sudo[292077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjtzwfvlbypamkzmwwtwfvxoyozbrnvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435562.642397-252-231453200254259/AnsiballZ_file.py
Oct 14 09:52:42 np0005486759.ooo.test sudo[292077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:43 np0005486759.ooo.test python3.9[292079]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:52:43 np0005486759.ooo.test sudo[292077]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:43 np0005486759.ooo.test sudo[292187]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqldrviooqarlvxncegpwohcjugsnelk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435563.3282278-260-71319094655323/AnsiballZ_stat.py
Oct 14 09:52:43 np0005486759.ooo.test sudo[292187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:43 np0005486759.ooo.test python3.9[292189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:52:43 np0005486759.ooo.test sudo[292187]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:52:44 np0005486759.ooo.test sudo[292244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyaohgqxrjdmgwrynqefwzwmcweaexlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435563.3282278-260-71319094655323/AnsiballZ_file.py
Oct 14 09:52:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:52:44 np0005486759.ooo.test sudo[292244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:52:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:52:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:52:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:52:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:52:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:52:44 np0005486759.ooo.test python3.9[292246]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/iscsid.json _original_basename=.a_k22rdo recurse=False state=file path=/var/lib/kolla/config_files/iscsid.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:44 np0005486759.ooo.test sudo[292244]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:44 np0005486759.ooo.test sudo[292354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klqknramxmoaiejhcmbwdxofswvgaztk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435564.422071-272-225594275447806/AnsiballZ_file.py
Oct 14 09:52:44 np0005486759.ooo.test sudo[292354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:45 np0005486759.ooo.test python3.9[292356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:45 np0005486759.ooo.test sudo[292354]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:45 np0005486759.ooo.test sudo[292464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfaralifdywshyybaujsojdbkcafgodu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435565.3066616-280-44016105226364/AnsiballZ_stat.py
Oct 14 09:52:45 np0005486759.ooo.test sudo[292464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:45 np0005486759.ooo.test sudo[292464]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:45 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:45.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:46 np0005486759.ooo.test sudo[292521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odcfcxrnbukwaecydbjtssihcjqrknpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435565.3066616-280-44016105226364/AnsiballZ_file.py
Oct 14 09:52:46 np0005486759.ooo.test sudo[292521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:52:46 np0005486759.ooo.test podman[292524]: 2025-10-14 09:52:46.185137653 +0000 UTC m=+0.080310056 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:52:46 np0005486759.ooo.test podman[292524]: 2025-10-14 09:52:46.196367935 +0000 UTC m=+0.091540258 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:52:46 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:52:46 np0005486759.ooo.test sudo[292521]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:47 np0005486759.ooo.test sudo[292654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzdrlqxenzofwvxuezaazqijnrvecwxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435566.6273854-294-210453073159230/AnsiballZ_container_config_data.py
Oct 14 09:52:47 np0005486759.ooo.test sudo[292654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:47 np0005486759.ooo.test python3.9[292656]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 14 09:52:47 np0005486759.ooo.test sudo[292654]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:48 np0005486759.ooo.test sudo[292764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrbjvbxdhkfphotoyjsozbgfqhfutrdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435567.5562453-303-10035130617842/AnsiballZ_container_config_hash.py
Oct 14 09:52:48 np0005486759.ooo.test sudo[292764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:48 np0005486759.ooo.test python3.9[292766]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:52:48 np0005486759.ooo.test sudo[292764]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:48 np0005486759.ooo.test sudo[292874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uchbmhjczyxgyoijrontjhogziuntduv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435568.5007153-312-102705643090974/AnsiballZ_podman_container_info.py
Oct 14 09:52:48 np0005486759.ooo.test sudo[292874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:49 np0005486759.ooo.test python3.9[292876]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:52:49 np0005486759.ooo.test sudo[292874]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10922 DF PROTO=TCP SPT=48980 DPT=9102 SEQ=4134124241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F99C1570000000001030307) 
Oct 14 09:52:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10923 DF PROTO=TCP SPT=48980 DPT=9102 SEQ=4134124241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F99C5410000000001030307) 
Oct 14 09:52:50 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:50.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:51 np0005486759.ooo.test sudo[293010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iljyxbiankgofgaukxtqwswivplipgmk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760435571.2312655-325-78500222431564/AnsiballZ_edpm_container_manage.py
Oct 14 09:52:51 np0005486759.ooo.test sudo[293010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:52 np0005486759.ooo.test python3[293012]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:52:52 np0005486759.ooo.test python3[293012]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "4f44a4f5e0315c0d3dbd533e21d0927bf0518cf452942382901ff1ff9d621cbd",
                                                                 "Digest": "sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-iscsid:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-iscsid@sha256:2975c6e807fa09f0e2062da08d3a0bb209ca055d73011ebb91164def554f60aa"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-14T06:14:08.154480843Z",
                                                                 "Config": {
                                                                      "User": "root",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 403858061,
                                                                 "VirtualSize": 403858061,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/1b94024f0eaacdff3ae200e2172324d7aec107282443f6fc22fe2f0287bc90ec/diff:/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/9c7bc0417a3c6c9361659b5f2f41d814b152f8a47a3821564971debd2b788997/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",
                                                                           "sha256:f640179b0564dc7abbe22bd39fc8810d5bbb8e54094fe7ebc5b3c45b658c4983",
                                                                           "sha256:f004953af60f7a99c360488169b0781a154164be09dce508bd68d57932c60f8f"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "root",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969219151Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969253522Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969285133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969308103Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969342284Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969363945Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:55.340499198Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:32.389605838Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.587912811Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.976619634Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:36.392967414Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.005863592Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.29378883Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.651733508Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.077574384Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.492629447Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.841668394Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.241713606Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.624152332Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.968354993Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.280465471Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.616162553Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.039895541Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.340755181Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.002994823Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.284637314Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.582935524Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:47.185088535Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260120756Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260167227Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260179498Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260189038Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:50.485771038Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:11:48.328117095Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:12:30.499124675Z",
                                                                           "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:12:33.437399647Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:13:23.772230749Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:06.540870685Z",
                                                                           "created_by": "/bin/sh -c dnf -y install iscsi-initiator-utils python3-rtslib targetcli socat && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:06.903436438Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/iscsid/extend_start.sh /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:07.559847274Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:08.150720607Z",
                                                                           "created_by": "/bin/sh -c rm -f /etc/iscsi/initiatorname.iscsi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:14:11.370653418Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-iscsid:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 14 09:52:52 np0005486759.ooo.test sudo[293010]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10924 DF PROTO=TCP SPT=48980 DPT=9102 SEQ=4134124241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F99CD410000000001030307) 
Oct 14 09:52:52 np0005486759.ooo.test sudo[293183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvomwmrkjamoqxwpbwithkugduykqfus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435572.559898-333-157858294799967/AnsiballZ_stat.py
Oct 14 09:52:52 np0005486759.ooo.test sudo[293183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:53 np0005486759.ooo.test python3.9[293185]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:52:53 np0005486759.ooo.test sudo[293183]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:53 np0005486759.ooo.test sudo[293295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbkhqbxfcbnyavhyjbnydnskriixoaly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435573.3054717-342-143332392832654/AnsiballZ_file.py
Oct 14 09:52:53 np0005486759.ooo.test sudo[293295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:53 np0005486759.ooo.test python3.9[293297]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:53 np0005486759.ooo.test sudo[293295]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:54 np0005486759.ooo.test sudo[293350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxsaoiykunnmvtdtmruijvezcsbsrfog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435573.3054717-342-143332392832654/AnsiballZ_stat.py
Oct 14 09:52:54 np0005486759.ooo.test sudo[293350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:52:54.157 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:52:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:52:54.158 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:52:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:52:54.159 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:52:54 np0005486759.ooo.test python3.9[293352]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:52:54 np0005486759.ooo.test sudo[293350]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:55 np0005486759.ooo.test sudo[293459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epqyxsttarnztitvxzsdwfmjsfawaitc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435574.2982292-342-170598249796327/AnsiballZ_copy.py
Oct 14 09:52:55 np0005486759.ooo.test sudo[293459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:55 np0005486759.ooo.test python3.9[293461]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435574.2982292-342-170598249796327/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:55 np0005486759.ooo.test sudo[293459]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:55 np0005486759.ooo.test sudo[293514]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdeczkwyflzygxbrcbhjthoahppxhfvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435574.2982292-342-170598249796327/AnsiballZ_systemd.py
Oct 14 09:52:55 np0005486759.ooo.test sudo[293514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:55 np0005486759.ooo.test python3.9[293516]: ansible-systemd Invoked with state=started name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:52:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:55.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:52:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:55.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:52:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:55.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5016 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:52:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:55.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:52:55 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:52:55.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:52:55 np0005486759.ooo.test sudo[293514]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:56 np0005486759.ooo.test python3.9[293626]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:52:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10925 DF PROTO=TCP SPT=48980 DPT=9102 SEQ=4134124241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F99DD010000000001030307) 
Oct 14 09:52:56 np0005486759.ooo.test sudo[293736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egyokbcrwzfahhuvjalctzylpdfptyol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435576.6863315-374-244237822079738/AnsiballZ_systemd.py
Oct 14 09:52:57 np0005486759.ooo.test sudo[293736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: tmp-crun.KUEWbk.mount: Deactivated successfully.
Oct 14 09:52:57 np0005486759.ooo.test podman[293739]: 2025-10-14 09:52:57.112690681 +0000 UTC m=+0.090544359 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:52:57 np0005486759.ooo.test podman[293739]: 2025-10-14 09:52:57.127595689 +0000 UTC m=+0.105449397 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:52:57 np0005486759.ooo.test python3.9[293738]: ansible-ansible.builtin.systemd Invoked with name=edpm_iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Stopping iscsid container...
Oct 14 09:52:57 np0005486759.ooo.test iscsid[235388]: iscsid shutting down.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: libpod-895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.scope: Deactivated successfully.
Oct 14 09:52:57 np0005486759.ooo.test podman[293760]: 2025-10-14 09:52:57.464092376 +0000 UTC m=+0.055044501 container died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.timer: Deactivated successfully.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Stopped /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Failed to open /run/systemd/transient/895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: No such file or directory
Oct 14 09:52:57 np0005486759.ooo.test podman[293760]: 2025-10-14 09:52:57.568387168 +0000 UTC m=+0.159339263 container cleanup 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=iscsid, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 09:52:57 np0005486759.ooo.test podman[293760]: iscsid
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.timer: Failed to open /run/systemd/transient/895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.timer: No such file or directory
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Failed to open /run/systemd/transient/895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: No such file or directory
Oct 14 09:52:57 np0005486759.ooo.test podman[293787]: 2025-10-14 09:52:57.658542275 +0000 UTC m=+0.059968661 container cleanup 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:52:57 np0005486759.ooo.test podman[293787]: iscsid
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: edpm_iscsid.service: Deactivated successfully.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Stopped iscsid container.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Starting iscsid container...
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:52:57 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ddc5fbe16b14e08d4db6edd68ab8354f4e24be6dd0d2901ee351991c0d47b0/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 14 09:52:57 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ddc5fbe16b14e08d4db6edd68ab8354f4e24be6dd0d2901ee351991c0d47b0/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:52:57 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ddc5fbe16b14e08d4db6edd68ab8354f4e24be6dd0d2901ee351991c0d47b0/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.timer: Failed to open /run/systemd/transient/895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.timer: No such file or directory
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Failed to open /run/systemd/transient/895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: No such file or directory
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:52:57 np0005486759.ooo.test podman[293800]: 2025-10-14 09:52:57.835945477 +0000 UTC m=+0.145788715 container init 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid)
Oct 14 09:52:57 np0005486759.ooo.test iscsid[293815]: + sudo -E kolla_set_configs
Oct 14 09:52:57 np0005486759.ooo.test sudo[293821]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:52:57 np0005486759.ooo.test podman[293800]: 2025-10-14 09:52:57.870578741 +0000 UTC m=+0.180421949 container start 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:52:57 np0005486759.ooo.test podman[293800]: iscsid
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Started iscsid container.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Created slice User Slice of UID 0.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: Starting User Manager for UID 0...
Oct 14 09:52:57 np0005486759.ooo.test sudo[293736]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:57 np0005486759.ooo.test systemd[293837]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:52:57 np0005486759.ooo.test podman[293823]: 2025-10-14 09:52:57.953100598 +0000 UTC m=+0.075486776 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 09:52:57 np0005486759.ooo.test podman[293823]: 2025-10-14 09:52:57.964225908 +0000 UTC m=+0.086612056 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 09:52:57 np0005486759.ooo.test podman[293823]: unhealthy
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Main process exited, code=exited, status=1/FAILURE
Oct 14 09:52:57 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Failed with result 'exit-code'.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Queued start job for default target Main User Target.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Created slice User Application Slice.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Started Daily Cleanup of User's Temporary Directories.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Reached target Paths.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Reached target Timers.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Starting D-Bus User Message Bus Socket...
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Starting Create User's Volatile Files and Directories...
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Finished Create User's Volatile Files and Directories.
Oct 14 09:52:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Listening on D-Bus User Message Bus Socket.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Reached target Sockets.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Reached target Basic System.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Reached target Main User Target.
Oct 14 09:52:58 np0005486759.ooo.test systemd[293837]: Startup finished in 155ms.
Oct 14 09:52:58 np0005486759.ooo.test systemd[1]: Started User Manager for UID 0.
Oct 14 09:52:58 np0005486759.ooo.test systemd[1]: Started Session c17 of User root.
Oct 14 09:52:58 np0005486759.ooo.test sudo[293821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:52:58 np0005486759.ooo.test podman[293897]: 2025-10-14 09:52:58.163562899 +0000 UTC m=+0.061826145 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: INFO:__main__:Validating config file
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: INFO:__main__:Writing out command to execute
Oct 14 09:52:58 np0005486759.ooo.test sudo[293821]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:58 np0005486759.ooo.test systemd[1]: session-c17.scope: Deactivated successfully.
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: ++ cat /run_command
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + CMD='/usr/sbin/iscsid -f'
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + ARGS=
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + sudo kolla_copy_cacerts
Oct 14 09:52:58 np0005486759.ooo.test sudo[293949]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 14 09:52:58 np0005486759.ooo.test podman[293897]: 2025-10-14 09:52:58.20125913 +0000 UTC m=+0.099522376 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:52:58 np0005486759.ooo.test systemd[1]: Started Session c18 of User root.
Oct 14 09:52:58 np0005486759.ooo.test sudo[293949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 14 09:52:58 np0005486759.ooo.test sudo[293949]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:58 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: Running command: '/usr/sbin/iscsid -f'
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + [[ ! -n '' ]]
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + . kolla_extend_start
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + umask 0022
Oct 14 09:52:58 np0005486759.ooo.test iscsid[293815]: + exec /usr/sbin/iscsid -f
Oct 14 09:52:58 np0005486759.ooo.test systemd[1]: session-c18.scope: Deactivated successfully.
Oct 14 09:52:58 np0005486759.ooo.test sudo[293990]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnppdzsfscstwnwuruxxjhnvnjbhxtxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435578.0708883-382-247638907285982/AnsiballZ_file.py
Oct 14 09:52:58 np0005486759.ooo.test sudo[293990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:58 np0005486759.ooo.test python3.9[293992]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:52:58 np0005486759.ooo.test sudo[293990]: pam_unix(sudo:session): session closed for user root
Oct 14 09:52:59 np0005486759.ooo.test sudo[294100]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfniwaechfxoprdlzdmxmnpbrwhaadfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435578.847938-393-26306249086402/AnsiballZ_service_facts.py
Oct 14 09:52:59 np0005486759.ooo.test sudo[294100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:52:59 np0005486759.ooo.test python3.9[294102]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:52:59 np0005486759.ooo.test network[294119]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:52:59 np0005486759.ooo.test network[294120]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:52:59 np0005486759.ooo.test network[294121]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:53:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:53:00 np0005486759.ooo.test podman[294128]: 2025-10-14 09:53:00.117570762 +0000 UTC m=+0.077172945 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:53:00 np0005486759.ooo.test podman[294128]: 2025-10-14 09:53:00.131508562 +0000 UTC m=+0.091110775 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:53:00 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:53:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:00.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:00.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:00.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:00.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:00.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:00 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:00.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:01 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:53:03 np0005486759.ooo.test sudo[294100]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:03 np0005486759.ooo.test sudo[294376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maxicuwcvessznisrcgfsilmrgzxeitv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435583.6659753-403-206040718030168/AnsiballZ_file.py
Oct 14 09:53:03 np0005486759.ooo.test sudo[294376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:04 np0005486759.ooo.test python3.9[294378]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 09:53:04 np0005486759.ooo.test sudo[294376]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:04 np0005486759.ooo.test sudo[294486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldakggmoergemevvtwnmwghqtiuplrbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435584.3702939-411-92144545853718/AnsiballZ_modprobe.py
Oct 14 09:53:04 np0005486759.ooo.test sudo[294486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:53:04 np0005486759.ooo.test podman[294488]: 2025-10-14 09:53:04.965510344 +0000 UTC m=+0.094021430 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:53:05 np0005486759.ooo.test podman[294488]: 2025-10-14 09:53:05.003424241 +0000 UTC m=+0.131935337 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:53:05 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:53:05 np0005486759.ooo.test python3.9[294489]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 14 09:53:05 np0005486759.ooo.test sudo[294486]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:05 np0005486759.ooo.test sudo[294623]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwilglakztfqonwsdrumchomixsdewbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435585.2670743-419-14226856437566/AnsiballZ_stat.py
Oct 14 09:53:05 np0005486759.ooo.test sudo[294623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:05 np0005486759.ooo.test python3.9[294625]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:05 np0005486759.ooo.test sudo[294623]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:05.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:05.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:05.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:05.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:05.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:05 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:05.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:06 np0005486759.ooo.test sudo[294680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuyslkjcultpgajmpbfiamnssoowqica ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435585.2670743-419-14226856437566/AnsiballZ_file.py
Oct 14 09:53:06 np0005486759.ooo.test sudo[294680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:53:06 np0005486759.ooo.test systemd[1]: tmp-crun.ydAjBM.mount: Deactivated successfully.
Oct 14 09:53:06 np0005486759.ooo.test podman[294683]: 2025-10-14 09:53:06.169730482 +0000 UTC m=+0.079881494 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 09:53:06 np0005486759.ooo.test podman[294683]: 2025-10-14 09:53:06.185321459 +0000 UTC m=+0.095472451 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct 14 09:53:06 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:53:06 np0005486759.ooo.test python3.9[294682]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:06 np0005486759.ooo.test sudo[294680]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:06 np0005486759.ooo.test sudo[294808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vunhpjwuykstoyctvafrvqczoymjhmce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435586.5633965-432-100924257297122/AnsiballZ_lineinfile.py
Oct 14 09:53:06 np0005486759.ooo.test sudo[294808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:07 np0005486759.ooo.test python3.9[294810]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:07 np0005486759.ooo.test sudo[294808]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:07 np0005486759.ooo.test sudo[294918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obxaydpmhlgkznmjztopdglfqkmujxim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435587.2819579-441-23884730078576/AnsiballZ_file.py
Oct 14 09:53:07 np0005486759.ooo.test sudo[294918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:07 np0005486759.ooo.test python3.9[294920]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:53:07 np0005486759.ooo.test sudo[294918]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: Stopping User Manager for UID 0...
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Activating special unit Exit the Session...
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Stopped target Main User Target.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Stopped target Basic System.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Stopped target Paths.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Stopped target Sockets.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Stopped target Timers.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Closed D-Bus User Message Bus Socket.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Stopped Create User's Volatile Files and Directories.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Removed slice User Application Slice.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Reached target Shutdown.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Finished Exit the Session.
Oct 14 09:53:08 np0005486759.ooo.test systemd[293837]: Reached target Exit the Session.
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: user@0.service: Deactivated successfully.
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: Stopped User Manager for UID 0.
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 14 09:53:08 np0005486759.ooo.test systemd[1]: Removed slice User Slice of UID 0.
Oct 14 09:53:08 np0005486759.ooo.test sudo[295029]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwzefmicxstizyvokpwjamyvahiqtqms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435588.0301044-450-245186559305639/AnsiballZ_stat.py
Oct 14 09:53:08 np0005486759.ooo.test sudo[295029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:08 np0005486759.ooo.test python3.9[295031]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:53:08 np0005486759.ooo.test sudo[295029]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:09 np0005486759.ooo.test sudo[295141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rehmltpldfnazakmhivgllmisvhdsrqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435588.8068905-459-154800992038995/AnsiballZ_stat.py
Oct 14 09:53:09 np0005486759.ooo.test sudo[295141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:09 np0005486759.ooo.test python3.9[295143]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:53:09 np0005486759.ooo.test sudo[295141]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:09 np0005486759.ooo.test sudo[295253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knfzztokaghtuuyyarjnervfqzepojny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435589.522033-468-55192049367598/AnsiballZ_command.py
Oct 14 09:53:10 np0005486759.ooo.test sudo[295253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:10 np0005486759.ooo.test python3.9[295255]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:53:10 np0005486759.ooo.test sudo[295253]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:10 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:10.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:10 np0005486759.ooo.test sudo[295364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmebsrmcllnkvyfhezuwvkeviztfabkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435590.5062006-478-75883449301939/AnsiballZ_replace.py
Oct 14 09:53:10 np0005486759.ooo.test sudo[295364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:11 np0005486759.ooo.test python3.9[295366]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:11 np0005486759.ooo.test sudo[295364]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:11 np0005486759.ooo.test sudo[295474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebwwawqgamotdnpuqurkezbcqsfoubmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435591.4464011-487-11759417676140/AnsiballZ_lineinfile.py
Oct 14 09:53:11 np0005486759.ooo.test sudo[295474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:11 np0005486759.ooo.test python3.9[295476]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:11 np0005486759.ooo.test sudo[295474]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:53:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:53:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:53:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126004 "" "Go-http-client/1.1"
Oct 14 09:53:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:53:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15236 "" "Go-http-client/1.1"
Oct 14 09:53:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:53:12 np0005486759.ooo.test sudo[295584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoethwohrsxdrnwabpwctnyamelubdav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435592.0858998-487-263657669249147/AnsiballZ_lineinfile.py
Oct 14 09:53:12 np0005486759.ooo.test sudo[295584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:12 np0005486759.ooo.test podman[295586]: 2025-10-14 09:53:12.46898283 +0000 UTC m=+0.088671606 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 14 09:53:12 np0005486759.ooo.test podman[295586]: 2025-10-14 09:53:12.502524903 +0000 UTC m=+0.122213669 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 14 09:53:12 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:53:12 np0005486759.ooo.test python3.9[295587]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:12 np0005486759.ooo.test sudo[295584]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:13 np0005486759.ooo.test sudo[295711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yelqbovrsnervciqakcjrbbxqmyctfyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435592.7321646-487-254387620549705/AnsiballZ_lineinfile.py
Oct 14 09:53:13 np0005486759.ooo.test sudo[295711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:13 np0005486759.ooo.test python3.9[295713]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:13 np0005486759.ooo.test sudo[295711]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:13 np0005486759.ooo.test sudo[295821]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iubwcvjpszeldgbbsxffsiqmuvaytszm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435593.3297513-487-71846098258101/AnsiballZ_lineinfile.py
Oct 14 09:53:13 np0005486759.ooo.test sudo[295821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:13 np0005486759.ooo.test python3.9[295823]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:13 np0005486759.ooo.test sudo[295821]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:53:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:53:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:53:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:53:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:53:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:53:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:53:14 np0005486759.ooo.test sudo[295931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcycpqvchrwhxlynllntfeqjeosisqod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435594.0108669-516-29994353448783/AnsiballZ_stat.py
Oct 14 09:53:14 np0005486759.ooo.test sudo[295931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:14 np0005486759.ooo.test python3.9[295933]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:53:14 np0005486759.ooo.test sudo[295931]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:15 np0005486759.ooo.test sudo[296043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxzpivyquymayaegztxfqbbcnpspfbgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435594.834845-526-54867632755593/AnsiballZ_file.py
Oct 14 09:53:15 np0005486759.ooo.test sudo[296043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:15 np0005486759.ooo.test python3.9[296045]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:53:15 np0005486759.ooo.test sudo[296043]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:15 np0005486759.ooo.test sudo[296153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lukjwvcmxnhpinmkuknybggpkwsglfct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435595.5513465-534-264470212185996/AnsiballZ_stat.py
Oct 14 09:53:15 np0005486759.ooo.test sudo[296153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:15.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:15.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:15.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:15 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:15.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:16.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:16.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:16 np0005486759.ooo.test python3.9[296155]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:16 np0005486759.ooo.test sudo[296153]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:16 np0005486759.ooo.test sudo[296210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdqdapnowtwmytydbgqumwgannuozkxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435595.5513465-534-264470212185996/AnsiballZ_file.py
Oct 14 09:53:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:53:16 np0005486759.ooo.test sudo[296210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:16 np0005486759.ooo.test systemd[1]: tmp-crun.b7cY4O.mount: Deactivated successfully.
Oct 14 09:53:16 np0005486759.ooo.test podman[296212]: 2025-10-14 09:53:16.460383412 +0000 UTC m=+0.096271733 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:53:16 np0005486759.ooo.test podman[296212]: 2025-10-14 09:53:16.469464202 +0000 UTC m=+0.105352453 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:53:16 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:53:16 np0005486759.ooo.test python3.9[296213]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:53:16 np0005486759.ooo.test sudo[296210]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:17 np0005486759.ooo.test sudo[296343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgydghnrtnonurvewblkpepetersdayf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435596.7067726-534-23567215584731/AnsiballZ_stat.py
Oct 14 09:53:17 np0005486759.ooo.test sudo[296343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:17 np0005486759.ooo.test python3.9[296345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:17 np0005486759.ooo.test sudo[296343]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:17 np0005486759.ooo.test sudo[296400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irrugsisgwowmigrlqhzfftdfzanerza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435596.7067726-534-23567215584731/AnsiballZ_file.py
Oct 14 09:53:17 np0005486759.ooo.test sudo[296400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:17 np0005486759.ooo.test python3.9[296402]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:53:17 np0005486759.ooo.test sudo[296400]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:18 np0005486759.ooo.test sudo[296510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skwwbjekbnkyutryhcmcmbfvavukaztz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435597.842268-557-54029737037197/AnsiballZ_file.py
Oct 14 09:53:18 np0005486759.ooo.test sudo[296510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:18 np0005486759.ooo.test python3.9[296512]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:18 np0005486759.ooo.test sudo[296510]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:18 np0005486759.ooo.test sudo[296620]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxoaeggrclsltuqlcoxeyqtiyploaxwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435598.466006-565-253720726789734/AnsiballZ_stat.py
Oct 14 09:53:18 np0005486759.ooo.test sudo[296620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:18 np0005486759.ooo.test python3.9[296622]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:18 np0005486759.ooo.test sudo[296620]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:19 np0005486759.ooo.test sudo[296677]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqjpdyygrzgdxcrzxqubulutxjmkmobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435598.466006-565-253720726789734/AnsiballZ_file.py
Oct 14 09:53:19 np0005486759.ooo.test sudo[296677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:19 np0005486759.ooo.test python3.9[296679]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:19 np0005486759.ooo.test sudo[296677]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1950 DF PROTO=TCP SPT=51128 DPT=9102 SEQ=1711284311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9A36870000000001030307) 
Oct 14 09:53:19 np0005486759.ooo.test sudo[296787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuqfbsaxhmnppgjqyzsmhspmxguopouo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435599.635072-577-169924158049910/AnsiballZ_stat.py
Oct 14 09:53:19 np0005486759.ooo.test sudo[296787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:20 np0005486759.ooo.test python3.9[296789]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:20 np0005486759.ooo.test sudo[296787]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:20 np0005486759.ooo.test sudo[296844]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdrtxnsfslwrpbqwpcjmbujzyounfvbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435599.635072-577-169924158049910/AnsiballZ_file.py
Oct 14 09:53:20 np0005486759.ooo.test sudo[296844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1951 DF PROTO=TCP SPT=51128 DPT=9102 SEQ=1711284311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9A3A810000000001030307) 
Oct 14 09:53:20 np0005486759.ooo.test python3.9[296846]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:20 np0005486759.ooo.test sudo[296844]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:21.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:21.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:21.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:21.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:21.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:21.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:21 np0005486759.ooo.test sudo[296954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imliabtyomhbrxebrajmztalznenlktz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435600.817008-589-33524983433213/AnsiballZ_systemd.py
Oct 14 09:53:21 np0005486759.ooo.test sudo[296954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:21 np0005486759.ooo.test python3.9[296956]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:53:21 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:53:21 np0005486759.ooo.test systemd-rc-local-generator[296980]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:53:21 np0005486759.ooo.test systemd-sysv-generator[296986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:53:21 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:53:21 np0005486759.ooo.test sudo[296954]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:22 np0005486759.ooo.test sudo[297102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kboqujzymfzdgrabdgejmdmleqydfpvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435602.0480237-597-258309486371957/AnsiballZ_stat.py
Oct 14 09:53:22 np0005486759.ooo.test sudo[297102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:22 np0005486759.ooo.test python3.9[297104]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1952 DF PROTO=TCP SPT=51128 DPT=9102 SEQ=1711284311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9A42810000000001030307) 
Oct 14 09:53:22 np0005486759.ooo.test sudo[297102]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:22 np0005486759.ooo.test sudo[297159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twdfafyauxpibwsdjvoqaxhnmbtcvwpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435602.0480237-597-258309486371957/AnsiballZ_file.py
Oct 14 09:53:22 np0005486759.ooo.test sudo[297159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:23 np0005486759.ooo.test python3.9[297161]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:23 np0005486759.ooo.test sudo[297159]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:23 np0005486759.ooo.test sudo[297269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtwpwgzadhzcswiajinrhkajnkmrnfdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435603.236166-609-101570161158851/AnsiballZ_stat.py
Oct 14 09:53:23 np0005486759.ooo.test sudo[297269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:23 np0005486759.ooo.test python3.9[297271]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:23 np0005486759.ooo.test sudo[297269]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:24 np0005486759.ooo.test sudo[297326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oprpebnmxviupjepbyvlkhtscjffktrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435603.236166-609-101570161158851/AnsiballZ_file.py
Oct 14 09:53:24 np0005486759.ooo.test sudo[297326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:24 np0005486759.ooo.test python3.9[297328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:24 np0005486759.ooo.test sudo[297326]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:24 np0005486759.ooo.test sudo[297436]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjnnjofkqbnwqhqtuqvwkdfklzhfjhye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435604.4412-621-158906525409095/AnsiballZ_systemd.py
Oct 14 09:53:24 np0005486759.ooo.test sudo[297436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:25 np0005486759.ooo.test python3.9[297438]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:53:25 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:53:25 np0005486759.ooo.test systemd-rc-local-generator[297460]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:53:25 np0005486759.ooo.test systemd-sysv-generator[297463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:53:25 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:53:25 np0005486759.ooo.test systemd[1]: Starting Create netns directory...
Oct 14 09:53:25 np0005486759.ooo.test systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 14 09:53:25 np0005486759.ooo.test systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 14 09:53:25 np0005486759.ooo.test systemd[1]: Finished Create netns directory.
Oct 14 09:53:25 np0005486759.ooo.test sudo[297436]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:26.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:26.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:26.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:26.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:26.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:26.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:26 np0005486759.ooo.test sudo[297588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zztdppstvpdqycwfxqjegyxlleddeutk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435605.8314002-631-202058307594875/AnsiballZ_file.py
Oct 14 09:53:26 np0005486759.ooo.test sudo[297588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:26 np0005486759.ooo.test python3.9[297590]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:53:26 np0005486759.ooo.test sudo[297588]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1953 DF PROTO=TCP SPT=51128 DPT=9102 SEQ=1711284311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9A52420000000001030307) 
Oct 14 09:53:26 np0005486759.ooo.test sudo[297698]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qinzxoutzklaqtxcijbfomsvlelbtxmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435606.5600898-639-135856416332890/AnsiballZ_stat.py
Oct 14 09:53:26 np0005486759.ooo.test sudo[297698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:27 np0005486759.ooo.test python3.9[297700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:27 np0005486759.ooo.test sudo[297698]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:27 np0005486759.ooo.test sudo[297755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaskxzgdkubgningooyjwdlemnrdfpde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435606.5600898-639-135856416332890/AnsiballZ_file.py
Oct 14 09:53:27 np0005486759.ooo.test sudo[297755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:53:27 np0005486759.ooo.test podman[297758]: 2025-10-14 09:53:27.457137445 +0000 UTC m=+0.088032901 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 09:53:27 np0005486759.ooo.test podman[297758]: 2025-10-14 09:53:27.469236321 +0000 UTC m=+0.100131747 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 09:53:27 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:53:27 np0005486759.ooo.test python3.9[297757]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:53:27 np0005486759.ooo.test sudo[297755]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:28 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:28.239 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:28 np0005486759.ooo.test sudo[297882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmdoaaefwvynvyagbhusbnhoexsuclva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435608.0419037-653-168406955709746/AnsiballZ_file.py
Oct 14 09:53:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:53:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:53:28 np0005486759.ooo.test sudo[297882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:28 np0005486759.ooo.test podman[297885]: 2025-10-14 09:53:28.465351047 +0000 UTC m=+0.087983249 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:53:28 np0005486759.ooo.test podman[297885]: 2025-10-14 09:53:28.503823221 +0000 UTC m=+0.126455413 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:53:28 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:53:28 np0005486759.ooo.test systemd[1]: tmp-crun.X8xfv8.mount: Deactivated successfully.
Oct 14 09:53:28 np0005486759.ooo.test podman[297884]: 2025-10-14 09:53:28.53250256 +0000 UTC m=+0.159191158 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=iscsid, container_name=iscsid)
Oct 14 09:53:28 np0005486759.ooo.test podman[297884]: 2025-10-14 09:53:28.544478562 +0000 UTC m=+0.171167210 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 09:53:28 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:53:28 np0005486759.ooo.test python3.9[297886]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:53:28 np0005486759.ooo.test sudo[297882]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:29 np0005486759.ooo.test sudo[298028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyndfbpearxhlyhrjbaahucztgefscqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435608.8331454-661-76576194068741/AnsiballZ_stat.py
Oct 14 09:53:29 np0005486759.ooo.test sudo[298028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:29 np0005486759.ooo.test python3.9[298030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:29 np0005486759.ooo.test sudo[298028]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:29 np0005486759.ooo.test sudo[298085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaqzfzoowrcgfndpoyjsdvzghqvqstue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435608.8331454-661-76576194068741/AnsiballZ_file.py
Oct 14 09:53:29 np0005486759.ooo.test sudo[298085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:29 np0005486759.ooo.test python3.9[298087]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.jfm7cmqe recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:29 np0005486759.ooo.test sudo[298085]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:30 np0005486759.ooo.test sudo[298195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lohyheeucngdscrazaapqvlaglaytbjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435609.9525018-673-194219421885046/AnsiballZ_file.py
Oct 14 09:53:30 np0005486759.ooo.test sudo[298195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:53:30 np0005486759.ooo.test podman[298198]: 2025-10-14 09:53:30.310526443 +0000 UTC m=+0.054296075 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:53:30 np0005486759.ooo.test podman[298198]: 2025-10-14 09:53:30.321248945 +0000 UTC m=+0.065018547 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:53:30 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:53:30 np0005486759.ooo.test python3.9[298197]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:30 np0005486759.ooo.test sudo[298195]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:31 np0005486759.ooo.test sudo[298327]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofodetbdnhqresnwesebnrdbfpptgksq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435610.7314446-681-280136688307235/AnsiballZ_stat.py
Oct 14 09:53:31 np0005486759.ooo.test sudo[298327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:31.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:31.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:31.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:31.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:31.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:31.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:31 np0005486759.ooo.test sudo[298327]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:31.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:31 np0005486759.ooo.test sudo[298384]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfpdkrbivluujuizklrgauaxuogrbvxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435610.7314446-681-280136688307235/AnsiballZ_file.py
Oct 14 09:53:31 np0005486759.ooo.test sudo[298384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:31 np0005486759.ooo.test sudo[298384]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:32 np0005486759.ooo.test sudo[298494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zotrjluolmrfcnhfxdfxhfrhvkazexbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435612.0492995-695-190868144582326/AnsiballZ_container_config_data.py
Oct 14 09:53:32 np0005486759.ooo.test sudo[298494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:32 np0005486759.ooo.test python3.9[298496]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 14 09:53:32 np0005486759.ooo.test sudo[298494]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:33 np0005486759.ooo.test sudo[298604]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaojmuxrallbijyievvndsldzogdoqib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435612.7270293-704-159654357329256/AnsiballZ_container_config_hash.py
Oct 14 09:53:33 np0005486759.ooo.test sudo[298604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:33 np0005486759.ooo.test python3.9[298606]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:53:33 np0005486759.ooo.test sudo[298604]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:33.496 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:33.496 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:53:33 np0005486759.ooo.test sudo[298714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzbpzjoebykgpzfamiruqpnqdtboqnjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435613.520029-713-240969878342064/AnsiballZ_podman_container_info.py
Oct 14 09:53:33 np0005486759.ooo.test sudo[298714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:33 np0005486759.ooo.test python3.9[298716]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 14 09:53:34 np0005486759.ooo.test sudo[298714]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:53:35 np0005486759.ooo.test podman[298761]: 2025-10-14 09:53:35.455711225 +0000 UTC m=+0.081365274 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 09:53:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:35.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:35 np0005486759.ooo.test podman[298761]: 2025-10-14 09:53:35.52940854 +0000 UTC m=+0.155062549 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:53:35 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:53:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:36.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:36 np0005486759.ooo.test sudo[298876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzaqkfakofaqkelajyuyqpkjvcrdwqbn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760435615.9960818-726-111979194903801/AnsiballZ_edpm_container_manage.py
Oct 14 09:53:36 np0005486759.ooo.test sudo[298876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:53:36 np0005486759.ooo.test systemd[1]: tmp-crun.x3K1ai.mount: Deactivated successfully.
Oct 14 09:53:36 np0005486759.ooo.test podman[298879]: 2025-10-14 09:53:36.407357083 +0000 UTC m=+0.088233997 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm)
Oct 14 09:53:36 np0005486759.ooo.test podman[298879]: 2025-10-14 09:53:36.420201031 +0000 UTC m=+0.101077935 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 14 09:53:36 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:53:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:36.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:36 np0005486759.ooo.test python3[298878]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:53:36 np0005486759.ooo.test python3[298878]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "0cc989a5ef996507b0f9d8ef7fc230c93fad4ad33debd19bbe24250b85566285",
                                                                 "Digest": "sha256:7b5e7d0bff1c705215946e167be50eac031a93886d33e2e88e389776e8e13e70",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:7b5e7d0bff1c705215946e167be50eac031a93886d33e2e88e389776e8e13e70"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-14T06:10:30.956277521Z",
                                                                 "Config": {
                                                                      "User": "root",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 249351661,
                                                                 "VirtualSize": 249351661,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/0b52816892c0967aea6a33893e73899adbf76e3ca055f6670535905d8ddf2b2c/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/b229675e52e0150c8f53be2f60bdcd02e09cc9ac91e9d7513ccf836c4fc95815/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2896905ce9321c1f2feb1f3ada413e86eda3444455358ab965478a041351b392",
                                                                           "sha256:3be5c7cbc12431945afa672da84f6330a9da4cc765276b49a4ad90cf80ae26d7"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "0468cb21803d466b2abfe00835cf1d2d",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "root",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969219151Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969253522Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969285133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969308103Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969342284Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:54.969363945Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:08:55.340499198Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:32.389605838Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.587912811Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:35.976619634Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:36.392967414Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.005863592Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.29378883Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:37.651733508Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.077574384Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.492629447Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:38.841668394Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.241713606Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.624152332Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:39.968354993Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.280465471Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:40.616162553Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.039895541Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:41.340755181Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.002994823Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.284637314Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:45.582935524Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:47.185088535Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260120756Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260167227Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260179498Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:49.260189038Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:50.485771038Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:09:52.351730678Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:0468cb21803d466b2abfe00835cf1d2d",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:10:30.955473698Z",
                                                                           "created_by": "/bin/sh -c dnf -y install device-mapper-multipath iscsi-initiator-utils && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-14T06:10:31.859441237Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"0468cb21803d466b2abfe00835cf1d2d\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct 14 09:53:37 np0005486759.ooo.test sudo[298876]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:37 np0005486759.ooo.test sudo[299066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtexxvumosuayboadgpgtxjtgpdmelsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435617.156911-734-189223648345337/AnsiballZ_stat.py
Oct 14 09:53:37 np0005486759.ooo.test sudo[299066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:37.493 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:37.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:37 np0005486759.ooo.test python3.9[299068]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:53:37 np0005486759.ooo.test sudo[299066]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:38 np0005486759.ooo.test sudo[299178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmubdfjcxtmlickdcphvowdeqsolzcrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435618.0051665-743-159338146253441/AnsiballZ_file.py
Oct 14 09:53:38 np0005486759.ooo.test sudo[299178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:38.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:38.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:53:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:38.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:53:38 np0005486759.ooo.test python3.9[299180]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:38 np0005486759.ooo.test sudo[299178]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:38 np0005486759.ooo.test sudo[299233]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioswzkjgzkqqqzgrfjxntikiqtrbxwti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435618.0051665-743-159338146253441/AnsiballZ_stat.py
Oct 14 09:53:38 np0005486759.ooo.test sudo[299233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:38 np0005486759.ooo.test python3.9[299235]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:53:38 np0005486759.ooo.test sudo[299233]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:39.076 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:53:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:39.076 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:53:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:39.077 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:53:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:39.077 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:53:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:39.439 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:53:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:39.460 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:53:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:39.461 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:53:39 np0005486759.ooo.test sudo[299342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwcgvphbrxxbxcqvgpanmvlfchrkntdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435619.0349808-743-248194015368337/AnsiballZ_copy.py
Oct 14 09:53:39 np0005486759.ooo.test sudo[299342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:39 np0005486759.ooo.test python3.9[299344]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435619.0349808-743-248194015368337/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:39 np0005486759.ooo.test sudo[299342]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:39 np0005486759.ooo.test sudo[299397]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfbqjkecwirqzatcevubaagoamovcfog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435619.0349808-743-248194015368337/AnsiballZ_systemd.py
Oct 14 09:53:39 np0005486759.ooo.test sudo[299397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:40 np0005486759.ooo.test python3.9[299399]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:53:40 np0005486759.ooo.test sudo[299397]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:40.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:41 np0005486759.ooo.test python3.9[299509]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:53:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:41.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:41.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:41.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:41.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:41.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:41.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:41 np0005486759.ooo.test sudo[299617]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raantykgkkcqxxobylunoygnmgmlwwdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435621.3232598-777-87894777113870/AnsiballZ_file.py
Oct 14 09:53:41 np0005486759.ooo.test sudo[299617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:41 np0005486759.ooo.test python3.9[299619]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:41 np0005486759.ooo.test sudo[299617]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:53:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:53:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:53:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126003 "" "Go-http-client/1.1"
Oct 14 09:53:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:53:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15237 "" "Go-http-client/1.1"
Oct 14 09:53:42 np0005486759.ooo.test sudo[299727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oildomjgampexikrmdsuyxcogtpyrywg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435622.1811802-789-208462625385667/AnsiballZ_file.py
Oct 14 09:53:42 np0005486759.ooo.test sudo[299727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.526 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.526 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.526 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.526 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.600 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.673 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.674 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:53:42 np0005486759.ooo.test python3.9[299729]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 14 09:53:42 np0005486759.ooo.test sudo[299727]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.747 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.749 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.820 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.822 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:53:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:42.894 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.130 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.132 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12467MB free_disk=386.7171859741211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.133 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.133 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.230 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.231 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.231 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:53:43 np0005486759.ooo.test sudo[299849]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkvmijkklyiwasntktvewslamiigkakm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435622.9097676-797-191629684671079/AnsiballZ_modprobe.py
Oct 14 09:53:43 np0005486759.ooo.test sudo[299849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.290 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.306 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.308 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:53:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:43.309 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:53:43 np0005486759.ooo.test systemd[1]: tmp-crun.D0p82Z.mount: Deactivated successfully.
Oct 14 09:53:43 np0005486759.ooo.test podman[299852]: 2025-10-14 09:53:43.366385788 +0000 UTC m=+0.090276890 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 09:53:43 np0005486759.ooo.test podman[299852]: 2025-10-14 09:53:43.376348618 +0000 UTC m=+0.100239670 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:53:43 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:53:43 np0005486759.ooo.test python3.9[299851]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 14 09:53:43 np0005486759.ooo.test sudo[299849]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:43 np0005486759.ooo.test sudo[299977]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxzwpksbhkdshvwyjkikjobmejjqqico ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435623.6859753-805-42118877303820/AnsiballZ_stat.py
Oct 14 09:53:43 np0005486759.ooo.test sudo[299977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:53:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:53:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:53:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:53:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:53:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:53:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:53:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:53:44 np0005486759.ooo.test python3.9[299979]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:53:44 np0005486759.ooo.test sudo[299977]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:44 np0005486759.ooo.test sudo[300034]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnwitdrzzjspkuwoxtbyaskhjehmuqpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435623.6859753-805-42118877303820/AnsiballZ_file.py
Oct 14 09:53:44 np0005486759.ooo.test sudo[300034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:44 np0005486759.ooo.test python3.9[300036]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:44 np0005486759.ooo.test sudo[300034]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:45 np0005486759.ooo.test sudo[300144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exuoaplkmnsnpviddtlozocuxujfjkdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435624.9083042-818-199775906040684/AnsiballZ_lineinfile.py
Oct 14 09:53:45 np0005486759.ooo.test sudo[300144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:45 np0005486759.ooo.test python3.9[300146]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:45 np0005486759.ooo.test sudo[300144]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:46 np0005486759.ooo.test sudo[300254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anvvulugrbypvyxljgmxzenrumjquheg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435625.6877863-827-45901466462084/AnsiballZ_setup.py
Oct 14 09:53:46 np0005486759.ooo.test sudo[300254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:46.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:46.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:46.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:46.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:46.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:53:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:46.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:46 np0005486759.ooo.test python3.9[300256]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 14 09:53:46 np0005486759.ooo.test sudo[300254]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:47 np0005486759.ooo.test sudo[300317]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqnrlvdpbsjtpgurnmxlmzjqolwqozmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435625.6877863-827-45901466462084/AnsiballZ_dnf.py
Oct 14 09:53:47 np0005486759.ooo.test sudo[300317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:53:47 np0005486759.ooo.test podman[300320]: 2025-10-14 09:53:47.209608359 +0000 UTC m=+0.091275060 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:53:47 np0005486759.ooo.test podman[300320]: 2025-10-14 09:53:47.218275828 +0000 UTC m=+0.099942499 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:53:47 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:53:47 np0005486759.ooo.test python3.9[300319]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 14 09:53:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36845 DF PROTO=TCP SPT=48118 DPT=9102 SEQ=1885608346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9AABB80000000001030307) 
Oct 14 09:53:50 np0005486759.ooo.test sudo[300317]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36846 DF PROTO=TCP SPT=48118 DPT=9102 SEQ=1885608346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9AAFC10000000001030307) 
Oct 14 09:53:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:51.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:51.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:51 np0005486759.ooo.test python3.9[300450]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 14 09:53:52 np0005486759.ooo.test sudo[300562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrrqvlcbqhjvorqodyhpblrkkutkuzmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435631.8616476-849-70358011232757/AnsiballZ_file.py
Oct 14 09:53:52 np0005486759.ooo.test sudo[300562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:52 np0005486759.ooo.test python3.9[300564]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:53:52 np0005486759.ooo.test sudo[300562]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36847 DF PROTO=TCP SPT=48118 DPT=9102 SEQ=1885608346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9AB7C10000000001030307) 
Oct 14 09:53:53 np0005486759.ooo.test sudo[300672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjurpbwnwjjqyiqafutqofsymxxlftpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435632.7509136-860-45624111381654/AnsiballZ_systemd_service.py
Oct 14 09:53:53 np0005486759.ooo.test sudo[300672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:53:53 np0005486759.ooo.test python3.9[300674]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:53:53 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:53:53 np0005486759.ooo.test systemd-rc-local-generator[300701]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:53:53 np0005486759.ooo.test systemd-sysv-generator[300704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:53:53 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:53:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:53:54.159 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:53:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:53:54.160 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:53:54 np0005486759.ooo.test sudo[300672]: pam_unix(sudo:session): session closed for user root
Oct 14 09:53:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:53:54.161 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:53:54 np0005486759.ooo.test python3.9[300818]: ansible-ansible.builtin.service_facts Invoked
Oct 14 09:53:54 np0005486759.ooo.test network[300835]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 14 09:53:54 np0005486759.ooo.test network[300836]: 'network-scripts' will be removed from distribution in near future.
Oct 14 09:53:54 np0005486759.ooo.test network[300837]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 14 09:53:56 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:53:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:56.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:56.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:53:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:56.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:53:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:56.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:53:56.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:53:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36848 DF PROTO=TCP SPT=48118 DPT=9102 SEQ=1885608346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9AC7810000000001030307) 
Oct 14 09:53:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:53:57 np0005486759.ooo.test systemd[1]: tmp-crun.kSZrRN.mount: Deactivated successfully.
Oct 14 09:53:57 np0005486759.ooo.test podman[300899]: 2025-10-14 09:53:57.635476674 +0000 UTC m=+0.106747451 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 09:53:57 np0005486759.ooo.test podman[300899]: 2025-10-14 09:53:57.651426799 +0000 UTC m=+0.122697586 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 09:53:57 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:53:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:53:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:53:58 np0005486759.ooo.test podman[300926]: 2025-10-14 09:53:58.769328074 +0000 UTC m=+0.083622354 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:53:58 np0005486759.ooo.test podman[300926]: 2025-10-14 09:53:58.779853809 +0000 UTC m=+0.094148079 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:53:58 np0005486759.ooo.test podman[300927]: 2025-10-14 09:53:58.738383874 +0000 UTC m=+0.054237413 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 14 09:53:58 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:53:58 np0005486759.ooo.test podman[300927]: 2025-10-14 09:53:58.821313075 +0000 UTC m=+0.137166664 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:53:58 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:54:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:54:00 np0005486759.ooo.test podman[301002]: 2025-10-14 09:54:00.459165612 +0000 UTC m=+0.073095387 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:54:00 np0005486759.ooo.test podman[301002]: 2025-10-14 09:54:00.471396461 +0000 UTC m=+0.085326296 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:54:00 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:54:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:01.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:01.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:01.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:01.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:01.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:01.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:01 np0005486759.ooo.test sudo[301151]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-civnuymppowaavyyykiqrzivizyfwnfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435641.1033592-879-20100639227830/AnsiballZ_systemd_service.py
Oct 14 09:54:01 np0005486759.ooo.test sudo[301151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:01 np0005486759.ooo.test python3.9[301153]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:01 np0005486759.ooo.test sudo[301151]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:02 np0005486759.ooo.test sudo[301262]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnrjjmggpmjhgmogpdgssgklkrwzjoqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435641.8794343-879-51069406695586/AnsiballZ_systemd_service.py
Oct 14 09:54:02 np0005486759.ooo.test sudo[301262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:02 np0005486759.ooo.test python3.9[301264]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:02 np0005486759.ooo.test sudo[301262]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:02 np0005486759.ooo.test sudo[301373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-antgcxhrwjeschhbctphsaiaauzxcsbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435642.598674-879-234274550780349/AnsiballZ_systemd_service.py
Oct 14 09:54:02 np0005486759.ooo.test sudo[301373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:03 np0005486759.ooo.test python3.9[301375]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:03 np0005486759.ooo.test sudo[301373]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:03 np0005486759.ooo.test sudo[301484]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmvueugyvluwumrrbdkmvtxpschfinum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435643.3185413-879-217855765036415/AnsiballZ_systemd_service.py
Oct 14 09:54:03 np0005486759.ooo.test sudo[301484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:03 np0005486759.ooo.test python3.9[301486]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:03 np0005486759.ooo.test sudo[301484]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:04 np0005486759.ooo.test sudo[301595]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqvaxfttjerpznhkidecccjnssjnktns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435644.0137775-879-176584865668436/AnsiballZ_systemd_service.py
Oct 14 09:54:04 np0005486759.ooo.test sudo[301595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:04 np0005486759.ooo.test python3.9[301597]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:04 np0005486759.ooo.test sudo[301595]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:05 np0005486759.ooo.test sudo[301706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecwipkqrljwkmplfxbvsnfmpntrewcfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435644.808163-879-75188580296881/AnsiballZ_systemd_service.py
Oct 14 09:54:05 np0005486759.ooo.test sudo[301706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:05 np0005486759.ooo.test python3.9[301708]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:05 np0005486759.ooo.test sudo[301706]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:05 np0005486759.ooo.test sudo[301817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvwxjqjuaydqqyltararipyvxkeeqqpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435645.5145636-879-75861320732819/AnsiballZ_systemd_service.py
Oct 14 09:54:05 np0005486759.ooo.test sudo[301817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:54:05 np0005486759.ooo.test podman[301819]: 2025-10-14 09:54:05.911212658 +0000 UTC m=+0.064695187 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:54:05 np0005486759.ooo.test podman[301819]: 2025-10-14 09:54:05.97575805 +0000 UTC m=+0.129240649 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Oct 14 09:54:05 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:54:06 np0005486759.ooo.test python3.9[301820]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:06.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:06.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:07 np0005486759.ooo.test sudo[301817]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:54:07 np0005486759.ooo.test systemd[1]: tmp-crun.4R5eKM.mount: Deactivated successfully.
Oct 14 09:54:07 np0005486759.ooo.test podman[301885]: 2025-10-14 09:54:07.444697349 +0000 UTC m=+0.071788637 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Oct 14 09:54:07 np0005486759.ooo.test podman[301885]: 2025-10-14 09:54:07.460404326 +0000 UTC m=+0.087495624 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6)
Oct 14 09:54:07 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:54:07 np0005486759.ooo.test sudo[301974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntjdtajsswnofvbojbrhtuszouozhpeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435647.3366964-879-164590215093083/AnsiballZ_systemd_service.py
Oct 14 09:54:07 np0005486759.ooo.test sudo[301974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:07 np0005486759.ooo.test python3.9[301976]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:54:07 np0005486759.ooo.test sudo[301974]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:08 np0005486759.ooo.test sudo[302085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwiwuadrdietcnitheeorogoirnaydel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435648.3073766-938-256891734473507/AnsiballZ_file.py
Oct 14 09:54:08 np0005486759.ooo.test sudo[302085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:08 np0005486759.ooo.test python3.9[302087]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:08 np0005486759.ooo.test sudo[302085]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:09 np0005486759.ooo.test sudo[302195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plaejstzgoxgwurvlgnispstdrzkuokk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435648.9662814-938-38912535038708/AnsiballZ_file.py
Oct 14 09:54:09 np0005486759.ooo.test sudo[302195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:09 np0005486759.ooo.test python3.9[302197]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:09 np0005486759.ooo.test sudo[302195]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:09 np0005486759.ooo.test sudo[302305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzipsjegghxvwzlfccdrujlvithydwyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435649.5975683-938-172265137493249/AnsiballZ_file.py
Oct 14 09:54:09 np0005486759.ooo.test sudo[302305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:10 np0005486759.ooo.test python3.9[302307]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:10 np0005486759.ooo.test sudo[302305]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:10 np0005486759.ooo.test sudo[302415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rghphralkylxixnezmagdmhivxsmkblg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435650.2272525-938-74676289343423/AnsiballZ_file.py
Oct 14 09:54:10 np0005486759.ooo.test sudo[302415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:10 np0005486759.ooo.test python3.9[302417]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:10 np0005486759.ooo.test sudo[302415]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:11 np0005486759.ooo.test sudo[302525]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drtdnhbfpzorpkuhfjpaiazuenvkpuux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435650.919463-938-252255459733519/AnsiballZ_file.py
Oct 14 09:54:11 np0005486759.ooo.test sudo[302525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:11 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:11.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:11 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:11.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:11 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:11.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:11 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:11.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:11 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:11.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:11 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:11.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:11 np0005486759.ooo.test python3.9[302527]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:11 np0005486759.ooo.test sudo[302525]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:11 np0005486759.ooo.test sudo[302635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouzcothgdlwzsqsgiupgguyuzjrlqhbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435651.5022995-938-53912936592016/AnsiballZ_file.py
Oct 14 09:54:11 np0005486759.ooo.test sudo[302635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:12 np0005486759.ooo.test python3.9[302637]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:12 np0005486759.ooo.test sudo[302635]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:54:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:54:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:54:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126003 "" "Go-http-client/1.1"
Oct 14 09:54:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:54:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15237 "" "Go-http-client/1.1"
Oct 14 09:54:12 np0005486759.ooo.test sudo[302745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujzohqdpmauedsoxczlcdzcmoucncpyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435652.135706-938-121643810571917/AnsiballZ_file.py
Oct 14 09:54:12 np0005486759.ooo.test sudo[302745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:12 np0005486759.ooo.test python3.9[302747]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:12 np0005486759.ooo.test sudo[302745]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:13 np0005486759.ooo.test sudo[302855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvutjeniauhgftqfjfsjbgfnjyqbcgma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435652.7801936-938-68525293882669/AnsiballZ_file.py
Oct 14 09:54:13 np0005486759.ooo.test sudo[302855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:13 np0005486759.ooo.test python3.9[302857]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:13 np0005486759.ooo.test sudo[302855]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:13 np0005486759.ooo.test sudo[302965]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bugzmzhrveqjirsvvcnrrqlpyoawqfvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435653.5046241-995-142069946980172/AnsiballZ_file.py
Oct 14 09:54:13 np0005486759.ooo.test sudo[302965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:54:13 np0005486759.ooo.test podman[302968]: 2025-10-14 09:54:13.869313143 +0000 UTC m=+0.055497182 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:54:13 np0005486759.ooo.test podman[302968]: 2025-10-14 09:54:13.903367039 +0000 UTC m=+0.089551078 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS)
Oct 14 09:54:13 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:54:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:54:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:54:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:54:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:54:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:54:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:54:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:54:14 np0005486759.ooo.test python3.9[302967]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:14 np0005486759.ooo.test sudo[302965]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:14 np0005486759.ooo.test sudo[303094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-couupjcxmioolavcolcwsxitrtbvvqzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435654.1381457-995-8818641352071/AnsiballZ_file.py
Oct 14 09:54:14 np0005486759.ooo.test sudo[303094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:14 np0005486759.ooo.test python3.9[303096]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:14 np0005486759.ooo.test sudo[303094]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:15 np0005486759.ooo.test sudo[303204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlljferzknqmanizvbhrpfoqkooitgkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435654.8046012-995-190309396232377/AnsiballZ_file.py
Oct 14 09:54:15 np0005486759.ooo.test sudo[303204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:15 np0005486759.ooo.test python3.9[303206]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:15 np0005486759.ooo.test sudo[303204]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:15 np0005486759.ooo.test sudo[303314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebzwmkofuxeqmkfovwmpkknpxowbqquh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435655.5140314-995-94292439461527/AnsiballZ_file.py
Oct 14 09:54:15 np0005486759.ooo.test sudo[303314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:15 np0005486759.ooo.test python3.9[303316]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:16 np0005486759.ooo.test sudo[303314]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:16.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:16.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:16.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:16.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:16.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:16 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:16.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:16 np0005486759.ooo.test sudo[303424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehvzmdbgdmyvrcpaxqxqjfaptlumrmtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435656.235188-995-263762946791241/AnsiballZ_file.py
Oct 14 09:54:16 np0005486759.ooo.test sudo[303424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:16 np0005486759.ooo.test python3.9[303426]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:16 np0005486759.ooo.test sudo[303424]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:17 np0005486759.ooo.test sudo[303534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axqzmfblazmltphrvugbbavskspfnnbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435656.847317-995-185599646228613/AnsiballZ_file.py
Oct 14 09:54:17 np0005486759.ooo.test sudo[303534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:17 np0005486759.ooo.test python3.9[303536]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:17 np0005486759.ooo.test sudo[303534]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:54:17 np0005486759.ooo.test podman[303541]: 2025-10-14 09:54:17.458165727 +0000 UTC m=+0.086580706 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:54:17 np0005486759.ooo.test podman[303541]: 2025-10-14 09:54:17.489998883 +0000 UTC m=+0.118413892 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:54:17 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:54:17 np0005486759.ooo.test sudo[303666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cloxwvvlplfkacqmdnmlfpvdhwxoznqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435657.4649656-995-262067671526888/AnsiballZ_file.py
Oct 14 09:54:17 np0005486759.ooo.test sudo[303666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:17 np0005486759.ooo.test python3.9[303668]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:17 np0005486759.ooo.test sudo[303666]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:18 np0005486759.ooo.test sudo[303776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csmddkybsiiuauetvtbenicdgsyireke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435658.097594-995-50274770141023/AnsiballZ_file.py
Oct 14 09:54:18 np0005486759.ooo.test sudo[303776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:18 np0005486759.ooo.test python3.9[303778]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:18 np0005486759.ooo.test sudo[303776]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:19 np0005486759.ooo.test sudo[303886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yunqadhbgbwxbiqblkuyzbbvmztppsbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435658.877002-1053-171795460565956/AnsiballZ_command.py
Oct 14 09:54:19 np0005486759.ooo.test sudo[303886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:19 np0005486759.ooo.test python3.9[303888]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                           systemctl disable --now certmonger.service
                                                           test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                         fi
                                                          _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:19 np0005486759.ooo.test sudo[303886]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3476 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2149301591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9B20E80000000001030307) 
Oct 14 09:54:20 np0005486759.ooo.test python3.9[303998]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 14 09:54:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3477 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2149301591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9B25010000000001030307) 
Oct 14 09:54:20 np0005486759.ooo.test sudo[304106]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krvltwkhkrpztvskfkboaeazjoossfou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435660.5531182-1071-66408044589537/AnsiballZ_systemd_service.py
Oct 14 09:54:20 np0005486759.ooo.test sudo[304106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:21 np0005486759.ooo.test python3.9[304108]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 14 09:54:21 np0005486759.ooo.test systemd[1]: Reloading.
Oct 14 09:54:21 np0005486759.ooo.test systemd-rc-local-generator[304133]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 14 09:54:21 np0005486759.ooo.test systemd-sysv-generator[304138]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 14 09:54:21 np0005486759.ooo.test systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Oct 14 09:54:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:21.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:21.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:21.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:21.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:21.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:21 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:21.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:21 np0005486759.ooo.test sudo[304106]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:22 np0005486759.ooo.test sudo[304253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzfyndrulmyiesdopahyctqwthablegi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435661.7477767-1079-194330057937728/AnsiballZ_command.py
Oct 14 09:54:22 np0005486759.ooo.test sudo[304253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:22 np0005486759.ooo.test python3.9[304255]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:22 np0005486759.ooo.test sudo[304253]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3478 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2149301591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9B2D010000000001030307) 
Oct 14 09:54:22 np0005486759.ooo.test sudo[304364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oerwowgvfkinzzkhsabpzgawekxrxefp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435662.5883057-1079-236609943747450/AnsiballZ_command.py
Oct 14 09:54:22 np0005486759.ooo.test sudo[304364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:23 np0005486759.ooo.test python3.9[304366]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:23 np0005486759.ooo.test sudo[304364]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:23 np0005486759.ooo.test sudo[304475]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeyagithlbctvlloztkjlqhymkjpzcxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435663.2642083-1079-120736291796282/AnsiballZ_command.py
Oct 14 09:54:23 np0005486759.ooo.test sudo[304475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:23 np0005486759.ooo.test python3.9[304477]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:23 np0005486759.ooo.test sudo[304475]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:24 np0005486759.ooo.test sudo[304586]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohwcedjssgpqunpgatgndjjmlzivmwce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435663.8462198-1079-13613229216984/AnsiballZ_command.py
Oct 14 09:54:24 np0005486759.ooo.test sudo[304586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:24 np0005486759.ooo.test python3.9[304588]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:24 np0005486759.ooo.test sudo[304586]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.448 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.453 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14629c9e-d13d-4f02-abd0-7c373d13991f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.449248', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3cd39ac-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': '0b94225f97d41512bafa78b707316746a81f804e8fa18625fd3dd9d922bdb9c7'}]}, 'timestamp': '2025-10-14 09:54:24.453677', '_unique_id': 'f74db9f1147b4f62bd60d85258142338'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.454 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.455 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9d6b2c2-2aeb-4575-b6e6-5b715059c0fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.455737', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3cd9802-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': '4fde5b31fb2e4d4e4fd091cf5379187506fd14da815b9de9e487212a80054314'}]}, 'timestamp': '2025-10-14 09:54:24.456068', '_unique_id': '47f2fc7ec24a4080a4525c74e25de2fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.456 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.486 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.487 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c1fae8c-95f7-4710-9a4c-bc163a6b6304', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.457456', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3d259aa-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': 'f279c803d8c439f8ea59901d6c77f7612b2fe269c415e526d86581ddaf1b1075'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.457456', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3d26738-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '9de6da790339dd122fd085c47282899804b6a6bd71382139ccb927a1c71d7734'}]}, 'timestamp': '2025-10-14 09:54:24.487550', '_unique_id': 'fbaceeeae0174d13b7edddbf894273fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.489 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.509 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 52.17578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd790cdd8-d4b2-417a-bf1b-8f68658f8b76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.17578125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:54:24.489454', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c3d5cbda-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.704378743, 'message_signature': '6f4b35ac89317abf7e8f2cf0dc89f07ddf7936a0d7e60a028e932186c5c4e509'}]}, 'timestamp': '2025-10-14 09:54:24.509798', '_unique_id': '901a94cce70c4f0c8c9f252607b86ad9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.510 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 9773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f48e343-32ea-4b2b-9733-da4a1a903aa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9773, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.511265', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3d60fdc-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': 'f21d9d14284c6de1896b2106b3668819422dde36edd0ae407e3c666304840741'}]}, 'timestamp': '2025-10-14 09:54:24.511495', '_unique_id': 'bac550d431c440d1a54750865a59a743'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.512 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 513177663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.512 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 75228955 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8e6a9a3-a7c1-4ec3-b773-147dd8c2e6fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513177663, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.512491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3d63f70-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '9e4f1186f8025c1abc9750d8c989ee35ac7fa79f79e477ba6534e5c089cdf21d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75228955, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.512491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3d646c8-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '802ad5d628631d6195eabf194fb8e7d961c41f9170328090ef63f4599a32f131'}]}, 'timestamp': '2025-10-14 09:54:24.512882', '_unique_id': '346bf82768294636b31b8c4cf6f6f27d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.513 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.529 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.529 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ab3b359-ad14-46f6-9593-84af40e84d13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.513972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3d8d7ee-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.709031998, 'message_signature': '9fd576742fdf6512b6434a62d5a6a5ee2ac671a34c87c40b377e093681460cdd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.513972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3d8dff0-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.709031998, 'message_signature': '915e8e060a287d3d2da2438139f7b808d50f60356d32fe607020a560a7396dff'}]}, 'timestamp': '2025-10-14 09:54:24.529910', '_unique_id': '0bb49db91db64935a54829ce829045db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8e3d324-026e-49d6-a71f-4c12f1f37dc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.530997', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3d91272-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': 'ec46e9ee34f3f23478e7176fce69ae7156ff53cb9e227605b0edebde5a562eb4'}]}, 'timestamp': '2025-10-14 09:54:24.531218', '_unique_id': '8e8538f655fe42b7b92d58c8a0744bd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.532 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.532 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31129600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.532 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd80cb3ac-ad29-451a-87b7-010eae777abe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31129600, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.532211', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3d941a2-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.709031998, 'message_signature': 'a4df86a198ff6d0714d058fec0917d0158442c603ffeb375ecefd1c8f1510c7b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.532211', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3d948fa-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.709031998, 'message_signature': '47efc612407f09e33607356b92274426e38fe2ae6359d0fcc076874bdbce943d'}]}, 'timestamp': '2025-10-14 09:54:24.532598', '_unique_id': '86049e642b2c4a39823c119e1af16ae7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.533 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cda7ffc8-63d3-432b-971e-30aa0e4ffcb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.533601', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3d97816-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': '27eee021da749a2beb5d03531a2e36107ad6e780236caf29c7d6c3ddeca2ebc3'}]}, 'timestamp': '2025-10-14 09:54:24.533818', '_unique_id': '0d1744d56f5547f0a503f82d7b5165bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.534 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e792cd28-a39e-4969-99dc-76bf33f69d51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.534881', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3d9aaa2-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': 'c13a004e74a21f05419c6fb099baa48f3ee83ff40764e4ff9f50fa509efcffb3'}]}, 'timestamp': '2025-10-14 09:54:24.535114', '_unique_id': '2f165b50a64f4930a47f60d5542bb7fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.535 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 53350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6143814b-96d6-4bbe-8e32-5dc24f5031a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53350000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:54:24.536101', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c3d9d98c-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.704378743, 'message_signature': '9daa21db5fc1235f5c9566b4b5b255d911a62380139989ee42e14d474cdfe9e5'}]}, 'timestamp': '2025-10-14 09:54:24.536306', '_unique_id': '2cb5338722ed46a09ed4ceb26172bace'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b45a692-1744-4336-990c-7eec4fd1fc01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.537338', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3da09d4-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': '54e5169b7d09b0e4252f6d84bf5aac469a7c31d7cec2c8ab7316886d2012fb28'}]}, 'timestamp': '2025-10-14 09:54:24.537548', '_unique_id': '0c07ed8cef2a4403bd78fe61679dc55e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.538 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.538 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64194848-1110-4ad9-936f-50c5caae6d54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.538531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3da386e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': 'd45b54c09e3058ba0428e44d73b4c5f4e4ef0d2bd7ae4cbcda37d5843954705b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.538531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3da3f94-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '6491b3c093f39408edd9cfb590217bc8e521d2550b4b154289f8442a0aa9c0e0'}]}, 'timestamp': '2025-10-14 09:54:24.538912', '_unique_id': '82e69c1406574968a41e0d28e55f1177'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.539 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee9439b9-1bbd-41e7-b517-30f025eff5e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.539889', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3da6e4c-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '27f83d44071a3263cfc19b45662854aeaf302030a05e9b32654566021ca2add6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.539889', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3da759a-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': 'cbed89021e898a3ac1ac534ac8f0cd08f98d2b4931e84ed91f8c054d56775ce2'}]}, 'timestamp': '2025-10-14 09:54:24.540294', '_unique_id': '9101046a83144662a7801a4507db9e4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57073374-86e0-42e1-baf1-7ce3f14f6f84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.541273', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3daa3a8-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': '8626cf967a5eb795ef8382a9d689fb9d626ec440ce53797bda6614812b661dcd'}]}, 'timestamp': '2025-10-14 09:54:24.541500', '_unique_id': '95dcf3a7b7d24e8096d047d02336d3d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.542 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.542 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 1288814026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.542 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 10812347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b77b3de-ba16-49ad-bc4b-1a7636b80404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1288814026, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.542505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3dad3be-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '39604772cc8ded329bd0eea46fb3b959511b34462e7c8788eda3743099e5fe0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10812347, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.542505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3dadb02-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '7858cb2b7fdee0cfd79d647cbfe7298625213bbb49c9447cc00129ef39a64716'}]}, 'timestamp': '2025-10-14 09:54:24.542890', '_unique_id': 'eca0d1af1c784862a60c7838f63c38cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.543 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e5c0bc2-56d3-4055-b09b-3fbd8f39d23d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 591, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.543880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3db0a28-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '6b142240d6dbf827e02f468f543b2ff822842a31063087daa4b334eea5491bc6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.543880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3db116c-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.652536316, 'message_signature': '7cafd868f3e47a041844ab98271961f41ea742a73f328461d26f0eddeaca4522'}]}, 'timestamp': '2025-10-14 09:54:24.544284', '_unique_id': '2fc64ac84ddf4fb58d7b9bbe12a67686'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.545 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.545 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '306161e9-b3cf-4fbc-b596-50d58bda9598', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.545378', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3db443e-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': '22e24ef5c40cbc1dc61b1e1124eb38798e0075b2e296106551828c1184b0de34'}]}, 'timestamp': '2025-10-14 09:54:24.545598', '_unique_id': 'b9ae3c2f580342deb444821e31eac7a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.546 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd4cb140-6978-4055-953e-0c5d8516939b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:54:24.546592', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'c3db7350-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.644313301, 'message_signature': '89862a2889eff4c271a7d3b59c34759a65a625cdc0c560109ceb8b4d0d0babab'}]}, 'timestamp': '2025-10-14 09:54:24.546802', '_unique_id': '49703c1a29c544e0aacd4470f3d9f99e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.547 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f80398bc-4cdb-4c28-bb89-d3f51c96b9ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:54:24.547749', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3dba096-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.709031998, 'message_signature': 'bc9d509f9079342a396b01f5f010a94eb9a2696c24c1d45822b92307fba947f1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:54:24.547749', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3dba8a2-a8e3-11f0-b515-fa163eba5220', 'monotonic_time': 11684.709031998, 'message_signature': 'a0b5cc2cbebcc54954756ef42942c7e445dcfafaa723281a691f49fc5d114c3e'}]}, 'timestamp': '2025-10-14 09:54:24.548157', '_unique_id': 'ff19308e11f3422493f6dfb2b8992d42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:54:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:54:24.549 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:54:24 np0005486759.ooo.test sudo[304697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugnnfryjcsogndnhcncuxzipqdrcjxdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435664.4455767-1079-59364565024276/AnsiballZ_command.py
Oct 14 09:54:24 np0005486759.ooo.test sudo[304697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:24 np0005486759.ooo.test python3.9[304699]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:25 np0005486759.ooo.test sudo[304697]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:25 np0005486759.ooo.test sudo[304808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfjcgxjcfkuapxlxbwlmkleaznmlzsej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435665.12972-1079-258739068891941/AnsiballZ_command.py
Oct 14 09:54:25 np0005486759.ooo.test sudo[304808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:25 np0005486759.ooo.test python3.9[304810]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:25 np0005486759.ooo.test sudo[304808]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:25 np0005486759.ooo.test sudo[304919]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejrzvgmjmbgmztthegzpsgpbbzhnmlol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435665.7200289-1079-160440382342739/AnsiballZ_command.py
Oct 14 09:54:25 np0005486759.ooo.test sudo[304919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:26 np0005486759.ooo.test python3.9[304921]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:26 np0005486759.ooo.test sudo[304919]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:26.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:26.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:26.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:26.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:26.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:26 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:26.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:26 np0005486759.ooo.test sudo[305030]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roofpzbikxdnqrtenoqznxiadctsygtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435666.3312163-1079-103099013793516/AnsiballZ_command.py
Oct 14 09:54:26 np0005486759.ooo.test sudo[305030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3479 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2149301591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9B3CC10000000001030307) 
Oct 14 09:54:26 np0005486759.ooo.test python3.9[305032]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:54:26 np0005486759.ooo.test sudo[305030]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:27 np0005486759.ooo.test sudo[305141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgdabykaqreobluhnsizcptokqvwlqzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435667.6715696-1158-123342489895168/AnsiballZ_file.py
Oct 14 09:54:27 np0005486759.ooo.test sudo[305141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:54:28 np0005486759.ooo.test podman[305144]: 2025-10-14 09:54:28.061090042 +0000 UTC m=+0.093560182 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute)
Oct 14 09:54:28 np0005486759.ooo.test podman[305144]: 2025-10-14 09:54:28.070404161 +0000 UTC m=+0.102874261 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2)
Oct 14 09:54:28 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:54:28 np0005486759.ooo.test python3.9[305143]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:28 np0005486759.ooo.test sudo[305141]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:28 np0005486759.ooo.test sudo[305270]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpsbnywklovcczejqokddjbasaskblcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435668.3547232-1158-234468735652153/AnsiballZ_file.py
Oct 14 09:54:28 np0005486759.ooo.test sudo[305270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:28 np0005486759.ooo.test python3.9[305272]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:28 np0005486759.ooo.test sudo[305270]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:29 np0005486759.ooo.test sudo[305380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uibsvphtyehisipirleyauolokcavxax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435668.9778728-1158-163681361100323/AnsiballZ_file.py
Oct 14 09:54:29 np0005486759.ooo.test sudo[305380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:54:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:54:29 np0005486759.ooo.test systemd[1]: tmp-crun.Ti6Keo.mount: Deactivated successfully.
Oct 14 09:54:29 np0005486759.ooo.test systemd[1]: tmp-crun.P3XunO.mount: Deactivated successfully.
Oct 14 09:54:29 np0005486759.ooo.test podman[305383]: 2025-10-14 09:54:29.389158243 +0000 UTC m=+0.100704894 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, io.buildah.version=1.41.3)
Oct 14 09:54:29 np0005486759.ooo.test podman[305384]: 2025-10-14 09:54:29.353235269 +0000 UTC m=+0.066550615 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 09:54:29 np0005486759.ooo.test podman[305383]: 2025-10-14 09:54:29.425462509 +0000 UTC m=+0.137009120 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible)
Oct 14 09:54:29 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:54:29 np0005486759.ooo.test podman[305384]: 2025-10-14 09:54:29.436303715 +0000 UTC m=+0.149619011 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:54:29 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:54:29 np0005486759.ooo.test python3.9[305382]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:29 np0005486759.ooo.test sudo[305380]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:30 np0005486759.ooo.test sudo[305529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmrqlmdtffcmlxbxwlcivalcmecebhbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435669.722316-1180-110524271275559/AnsiballZ_file.py
Oct 14 09:54:30 np0005486759.ooo.test sudo[305529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:30 np0005486759.ooo.test python3.9[305531]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:30 np0005486759.ooo.test sudo[305529]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:30 np0005486759.ooo.test sudo[305639]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtplkvzgegyxqcftowsedopynteinhin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435670.413232-1180-208710446492813/AnsiballZ_file.py
Oct 14 09:54:30 np0005486759.ooo.test sudo[305639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:54:30 np0005486759.ooo.test podman[305642]: 2025-10-14 09:54:30.815396218 +0000 UTC m=+0.075278066 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:54:30 np0005486759.ooo.test podman[305642]: 2025-10-14 09:54:30.845202402 +0000 UTC m=+0.105084250 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:54:30 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:54:30 np0005486759.ooo.test python3.9[305641]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:30 np0005486759.ooo.test sudo[305639]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:31 np0005486759.ooo.test sudo[305772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsiwwnycqpfzfedjnuomuintnikbonkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435671.0876677-1180-42981680736616/AnsiballZ_file.py
Oct 14 09:54:31 np0005486759.ooo.test sudo[305772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:31.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:31.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:31.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:31.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:31.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:31 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:31.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:31 np0005486759.ooo.test python3.9[305774]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:31 np0005486759.ooo.test sudo[305772]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:31 np0005486759.ooo.test sudo[305882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srojaonjujilmysudhjcowhgwkldvlok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435671.7168303-1180-165142554620842/AnsiballZ_file.py
Oct 14 09:54:31 np0005486759.ooo.test sudo[305882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:32 np0005486759.ooo.test python3.9[305884]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:32 np0005486759.ooo.test sudo[305882]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:32 np0005486759.ooo.test sudo[305992]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfifekrfnalqdsyfrvpomoxcdyyzymwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435672.3454573-1180-218352209786746/AnsiballZ_file.py
Oct 14 09:54:32 np0005486759.ooo.test sudo[305992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:32 np0005486759.ooo.test python3.9[305994]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:32 np0005486759.ooo.test sudo[305992]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:33 np0005486759.ooo.test sudo[306102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulpimqvpopdaoilsajggtnladgmawzba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435672.9966435-1180-235153935354685/AnsiballZ_file.py
Oct 14 09:54:33 np0005486759.ooo.test sudo[306102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:33.309 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:33 np0005486759.ooo.test python3.9[306104]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:33 np0005486759.ooo.test sudo[306102]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:33.496 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:33 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:33.497 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:54:33 np0005486759.ooo.test sudo[306212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lspcouigfkpqjovavfvqhtmzwgbjvwku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435673.621395-1180-10173144210338/AnsiballZ_file.py
Oct 14 09:54:33 np0005486759.ooo.test sudo[306212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:34 np0005486759.ooo.test python3.9[306214]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:34 np0005486759.ooo.test sudo[306212]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:34 np0005486759.ooo.test sudo[306322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlbotnxxsvtdbndmgrrxzpgcexylvtoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435674.310954-1180-21046717597164/AnsiballZ_file.py
Oct 14 09:54:34 np0005486759.ooo.test sudo[306322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:34 np0005486759.ooo.test python3.9[306324]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:34 np0005486759.ooo.test sudo[306322]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:35 np0005486759.ooo.test sudo[306432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldotvospesibsdvphdwuegovkfxxslve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435674.9660802-1180-11323705789996/AnsiballZ_file.py
Oct 14 09:54:35 np0005486759.ooo.test sudo[306432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:35 np0005486759.ooo.test python3.9[306434]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:35 np0005486759.ooo.test sudo[306432]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:35 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:35.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:54:36 np0005486759.ooo.test podman[306452]: 2025-10-14 09:54:36.456632401 +0000 UTC m=+0.083669625 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0)
Oct 14 09:54:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:36.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:36 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:36.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:36 np0005486759.ooo.test podman[306452]: 2025-10-14 09:54:36.525445075 +0000 UTC m=+0.152482229 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 09:54:36 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:54:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:37.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:37 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:37.499 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:54:38 np0005486759.ooo.test podman[306478]: 2025-10-14 09:54:38.457261167 +0000 UTC m=+0.083690526 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Oct 14 09:54:38 np0005486759.ooo.test podman[306478]: 2025-10-14 09:54:38.498363072 +0000 UTC m=+0.124792401 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7)
Oct 14 09:54:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:38.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:38.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:54:38 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:38.498 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:54:38 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:54:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:39.110 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:54:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:39.110 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:54:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:39.111 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:54:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:39.111 2 DEBUG nova.objects.instance [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:54:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:39.576 2 DEBUG nova.network.neutron [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:54:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:39.593 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:54:39 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:39.593 2 DEBUG nova.compute.manager [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:54:40 np0005486759.ooo.test sudo[306588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewiowmimyjnzzfenijavrqytjtiaqczi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435679.866972-1363-279317588116049/AnsiballZ_getent.py
Oct 14 09:54:40 np0005486759.ooo.test sudo[306588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:40 np0005486759.ooo.test python3.9[306590]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 14 09:54:40 np0005486759.ooo.test sudo[306588]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:40 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:40.588 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:41 np0005486759.ooo.test sshd[306609]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:54:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:41.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:41 np0005486759.ooo.test sshd[306609]: Accepted publickey for zuul from 192.168.122.30 port 46148 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:54:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:41.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:41.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:41.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:41 np0005486759.ooo.test systemd-logind[759]: New session 44 of user zuul.
Oct 14 09:54:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:41.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:41 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:41.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:41 np0005486759.ooo.test systemd[1]: Started Session 44 of User zuul.
Oct 14 09:54:41 np0005486759.ooo.test sshd[306609]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:54:41 np0005486759.ooo.test sshd[306612]: Received disconnect from 192.168.122.30 port 46148:11: disconnected by user
Oct 14 09:54:41 np0005486759.ooo.test sshd[306612]: Disconnected from user zuul 192.168.122.30 port 46148
Oct 14 09:54:41 np0005486759.ooo.test sshd[306609]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:54:41 np0005486759.ooo.test systemd[1]: session-44.scope: Deactivated successfully.
Oct 14 09:54:41 np0005486759.ooo.test systemd-logind[759]: Session 44 logged out. Waiting for processes to exit.
Oct 14 09:54:41 np0005486759.ooo.test systemd-logind[759]: Removed session 44.
Oct 14 09:54:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:54:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:54:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:54:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126003 "" "Go-http-client/1.1"
Oct 14 09:54:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:54:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15243 "" "Go-http-client/1.1"
Oct 14 09:54:42 np0005486759.ooo.test python3.9[306720]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.497 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.498 2 DEBUG oslo_service.periodic_task [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.522 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.522 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.522 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.522 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.598 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.668 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.670 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.737 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.738 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.786 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.788 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:54:42 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:42.829 2 DEBUG oslo_concurrency.processutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:54:42 np0005486759.ooo.test python3.9[306813]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435681.8765001-1390-48882317158269/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.004 2 WARNING nova.virt.libvirt.driver [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.006 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12481MB free_disk=386.7161178588867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.006 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.007 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.092 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.093 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.093 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.147 2 DEBUG nova.compute.provider_tree [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.163 2 DEBUG nova.scheduler.client.report [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.165 2 DEBUG nova.compute.resource_tracker [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:54:43 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:43.165 2 DEBUG oslo_concurrency.lockutils [None req-ae9df62a-48cc-4e94-93f4-62e5a861fef7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:54:43 np0005486759.ooo.test python3.9[306926]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:54:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:54:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:54:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:54:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:54:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:54:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:54:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:54:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:54:44 np0005486759.ooo.test python3.9[306981]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:54:44 np0005486759.ooo.test podman[307053]: 2025-10-14 09:54:44.460328261 +0000 UTC m=+0.083499631 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible)
Oct 14 09:54:44 np0005486759.ooo.test podman[307053]: 2025-10-14 09:54:44.496295096 +0000 UTC m=+0.119466366 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:54:44 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:54:44 np0005486759.ooo.test python3.9[307109]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:54:45 np0005486759.ooo.test python3.9[307195]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435684.204495-1390-212604014098843/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:45 np0005486759.ooo.test python3.9[307303]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:54:46 np0005486759.ooo.test python3.9[307389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435685.427978-1390-225791755147508/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=158960547ad6af7ca0183dbf7d845472651d1682 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:46.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:46.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:46.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:46.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:46.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:46 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:46.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:47 np0005486759.ooo.test python3.9[307497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:54:47 np0005486759.ooo.test python3.9[307583]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760435686.7180517-1390-78452303737626/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:48 np0005486759.ooo.test sudo[307691]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toojhlqeoxtydgkzxhvaxxzmlmplnngx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435687.9589577-1459-196491828476859/AnsiballZ_file.py
Oct 14 09:54:48 np0005486759.ooo.test sudo[307691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:54:48 np0005486759.ooo.test podman[307693]: 2025-10-14 09:54:48.306166653 +0000 UTC m=+0.064513612 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:54:48 np0005486759.ooo.test podman[307693]: 2025-10-14 09:54:48.315181442 +0000 UTC m=+0.073528391 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:54:48 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:54:48 np0005486759.ooo.test python3.9[307694]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:48 np0005486759.ooo.test sudo[307691]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:48 np0005486759.ooo.test sudo[307822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmefclrlorzelowlhhfmyzszwqdxbdat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435688.608899-1467-221051400898461/AnsiballZ_copy.py
Oct 14 09:54:48 np0005486759.ooo.test sudo[307822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:49 np0005486759.ooo.test python3.9[307824]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:49 np0005486759.ooo.test sudo[307822]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44845 DF PROTO=TCP SPT=41378 DPT=9102 SEQ=3144154148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9B96180000000001030307) 
Oct 14 09:54:49 np0005486759.ooo.test sudo[307932]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjojdzongwjuofpniuydimmdfffmbpgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435689.230721-1475-144650221293440/AnsiballZ_stat.py
Oct 14 09:54:49 np0005486759.ooo.test sudo[307932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:49 np0005486759.ooo.test python3.9[307934]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:54:49 np0005486759.ooo.test sudo[307932]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:50 np0005486759.ooo.test sudo[308044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwuffsqszlbvmeebtdluaysugkcbyuuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435689.9642026-1484-74509386326775/AnsiballZ_file.py
Oct 14 09:54:50 np0005486759.ooo.test sudo[308044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:50 np0005486759.ooo.test python3.9[308046]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:54:50 np0005486759.ooo.test sudo[308044]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44846 DF PROTO=TCP SPT=41378 DPT=9102 SEQ=3144154148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9B9A010000000001030307) 
Oct 14 09:54:51 np0005486759.ooo.test python3.9[308154]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:54:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:51.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:51.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:51.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:51.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:51.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:51 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:51.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:51 np0005486759.ooo.test python3.9[308264]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:54:52 np0005486759.ooo.test python3.9[308319]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44847 DF PROTO=TCP SPT=41378 DPT=9102 SEQ=3144154148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9BA2010000000001030307) 
Oct 14 09:54:53 np0005486759.ooo.test python3.9[308427]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 14 09:54:53 np0005486759.ooo.test python3.9[308482]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 14 09:54:53 np0005486759.ooo.test sudo[308590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acwpiqirrkvloqyuhjgmzfbrgolhkltn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435693.6951907-1527-278043565769843/AnsiballZ_container_config_data.py
Oct 14 09:54:53 np0005486759.ooo.test sudo[308590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:54:54.160 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:54:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:54:54.161 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:54:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:54:54.162 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:54:54 np0005486759.ooo.test python3.9[308592]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 14 09:54:54 np0005486759.ooo.test sudo[308590]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:54 np0005486759.ooo.test sudo[308700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flmtyyegeregjgafynbyupjihemdubrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435694.4612944-1536-240735953845046/AnsiballZ_container_config_hash.py
Oct 14 09:54:54 np0005486759.ooo.test sudo[308700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:55 np0005486759.ooo.test python3.9[308702]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:54:55 np0005486759.ooo.test sudo[308700]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:55 np0005486759.ooo.test sudo[308810]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spfboabqmljsihpbzamwkhkovfseciyk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760435695.3016574-1546-88353612286944/AnsiballZ_edpm_container_manage.py
Oct 14 09:54:55 np0005486759.ooo.test sudo[308810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:55 np0005486759.ooo.test python3[308812]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:54:56 np0005486759.ooo.test python3[308812]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "b5b57d3572ac74b7c41332c066527d5039dbd47e134e43d7cb5d76b7732d99f5",
                                                                 "Digest": "sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-13T12:50:19.385564198Z",
                                                                 "Config": {
                                                                      "User": "nova",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 1207014273,
                                                                 "VirtualSize": 1207014273,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",
                                                                           "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",
                                                                           "sha256:e0ba9b00dd1340fa4eba9e9cd5f316c11381d47a31460e5b834a6ca56f60033f",
                                                                           "sha256:731e9354c974a424a2f6724faa85f84baef270eb006be0de18bbdc87ff420f97"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "nova",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843286399Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843354051Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843394192Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843417133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843442193Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843461914Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:43.236856724Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:17.539596691Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.007092512Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.334560883Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.713915587Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.426474494Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.742526819Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.072068096Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.376327744Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.639696917Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.946940986Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.329166855Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.709072452Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.066214819Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.407947122Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.744473297Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.044338828Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.376253048Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:29.890793292Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.186632274Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.418527973Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:31.913162322Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817436155Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817485046Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817496507Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817505987Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:34.821748777Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:00.340362183Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:40.80916313Z",
                                                                           "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:43.984050021Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:20.872493025Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:21.523603796Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:21.810108901Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:34.864836738Z",
                                                                           "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:43.551617349Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:59.074531506Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:18.664061292Z",
                                                                           "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.027951629Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.382890946Z",
                                                                           "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.382951197Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:25.718273507Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 14 09:54:56 np0005486759.ooo.test sudo[308810]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44848 DF PROTO=TCP SPT=41378 DPT=9102 SEQ=3144154148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9BB1C20000000001030307) 
Oct 14 09:54:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:56.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:56.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:54:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:56.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:54:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:56.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:56.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:54:56 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:54:56.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:54:56 np0005486759.ooo.test sudo[308979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqzrvoxaqevlywqdlbyxtvkuymtfvtly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435696.5054996-1554-176927736800425/AnsiballZ_stat.py
Oct 14 09:54:56 np0005486759.ooo.test sudo[308979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:57 np0005486759.ooo.test python3.9[308981]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:54:57 np0005486759.ooo.test sudo[308979]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:57 np0005486759.ooo.test sudo[309091]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phfmfzyexxhmwrhgkqivlywexnpzaspg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435697.447296-1566-270623350125989/AnsiballZ_container_config_data.py
Oct 14 09:54:57 np0005486759.ooo.test sudo[309091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:57 np0005486759.ooo.test python3.9[309093]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 14 09:54:57 np0005486759.ooo.test sudo[309091]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:54:58 np0005486759.ooo.test podman[309171]: 2025-10-14 09:54:58.455468772 +0000 UTC m=+0.083547742 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:54:58 np0005486759.ooo.test podman[309171]: 2025-10-14 09:54:58.468193736 +0000 UTC m=+0.096272726 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:54:58 np0005486759.ooo.test sudo[309214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgrklcqdfetojbnpxvsriuirduevxuis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435698.1847305-1575-82221218098165/AnsiballZ_container_config_hash.py
Oct 14 09:54:58 np0005486759.ooo.test sudo[309214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:58 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:54:58 np0005486759.ooo.test python3.9[309222]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 14 09:54:58 np0005486759.ooo.test sudo[309214]: pam_unix(sudo:session): session closed for user root
Oct 14 09:54:59 np0005486759.ooo.test sudo[309330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvwrofchntlehhvltboongrenguujxnw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760435698.9863243-1585-231308974202574/AnsiballZ_edpm_container_manage.py
Oct 14 09:54:59 np0005486759.ooo.test sudo[309330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:54:59 np0005486759.ooo.test python3[309332]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 14 09:54:59 np0005486759.ooo.test python3[309332]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                            {
                                                                 "Id": "b5b57d3572ac74b7c41332c066527d5039dbd47e134e43d7cb5d76b7732d99f5",
                                                                 "Digest": "sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f",
                                                                 "RepoTags": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                 ],
                                                                 "RepoDigests": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:6cdce1b6b9f1175545fa217f885c1a3360bebe7d9975584481a6ff221f3ad48f"
                                                                 ],
                                                                 "Parent": "",
                                                                 "Comment": "",
                                                                 "Created": "2025-10-13T12:50:19.385564198Z",
                                                                 "Config": {
                                                                      "User": "nova",
                                                                      "Env": [
                                                                           "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                           "LANG=en_US.UTF-8",
                                                                           "TZ=UTC",
                                                                           "container=oci"
                                                                      ],
                                                                      "Entrypoint": [
                                                                           "dumb-init",
                                                                           "--single-child",
                                                                           "--"
                                                                      ],
                                                                      "Cmd": [
                                                                           "kolla_start"
                                                                      ],
                                                                      "Labels": {
                                                                           "io.buildah.version": "1.41.3",
                                                                           "maintainer": "OpenStack Kubernetes Operator team",
                                                                           "org.label-schema.build-date": "20251009",
                                                                           "org.label-schema.license": "GPLv2",
                                                                           "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                           "org.label-schema.schema-version": "1.0",
                                                                           "org.label-schema.vendor": "CentOS",
                                                                           "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "tcib_managed": "true"
                                                                      },
                                                                      "StopSignal": "SIGTERM"
                                                                 },
                                                                 "Version": "",
                                                                 "Author": "",
                                                                 "Architecture": "amd64",
                                                                 "Os": "linux",
                                                                 "Size": 1207014273,
                                                                 "VirtualSize": 1207014273,
                                                                 "GraphDriver": {
                                                                      "Name": "overlay",
                                                                      "Data": {
                                                                           "LowerDir": "/var/lib/containers/storage/overlay/512b226761ef17c0044cb14b83718aa6f9984afb51b1aeb63112d22d2fdccb36/diff:/var/lib/containers/storage/overlay/0accaf46e2ca98f20a95b21cea4fb623de0e5378cb14b163bca0a8771d84c861/diff:/var/lib/containers/storage/overlay/ab64777085904da680013c790c3f2c65f0b954578737ec4d7fa836f56655c34a/diff:/var/lib/containers/storage/overlay/f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424/diff",
                                                                           "UpperDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/diff",
                                                                           "WorkDir": "/var/lib/containers/storage/overlay/5ce6c5d0cc60f856680938093014249abcf9a107a94355720d953b1d1e7f1bfe/work"
                                                                      }
                                                                 },
                                                                 "RootFS": {
                                                                      "Type": "layers",
                                                                      "Layers": [
                                                                           "sha256:f3f40f6483bf6d587286da9e86e40878c2aaaf723da5aa2364fff24f5ea28424",
                                                                           "sha256:2c35d1af0a6e73cbcf6c04a576d2e6a150aeaa6ae9408c81b2003edd71d6ae59",
                                                                           "sha256:3ad61591f8d467f7db4e096e1991f274fe1d4f8ad685b553dacb57c5e894eab0",
                                                                           "sha256:e0ba9b00dd1340fa4eba9e9cd5f316c11381d47a31460e5b834a6ca56f60033f",
                                                                           "sha256:731e9354c974a424a2f6724faa85f84baef270eb006be0de18bbdc87ff420f97"
                                                                      ]
                                                                 },
                                                                 "Labels": {
                                                                      "io.buildah.version": "1.41.3",
                                                                      "maintainer": "OpenStack Kubernetes Operator team",
                                                                      "org.label-schema.build-date": "20251009",
                                                                      "org.label-schema.license": "GPLv2",
                                                                      "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                      "org.label-schema.schema-version": "1.0",
                                                                      "org.label-schema.vendor": "CentOS",
                                                                      "tcib_build_tag": "1e4eeec18f8da2b364b39b7a7358aef5",
                                                                      "tcib_managed": "true"
                                                                 },
                                                                 "Annotations": {},
                                                                 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                 "User": "nova",
                                                                 "History": [
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.867908726Z",
                                                                           "created_by": "/bin/sh -c #(nop) ADD file:b2e608b9da8e087a764c2aebbd9c2cc9181047f5b301f1dab77fdf098a28268b in / ",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:03.868015697Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251009\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-09T00:18:07.890794359Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843286399Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                           "comment": "FROM quay.io/centos/centos:stream9",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843354051Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843394192Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843417133Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843442193Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:42.843461914Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:28:43.236856724Z",
                                                                           "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:17.539596691Z",
                                                                           "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.007092512Z",
                                                                           "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.334560883Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:21.713915587Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.426474494Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:22.742526819Z",
                                                                           "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.072068096Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.376327744Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.639696917Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:23.946940986Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.329166855Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:24.709072452Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.066214819Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.407947122Z",
                                                                           "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:25.744473297Z",
                                                                           "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.044338828Z",
                                                                           "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:26.376253048Z",
                                                                           "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:29.890793292Z",
                                                                           "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.186632274Z",
                                                                           "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:30.418527973Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:31.913162322Z",
                                                                           "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817436155Z",
                                                                           "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817485046Z",
                                                                           "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817496507Z",
                                                                           "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:33.817505987Z",
                                                                           "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:29:34.821748777Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:00.340362183Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:40.80916313Z",
                                                                           "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:31:43.984050021Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:20.872493025Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:21.523603796Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:35:21.810108901Z",
                                                                           "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:34.864836738Z",
                                                                           "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:36:43.551617349Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:47:59.074531506Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER root",
                                                                           "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1e4eeec18f8da2b364b39b7a7358aef5",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:18.664061292Z",
                                                                           "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.027951629Z",
                                                                           "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.382890946Z",
                                                                           "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:19.382951197Z",
                                                                           "created_by": "/bin/sh -c #(nop) USER nova",
                                                                           "empty_layer": true
                                                                      },
                                                                      {
                                                                           "created": "2025-10-13T12:50:25.718273507Z",
                                                                           "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1e4eeec18f8da2b364b39b7a7358aef5\""
                                                                      }
                                                                 ],
                                                                 "NamesHistory": [
                                                                      "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                 ]
                                                            }
                                                       ]
                                                       : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 14 09:54:59 np0005486759.ooo.test sudo[309330]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:00 np0005486759.ooo.test sudo[309501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zktrvdjsgrjlpifaxordsksfjgxkjwhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435700.0623653-1593-67922322532296/AnsiballZ_stat.py
Oct 14 09:55:00 np0005486759.ooo.test sudo[309501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:55:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:55:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:55:00 np0005486759.ooo.test podman[309504]: 2025-10-14 09:55:00.464499758 +0000 UTC m=+0.087387901 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:55:00 np0005486759.ooo.test podman[309504]: 2025-10-14 09:55:00.477314204 +0000 UTC m=+0.100202397 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 14 09:55:00 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:55:00 np0005486759.ooo.test podman[309505]: 2025-10-14 09:55:00.527664066 +0000 UTC m=+0.147615128 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:55:00 np0005486759.ooo.test podman[309505]: 2025-10-14 09:55:00.541405812 +0000 UTC m=+0.161356824 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:55:00 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:55:00 np0005486759.ooo.test python3.9[309503]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:55:00 np0005486759.ooo.test sudo[309501]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:01 np0005486759.ooo.test sudo[309651]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgfdjubryzlfaffuvusbsehiscsyvmhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435700.8786948-1602-43806507659568/AnsiballZ_file.py
Oct 14 09:55:01 np0005486759.ooo.test sudo[309651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:55:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:55:01 np0005486759.ooo.test podman[309654]: 2025-10-14 09:55:01.200239321 +0000 UTC m=+0.060802736 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:55:01 np0005486759.ooo.test podman[309654]: 2025-10-14 09:55:01.20536039 +0000 UTC m=+0.065923795 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:55:01 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:55:01 np0005486759.ooo.test python3.9[309653]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:55:01 np0005486759.ooo.test sudo[309651]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:01.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:55:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:01.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:55:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:01.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:55:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:01.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:55:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:01.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:01 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:01.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:55:01 np0005486759.ooo.test sudo[309785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecbdrmbaxvckpxziawcjdackesocjoxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435701.4252615-1602-45886241286354/AnsiballZ_copy.py
Oct 14 09:55:01 np0005486759.ooo.test sudo[309785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:55:02 np0005486759.ooo.test python3.9[309787]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760435701.4252615-1602-45886241286354/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 14 09:55:02 np0005486759.ooo.test sudo[309785]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:02 np0005486759.ooo.test sudo[309840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovoqmymqeagrnachnnmhmyvzisckaela ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435701.4252615-1602-45886241286354/AnsiballZ_systemd.py
Oct 14 09:55:02 np0005486759.ooo.test sudo[309840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:55:02 np0005486759.ooo.test python3.9[309842]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 14 09:55:02 np0005486759.ooo.test sudo[309840]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:03 np0005486759.ooo.test python3.9[309952]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:55:04 np0005486759.ooo.test python3.9[310060]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:55:05 np0005486759.ooo.test python3.9[310168]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 14 09:55:05 np0005486759.ooo.test sudo[310276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqvkmgywajjexvoizbdjpafhcbckbynv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435705.4427745-1658-194127466612939/AnsiballZ_podman_container.py
Oct 14 09:55:05 np0005486759.ooo.test sudo[310276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:55:06 np0005486759.ooo.test python3.9[310278]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 14 09:55:06 np0005486759.ooo.test sudo[310276]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:06 np0005486759.ooo.test systemd-journald[35787]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 120.4 (401 of 333 items), suggesting rotation.
Oct 14 09:55:06 np0005486759.ooo.test systemd-journald[35787]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 09:55:06 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:55:06 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 09:55:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:06.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:55:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:06.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:55:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:06.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:55:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:06.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:55:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:06.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:06 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:06.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:55:06 np0005486759.ooo.test sudo[310411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdouxbzlufxuwgtnyewvgcjuajxoyqks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435706.612145-1666-176781702440436/AnsiballZ_systemd.py
Oct 14 09:55:06 np0005486759.ooo.test sudo[310411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:55:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:55:07 np0005486759.ooo.test systemd[1]: tmp-crun.32nSfp.mount: Deactivated successfully.
Oct 14 09:55:07 np0005486759.ooo.test podman[310413]: 2025-10-14 09:55:07.028203805 +0000 UTC m=+0.078166355 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:55:07 np0005486759.ooo.test podman[310413]: 2025-10-14 09:55:07.065351876 +0000 UTC m=+0.115314346 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 09:55:07 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:55:07 np0005486759.ooo.test python3.9[310414]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 14 09:55:07 np0005486759.ooo.test systemd[1]: Stopping nova_compute container...
Oct 14 09:55:08 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:08.503 2 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Oct 14 09:55:08 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:08.505 2 DEBUG oslo_concurrency.lockutils [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:55:08 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:08.506 2 DEBUG oslo_concurrency.lockutils [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:55:08 np0005486759.ooo.test nova_compute[255504]: 2025-10-14 09:55:08.506 2 DEBUG oslo_concurrency.lockutils [None req-f27a95e3-949d-41b6-99ba-dcbad0bfa4d9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:55:08 np0005486759.ooo.test virtqemud[225922]: End of file while reading data: Input/output error
Oct 14 09:55:08 np0005486759.ooo.test systemd[1]: libpod-f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e.scope: Deactivated successfully.
Oct 14 09:55:08 np0005486759.ooo.test systemd[1]: libpod-f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e.scope: Consumed 12.577s CPU time.
Oct 14 09:55:08 np0005486759.ooo.test podman[310441]: 2025-10-14 09:55:08.881510632 +0000 UTC m=+1.587872288 container died f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:55:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:55:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e-userdata-shm.mount: Deactivated successfully.
Oct 14 09:55:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16-merged.mount: Deactivated successfully.
Oct 14 09:55:09 np0005486759.ooo.test podman[310461]: 2025-10-14 09:55:09.031655298 +0000 UTC m=+0.124966887 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, vcs-type=git)
Oct 14 09:55:09 np0005486759.ooo.test podman[310441]: 2025-10-14 09:55:09.038143259 +0000 UTC m=+1.744504875 container cleanup f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3)
Oct 14 09:55:09 np0005486759.ooo.test podman[310441]: nova_compute
Oct 14 09:55:09 np0005486759.ooo.test podman[310454]: 2025-10-14 09:55:09.04012257 +0000 UTC m=+0.147351840 container cleanup f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:55:09 np0005486759.ooo.test podman[310461]: 2025-10-14 09:55:09.072495943 +0000 UTC m=+0.165807542 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Oct 14 09:55:09 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:55:09 np0005486759.ooo.test podman[310482]: 2025-10-14 09:55:09.113518456 +0000 UTC m=+0.046786432 container cleanup f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:55:09 np0005486759.ooo.test podman[310482]: nova_compute
Oct 14 09:55:09 np0005486759.ooo.test systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 14 09:55:09 np0005486759.ooo.test systemd[1]: Stopped nova_compute container.
Oct 14 09:55:09 np0005486759.ooo.test systemd[1]: Starting nova_compute container...
Oct 14 09:55:09 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:55:09 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:09 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:09 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:09 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:09 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326e6fcbc43cdd389b73da2b4b8d85da24d5a0f0986700134503d38d625f5c16/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:09 np0005486759.ooo.test podman[310496]: 2025-10-14 09:55:09.254995813 +0000 UTC m=+0.110702734 container init f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 09:55:09 np0005486759.ooo.test podman[310496]: 2025-10-14 09:55:09.264117056 +0000 UTC m=+0.119823937 container start f49c0980c6fb06c9dc32b579300fc85e9c5f93c334569a08eaae15fe638ec87e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 14 09:55:09 np0005486759.ooo.test podman[310496]: nova_compute
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + sudo -E kolla_set_configs
Oct 14 09:55:09 np0005486759.ooo.test systemd[1]: Started nova_compute container.
Oct 14 09:55:09 np0005486759.ooo.test sudo[310411]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Validating config file
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying service configuration files
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /etc/ceph
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Creating directory /etc/ceph
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /etc/ceph
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Writing out command to execute
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: ++ cat /run_command
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + CMD=nova-compute
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + ARGS=
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + sudo kolla_copy_cacerts
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + [[ ! -n '' ]]
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + . kolla_extend_start
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + echo 'Running command: '\''nova-compute'\'''
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: Running command: 'nova-compute'
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + umask 0022
Oct 14 09:55:09 np0005486759.ooo.test nova_compute[310511]: + exec nova-compute
Oct 14 09:55:09 np0005486759.ooo.test sudo[310630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykuvlybzefzenbarmmwnqiorcttmjyhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760435709.5082004-1675-240655206132729/AnsiballZ_podman_container.py
Oct 14 09:55:09 np0005486759.ooo.test sudo[310630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 14 09:55:10 np0005486759.ooo.test python3.9[310632]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 14 09:55:10 np0005486759.ooo.test systemd[1]: Started libpod-conmon-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227.scope.
Oct 14 09:55:10 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:55:10 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:10 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:10 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 14 09:55:10 np0005486759.ooo.test podman[310658]: 2025-10-14 09:55:10.395086365 +0000 UTC m=+0.162768468 container init 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Oct 14 09:55:10 np0005486759.ooo.test podman[310658]: 2025-10-14 09:55:10.404105775 +0000 UTC m=+0.171787868 container start 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0)
Oct 14 09:55:10 np0005486759.ooo.test python3.9[310632]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Applying nova statedir ownership
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4 already 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4 to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.info
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/console.log
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/d4dee7ea20c47bbf691f78ae3efd9dd29eccd913
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-d4dee7ea20c47bbf691f78ae3efd9dd29eccd913
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-storage-registry-lock
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/compute_nodes
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/7dbe5bae7bc27ef07490c629ec1f09edaa9e8c135ff89c3f08f1e44f39cf5928
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/9469aff02825a9e3dcdb3ceeb358f8d540dc07c8b6e9cd975f170399051d29c3
Oct 14 09:55:10 np0005486759.ooo.test nova_compute_init[310678]: INFO:nova_statedir:Nova statedir ownership complete
Oct 14 09:55:10 np0005486759.ooo.test systemd[1]: libpod-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227.scope: Deactivated successfully.
Oct 14 09:55:10 np0005486759.ooo.test podman[310679]: 2025-10-14 09:55:10.504805037 +0000 UTC m=+0.079501667 container died 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=nova_compute_init)
Oct 14 09:55:10 np0005486759.ooo.test podman[310692]: 2025-10-14 09:55:10.57131062 +0000 UTC m=+0.087070042 container cleanup 67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=nova_compute_init)
Oct 14 09:55:10 np0005486759.ooo.test systemd[1]: libpod-conmon-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227.scope: Deactivated successfully.
Oct 14 09:55:10 np0005486759.ooo.test sudo[310630]: pam_unix(sudo:session): session closed for user root
Oct 14 09:55:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a500e2bf218b8aa906e37ea07a6e8802215c65cc50fe603d7dedb6268048d2b7-merged.mount: Deactivated successfully.
Oct 14 09:55:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-67923de70915e9b94a7ae5f96c90ab4e0808b9b84f0ee822936d4b94c0375227-userdata-shm.mount: Deactivated successfully.
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.038 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.038 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.038 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.038 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 14 09:55:11 np0005486759.ooo.test sshd[288481]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:55:11 np0005486759.ooo.test systemd-logind[759]: Session 42 logged out. Waiting for processes to exit.
Oct 14 09:55:11 np0005486759.ooo.test systemd[1]: session-42.scope: Deactivated successfully.
Oct 14 09:55:11 np0005486759.ooo.test systemd[1]: session-42.scope: Consumed 1min 47.653s CPU time.
Oct 14 09:55:11 np0005486759.ooo.test systemd-logind[759]: Removed session 42.
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.153 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.180 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.727 2 INFO nova.virt.driver [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.858 2 INFO nova.compute.provider_config [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.869 2 DEBUG oslo_concurrency.lockutils [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.869 2 DEBUG oslo_concurrency.lockutils [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.869 2 DEBUG oslo_concurrency.lockutils [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.869 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.870 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.870 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.870 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.870 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.870 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.870 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.870 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.871 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.872 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.872 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.872 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.872 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] console_host                   = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.872 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.872 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.872 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.873 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.873 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.873 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.873 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.873 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.873 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.874 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.875 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.875 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] host                           = np0005486759.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.875 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.875 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.875 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.875 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.876 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.877 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.877 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.877 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.877 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.877 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.877 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.877 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.878 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.879 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.879 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.879 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.879 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.879 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.879 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.879 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.880 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.881 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.882 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.882 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.882 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.882 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.882 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.882 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.882 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.883 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.884 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.885 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.886 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.887 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.887 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.887 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.887 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.887 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.887 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.887 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.888 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.889 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.890 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.890 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.890 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.890 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.890 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.890 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.890 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.891 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.891 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.891 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.891 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.891 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.891 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.891 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.892 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.893 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.893 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.893 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.893 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.893 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.893 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.894 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.894 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.894 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.894 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.894 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.894 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.894 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.895 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.895 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.895 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.895 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.895 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.895 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.895 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.896 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.896 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.896 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.896 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.896 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.896 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.897 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.898 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.898 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.898 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.898 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.898 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.898 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.898 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.899 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.899 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.899 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.899 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.899 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.899 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.899 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.900 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.901 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.901 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.901 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.901 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.901 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.901 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.901 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.902 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.902 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.902 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.902 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.902 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.902 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.902 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.903 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.904 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.905 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.905 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.905 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.905 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.905 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.905 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.905 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.906 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.906 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.906 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.906 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.906 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.906 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.906 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.907 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.908 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.908 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.908 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.908 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.908 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.908 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.908 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.909 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.910 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.910 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.910 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.910 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.910 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.910 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.910 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.911 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.911 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.911 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.911 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.911 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.911 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.911 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.912 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.913 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.913 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.913 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.913 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.913 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.913 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.913 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.914 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.914 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.914 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.914 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.914 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.914 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.914 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.915 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.916 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.916 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.916 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.916 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.916 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.916 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.916 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.917 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.917 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.917 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.917 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.917 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.917 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.917 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.918 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.918 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.918 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.918 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.918 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.918 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.919 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.919 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.919 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.919 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.919 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.919 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.919 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.920 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.921 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.921 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.921 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.921 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.921 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.921 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.921 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.922 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.922 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.922 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.922 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.922 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.922 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.922 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.923 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.924 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.924 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.924 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.924 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.924 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.924 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.924 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.925 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.925 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.925 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.925 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.925 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.925 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.925 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.926 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.926 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.926 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.926 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.926 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.926 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.926 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.927 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.928 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.928 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.928 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.928 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.928 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.928 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.928 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.929 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.930 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.930 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.930 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.930 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.930 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.930 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.930 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.931 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.931 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.931 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.931 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.931 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.931 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.931 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.932 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.933 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.933 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.933 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.933 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.933 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.933 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.933 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.934 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.934 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.934 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.934 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.934 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.934 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.934 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.935 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.935 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.935 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.935 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.935 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.935 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.935 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.936 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.936 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.936 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.936 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.937 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.937 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.937 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.937 2 WARNING oslo_config.cfg [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: and ``live_migration_inbound_addr`` respectively.
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: ).  Its value may be silently ignored in the future.
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.937 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.937 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.938 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.938 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.938 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.938 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.938 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.938 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.938 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.939 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.939 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.939 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.939 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.939 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.939 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.940 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.941 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.941 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.941 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.941 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.941 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.941 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.941 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.942 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.942 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.942 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.942 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.942 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.942 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.943 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.943 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.943 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.943 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.943 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.943 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.943 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.944 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.945 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.945 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.945 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.945 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.945 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.945 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.945 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.946 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.947 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.947 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.947 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.947 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.947 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.947 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.947 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.948 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.949 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.949 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.949 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.949 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.949 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.949 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.949 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.950 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.951 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.952 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.953 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.953 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.953 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.953 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.953 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.953 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.953 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.954 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.955 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.955 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.955 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.955 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.955 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.955 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.955 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.956 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.956 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.956 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.956 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.956 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.956 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.956 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.957 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.957 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.957 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.957 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.957 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.957 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.958 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.959 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.959 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.959 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.959 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.959 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.959 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.959 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.960 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.961 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.961 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.961 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.961 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.961 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.961 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.961 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.962 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.962 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.962 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.962 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.962 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.962 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.962 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.963 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.964 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.964 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.964 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.964 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.964 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.964 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.964 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.965 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.965 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.965 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.965 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.965 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.965 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.965 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.966 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.967 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.968 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.968 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.968 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.968 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.968 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.968 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.968 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.969 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.970 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.971 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.971 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.971 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.971 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.971 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.971 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.972 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.972 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.972 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.972 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.972 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.972 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.973 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.973 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.973 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.973 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.973 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.973 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.973 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.974 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.974 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.974 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.974 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.974 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.974 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.974 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.975 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.976 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.976 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.976 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.976 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.976 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.976 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.976 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.977 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.977 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.977 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.977 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.977 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.977 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.977 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.978 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.979 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.979 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.979 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.979 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.979 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.979 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.979 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.980 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.980 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.980 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.980 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.980 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.980 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.980 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.981 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.981 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.981 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.981 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.981 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.981 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.981 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.982 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.983 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.983 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.983 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.983 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.983 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.983 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.983 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.984 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.985 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.986 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.986 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.986 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.986 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.986 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.986 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.986 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.987 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.988 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.989 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.989 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.989 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.989 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.989 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.989 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.989 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.990 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.991 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.992 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.992 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.992 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.992 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.992 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.992 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.992 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.993 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.994 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.994 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.994 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.994 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.994 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.994 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.994 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.995 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.995 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.995 2 DEBUG oslo_service.service [None req-0e6e02be-4b0c-43a4-b034-d929193090cc - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 14 09:55:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:11.995 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.008 2 INFO nova.virt.node [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Determined node identity 2da4b4c2-8401-4cdb-85a2-115635137a6d from /var/lib/nova/compute_id
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.008 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.009 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.009 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.009 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.023 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f8a7193cac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.026 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f8a7193cac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.027 2 INFO nova.virt.libvirt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Connection event '1' reason 'None'
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.033 2 INFO nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Libvirt host capabilities <capabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <host>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <uuid>03f5bc8b-edfd-405c-8f42-0ac9afa0b79f</uuid>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <arch>x86_64</arch>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model>EPYC-Rome-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <vendor>AMD</vendor>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <microcode version='16777317'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <signature family='23' model='49' stepping='0'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='x2apic'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='tsc-deadline'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='osxsave'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='hypervisor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='tsc_adjust'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='spec-ctrl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='stibp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='arch-capabilities'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='cmp_legacy'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='topoext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='virt-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='lbrv'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='tsc-scale'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='vmcb-clean'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='pause-filter'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='pfthreshold'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='svme-addr-chk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='rdctl-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='skip-l1dfl-vmentry'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='mds-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature name='pschange-mc-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <pages unit='KiB' size='4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <pages unit='KiB' size='2048'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <pages unit='KiB' size='1048576'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <power_management>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <suspend_mem/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <suspend_disk/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <suspend_hybrid/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </power_management>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <iommu support='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <migration_features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <live/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <uri_transports>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <uri_transport>tcp</uri_transport>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <uri_transport>rdma</uri_transport>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </uri_transports>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </migration_features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <topology>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <cells num='1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <cell id='0'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           <memory unit='KiB'>16116612</memory>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           <pages unit='KiB' size='4'>4029153</pages>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           <pages unit='KiB' size='2048'>0</pages>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           <distances>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <sibling id='0' value='10'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           </distances>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           <cpus num='8'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:           </cpus>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         </cell>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </cells>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </topology>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <cache>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </cache>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <secmodel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model>selinux</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <doi>0</doi>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </secmodel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <secmodel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model>dac</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <doi>0</doi>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </secmodel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </host>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <guest>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <os_type>hvm</os_type>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <arch name='i686'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <wordsize>32</wordsize>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <domain type='qemu'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <domain type='kvm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </arch>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <pae/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <nonpae/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <acpi default='on' toggle='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <apic default='on' toggle='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <cpuselection/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <deviceboot/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <disksnapshot default='on' toggle='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <externalSnapshot/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </guest>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <guest>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <os_type>hvm</os_type>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <arch name='x86_64'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <wordsize>64</wordsize>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <domain type='qemu'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <domain type='kvm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </arch>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <acpi default='on' toggle='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <apic default='on' toggle='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <cpuselection/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <deviceboot/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <disksnapshot default='on' toggle='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <externalSnapshot/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </guest>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: </capabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.039 2 DEBUG nova.virt.libvirt.volume.mount [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.041 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.047 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: <domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <domain>kvm</domain>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <arch>i686</arch>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <vcpu max='1024'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <iothreads supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <os supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='firmware'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <loader supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>rom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pflash</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='readonly'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>yes</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='secure'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </loader>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </os>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='maximum' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='maximumMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-model' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <vendor>AMD</vendor>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='x2apic'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='stibp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='succor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lbrv'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='mds-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='custom' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Dhyana-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-128'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-256'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-512'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <memoryBacking supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='sourceType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>file</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>anonymous</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>memfd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </memoryBacking>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <disk supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='diskDevice'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>disk</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cdrom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>floppy</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>lun</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>fdc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>sata</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <graphics supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vnc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egl-headless</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>dbus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </graphics>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <video supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='modelType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vga</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cirrus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>none</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>bochs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ramfb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </video>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hostdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='mode'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>subsystem</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='startupPolicy'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>mandatory</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>requisite</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>optional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='subsysType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pci</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='capsType'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='pciBackend'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hostdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <rng supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>random</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </rng>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <filesystem supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='driverType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>path</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>handle</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtiofs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </filesystem>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <tpm supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-tis</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-crb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emulator</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>external</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendVersion'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>2.0</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </tpm>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <redirdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </redirdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <channel supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pty</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>unix</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </channel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <crypto supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>qemu</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </crypto>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <interface supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>passt</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </interface>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <panic supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>isa</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>hyperv</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </panic>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <gic supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <vmcoreinfo supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <genid supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backingStoreInput supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backup supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <async-teardown supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <ps2 supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sev supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sgx supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hyperv supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='features'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>relaxed</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vapic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>spinlocks</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vpindex</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>runtime</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>synic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>stimer</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reset</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vendor_id</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>frequencies</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reenlightenment</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tlbflush</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ipi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>avic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emsr_bitmap</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>xmm_input</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hyperv>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <launchSecurity supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: </domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.052 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: <domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <domain>kvm</domain>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <arch>i686</arch>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <vcpu max='240'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <iothreads supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <os supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='firmware'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <loader supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>rom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pflash</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='readonly'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>yes</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='secure'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </loader>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </os>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='maximum' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='maximumMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-model' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <vendor>AMD</vendor>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='x2apic'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='stibp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='succor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lbrv'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='mds-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='custom' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Dhyana-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-128'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-256'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-512'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <memoryBacking supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='sourceType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>file</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>anonymous</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>memfd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </memoryBacking>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <disk supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='diskDevice'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>disk</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cdrom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>floppy</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>lun</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ide</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>fdc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>sata</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <graphics supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vnc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egl-headless</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>dbus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </graphics>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <video supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='modelType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vga</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cirrus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>none</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>bochs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ramfb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </video>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hostdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='mode'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>subsystem</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='startupPolicy'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>mandatory</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>requisite</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>optional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='subsysType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pci</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='capsType'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='pciBackend'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hostdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <rng supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>random</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </rng>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <filesystem supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='driverType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>path</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>handle</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtiofs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </filesystem>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <tpm supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-tis</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-crb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emulator</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>external</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendVersion'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>2.0</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </tpm>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <redirdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </redirdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <channel supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pty</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>unix</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </channel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <crypto supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>qemu</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </crypto>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <interface supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>passt</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </interface>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <panic supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>isa</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>hyperv</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </panic>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <gic supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <vmcoreinfo supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <genid supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backingStoreInput supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backup supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <async-teardown supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <ps2 supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sev supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sgx supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hyperv supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='features'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>relaxed</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vapic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>spinlocks</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vpindex</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>runtime</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>synic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>stimer</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reset</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vendor_id</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>frequencies</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reenlightenment</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tlbflush</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ipi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>avic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emsr_bitmap</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>xmm_input</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hyperv>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <launchSecurity supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: </domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.104 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.110 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: <domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <domain>kvm</domain>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <arch>x86_64</arch>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <vcpu max='1024'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <iothreads supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <os supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='firmware'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>efi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <loader supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>rom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pflash</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='readonly'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>yes</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='secure'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>yes</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </loader>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </os>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='maximum' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='maximumMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-model' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <vendor>AMD</vendor>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='x2apic'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='stibp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='succor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lbrv'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='mds-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='custom' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Dhyana-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-128'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-256'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-512'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <memoryBacking supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='sourceType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>file</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>anonymous</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>memfd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </memoryBacking>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <disk supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='diskDevice'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>disk</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cdrom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>floppy</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>lun</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>fdc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>sata</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <graphics supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vnc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egl-headless</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>dbus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </graphics>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <video supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='modelType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vga</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cirrus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>none</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>bochs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ramfb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </video>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hostdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='mode'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>subsystem</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='startupPolicy'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>mandatory</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>requisite</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>optional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='subsysType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pci</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='capsType'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='pciBackend'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hostdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <rng supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>random</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </rng>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <filesystem supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='driverType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>path</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>handle</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtiofs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </filesystem>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <tpm supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-tis</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-crb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emulator</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>external</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendVersion'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>2.0</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </tpm>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <redirdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </redirdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <channel supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pty</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>unix</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </channel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <crypto supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>qemu</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </crypto>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <interface supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>passt</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </interface>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <panic supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>isa</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>hyperv</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </panic>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <gic supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <vmcoreinfo supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <genid supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backingStoreInput supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backup supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <async-teardown supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <ps2 supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sev supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sgx supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hyperv supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='features'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>relaxed</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vapic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>spinlocks</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vpindex</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>runtime</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>synic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>stimer</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reset</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vendor_id</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>frequencies</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reenlightenment</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tlbflush</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ipi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>avic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emsr_bitmap</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>xmm_input</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hyperv>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <launchSecurity supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: </domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.174 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: <domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <path>/usr/libexec/qemu-kvm</path>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <domain>kvm</domain>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <arch>x86_64</arch>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <vcpu max='240'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <iothreads supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <os supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='firmware'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <loader supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>rom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pflash</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='readonly'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>yes</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='secure'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>no</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </loader>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </os>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-passthrough' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='hostPassthroughMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='maximum' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='maximumMigratable'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>on</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>off</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='host-model' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <vendor>AMD</vendor>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='x2apic'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-deadline'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='hypervisor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc_adjust'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='spec-ctrl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='stibp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='arch-capabilities'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='cmp_legacy'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='overflow-recov'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='succor'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='amd-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='virt-ssbd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lbrv'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='tsc-scale'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='vmcb-clean'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pause-filter'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pfthreshold'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='svme-addr-chk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='rdctl-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='mds-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='require' name='pschange-mc-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <feature policy='disable' name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <mode name='custom' supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Broadwell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cascadelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Cooperlake-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Denverton-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Dhyana-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Genoa-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='auto-ibrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Milan-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amd-psfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='no-nested-data-bp'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='null-sel-clr-base'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='stibp-always-on'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-Rome-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='EPYC-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='GraniteRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-128'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-256'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx10-512'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='prefetchiti'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Haswell-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-noTSX'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v6'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Icelake-Server-v7'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='IvyBridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='KnightsMill-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4fmaps'/>
Oct 14 09:55:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:55:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-4vnniw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512er'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512pf'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G4-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Opteron_G5-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fma4'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tbm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xop'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SapphireRapids-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='amx-tile'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-bf16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-fp16'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512-vpopcntdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bitalg'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vbmi2'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrc'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fzrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='la57'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='taa-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='tsx-ldtrk'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xfd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:55:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126003 "" "Go-http-client/1.1"
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='SierraForest-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ifma'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-ne-convert'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx-vnni-int8'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='bus-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cmpccxadd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fbsdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='fsrs'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ibrs-all'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mcdt-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pbrsb-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='psdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='sbdr-ssdp-no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='serialize'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vaes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='vpclmulqdq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Client-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='hle'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='rtm'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Skylake-Server-v5'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512bw'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512cd'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512dq'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512f'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='avx512vl'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='invpcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pcid'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='pku'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='mpx'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v2'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v3'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='core-capability'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='split-lock-detect'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='Snowridge-v4'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='cldemote'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='erms'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='gfni'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdir64b'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='movdiri'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='xsaves'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='athlon-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='core2duo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='coreduo-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='n270-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='ss'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <blockers model='phenom-v1'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnow'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <feature name='3dnowext'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </blockers>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </mode>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </cpu>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <memoryBacking supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <enum name='sourceType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>file</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>anonymous</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <value>memfd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </memoryBacking>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <disk supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='diskDevice'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>disk</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cdrom</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>floppy</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>lun</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ide</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>fdc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>sata</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <graphics supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vnc</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egl-headless</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>dbus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </graphics>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <video supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='modelType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vga</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>cirrus</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>none</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>bochs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ramfb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </video>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hostdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='mode'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>subsystem</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='startupPolicy'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>mandatory</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>requisite</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>optional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='subsysType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pci</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>scsi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='capsType'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='pciBackend'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hostdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <rng supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtio-non-transitional</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>random</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>egd</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </rng>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <filesystem supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='driverType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>path</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>handle</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>virtiofs</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </filesystem>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <tpm supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-tis</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tpm-crb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emulator</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>external</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendVersion'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>2.0</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </tpm>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <redirdev supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='bus'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>usb</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </redirdev>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <channel supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>pty</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>unix</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </channel>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <crypto supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='type'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>qemu</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendModel'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>builtin</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </crypto>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <interface supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='backendType'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>default</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>passt</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </interface>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <panic supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='model'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>isa</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>hyperv</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </panic>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </devices>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   <features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <gic supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <vmcoreinfo supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <genid supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backingStoreInput supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <backup supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <async-teardown supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <ps2 supported='yes'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sev supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <sgx supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <hyperv supported='yes'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       <enum name='features'>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>relaxed</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vapic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>spinlocks</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vpindex</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>runtime</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>synic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>stimer</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reset</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>vendor_id</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>frequencies</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>reenlightenment</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>tlbflush</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>ipi</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>avic</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>emsr_bitmap</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:         <value>xmm_input</value>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:       </enum>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     </hyperv>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:     <launchSecurity supported='no'/>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:   </features>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: </domainCapabilities>
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.229 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.229 2 INFO nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Secure Boot support detected
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.231 2 INFO nova.virt.libvirt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.232 2 INFO nova.virt.libvirt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.240 2 DEBUG nova.virt.libvirt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.279 2 INFO nova.virt.node [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Determined node identity 2da4b4c2-8401-4cdb-85a2-115635137a6d from /var/lib/nova/compute_id
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.303 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Verified node 2da4b4c2-8401-4cdb-85a2-115635137a6d matches my host np0005486759.ooo.test _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Oct 14 09:55:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:55:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15237 "" "Go-http-client/1.1"
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.361 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.365 2 DEBUG nova.virt.libvirt.vif [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005486759.ooo.test',hostname='test',id=1,image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-14T08:45:21Z,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005486759.ooo.test',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8bf64e81a4214f9490d231a2e79ab3d8',ramdisk_id='',reservation_id='r-8vq1axpu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-14T08:45:21Z,user_data=None,user_id='2aff2e6f927a42b1b822d05cd9349762',uuid=4408214d-dae5-4452-92e9-eb4abd6589d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.365 2 DEBUG nova.network.os_vif_util [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Converting VIF {"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.366 2 DEBUG nova.network.os_vif_util [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.366 2 DEBUG os_vif [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.391 2 DEBUG ovsdbapp.backend.ovs_idl [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.391 2 DEBUG ovsdbapp.backend.ovs_idl [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.392 2 DEBUG ovsdbapp.backend.ovs_idl [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.406 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.406 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:55:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.407 2 INFO oslo.privsep.daemon [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpwbv62y68/privsep.sock']
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.033 2 INFO oslo.privsep.daemon [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Spawned new privsep daemon via rootwrap
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.920 40 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.925 40 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.929 40 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:12.929 40 INFO oslo.privsep.daemon [-] privsep daemon running as pid 40
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.334 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee08de8-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.334 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeee08de8-f9, col_values=(('external_ids', {'iface-id': 'eee08de8-f983-4ebe-a654-f67f48659e50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:cf:16', 'vm-uuid': '4408214d-dae5-4452-92e9-eb4abd6589d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.336 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.336 2 INFO os_vif [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9')
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.337 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.345 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.345 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.710 2 DEBUG oslo_concurrency.lockutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.710 2 DEBUG oslo_concurrency.lockutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.711 2 DEBUG oslo_concurrency.lockutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.711 2 DEBUG nova.compute.resource_tracker [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.777 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.859 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.861 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.939 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:55:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:13.940 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:55:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:55:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:55:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:55:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:55:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:55:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:55:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.025 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.026 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.098 2 DEBUG oslo_concurrency.processutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.303 2 WARNING nova.virt.libvirt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.305 2 DEBUG nova.compute.resource_tracker [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12502MB free_disk=386.71522521972656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.305 2 DEBUG oslo_concurrency.lockutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.306 2 DEBUG oslo_concurrency.lockutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.619 2 DEBUG nova.compute.resource_tracker [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.619 2 DEBUG nova.compute.resource_tracker [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.620 2 DEBUG nova.compute.resource_tracker [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:55:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:14.634 2 DEBUG nova.scheduler.client.report [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Refreshing inventories for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.018 2 DEBUG nova.scheduler.client.report [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Updating ProviderTree inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.019 2 DEBUG nova.compute.provider_tree [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.037 2 DEBUG nova.scheduler.client.report [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Refreshing aggregate associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.068 2 DEBUG nova.scheduler.client.report [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Refreshing trait associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.118 2 DEBUG nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.118 2 INFO nova.virt.libvirt.host [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] kernel doesn't support AMD SEV
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.119 2 DEBUG nova.compute.provider_tree [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.120 2 DEBUG nova.virt.libvirt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.138 2 DEBUG nova.scheduler.client.report [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.162 2 DEBUG nova.compute.resource_tracker [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.162 2 DEBUG oslo_concurrency.lockutils [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.162 2 DEBUG nova.service [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.184 2 DEBUG nova.service [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Oct 14 09:55:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:15.184 2 DEBUG nova.servicegroup.drivers.db [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] DB_Driver: join new ServiceGroup member np0005486759.ooo.test to the compute group, service = <Service: host=np0005486759.ooo.test, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Oct 14 09:55:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:55:15 np0005486759.ooo.test systemd[1]: tmp-crun.pzO2FJ.mount: Deactivated successfully.
Oct 14 09:55:15 np0005486759.ooo.test podman[310783]: 2025-10-14 09:55:15.436721116 +0000 UTC m=+0.065073039 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 14 09:55:15 np0005486759.ooo.test podman[310783]: 2025-10-14 09:55:15.46653021 +0000 UTC m=+0.094882093 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:55:15 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:55:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:16.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:17.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:55:18 np0005486759.ooo.test podman[310801]: 2025-10-14 09:55:18.450378113 +0000 UTC m=+0.079663870 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:55:18 np0005486759.ooo.test podman[310801]: 2025-10-14 09:55:18.456246705 +0000 UTC m=+0.085532472 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:55:18 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:55:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62002 DF PROTO=TCP SPT=44886 DPT=9102 SEQ=1578630951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C0B470000000001030307) 
Oct 14 09:55:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62003 DF PROTO=TCP SPT=44886 DPT=9102 SEQ=1578630951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C0F420000000001030307) 
Oct 14 09:55:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:21.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:22.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62004 DF PROTO=TCP SPT=44886 DPT=9102 SEQ=1578630951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C17410000000001030307) 
Oct 14 09:55:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62005 DF PROTO=TCP SPT=44886 DPT=9102 SEQ=1578630951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C27010000000001030307) 
Oct 14 09:55:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:26.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:27.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:55:29 np0005486759.ooo.test podman[310824]: 2025-10-14 09:55:29.453806507 +0000 UTC m=+0.086315408 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:55:29 np0005486759.ooo.test podman[310824]: 2025-10-14 09:55:29.46843227 +0000 UTC m=+0.100941161 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009)
Oct 14 09:55:29 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:55:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:55:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:55:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:55:31 np0005486759.ooo.test podman[310843]: 2025-10-14 09:55:31.453188964 +0000 UTC m=+0.082508560 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:55:31 np0005486759.ooo.test podman[310843]: 2025-10-14 09:55:31.458214089 +0000 UTC m=+0.087533705 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:55:31 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:55:31 np0005486759.ooo.test podman[310850]: 2025-10-14 09:55:31.502314197 +0000 UTC m=+0.121794437 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct 14 09:55:31 np0005486759.ooo.test podman[310850]: 2025-10-14 09:55:31.513433352 +0000 UTC m=+0.132913662 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:55:31 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:55:31 np0005486759.ooo.test podman[310844]: 2025-10-14 09:55:31.55628701 +0000 UTC m=+0.178695251 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 09:55:31 np0005486759.ooo.test podman[310844]: 2025-10-14 09:55:31.563557216 +0000 UTC m=+0.185965407 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible)
Oct 14 09:55:31 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:55:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:31.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:32.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:32 np0005486759.ooo.test sshd[310905]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:55:33 np0005486759.ooo.test sshd[310906]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:55:33 np0005486759.ooo.test sshd[310906]: error: kex_exchange_identification: read: Connection reset by peer
Oct 14 09:55:33 np0005486759.ooo.test sshd[310906]: Connection reset by 45.140.17.97 port 34987
Oct 14 09:55:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:36.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:55:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:37.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:37 np0005486759.ooo.test podman[310907]: 2025-10-14 09:55:37.45160434 +0000 UTC m=+0.078814138 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 09:55:37 np0005486759.ooo.test podman[310907]: 2025-10-14 09:55:37.514795117 +0000 UTC m=+0.142004975 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 09:55:37 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:55:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:55:39 np0005486759.ooo.test podman[310932]: 2025-10-14 09:55:39.442294267 +0000 UTC m=+0.070787761 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Oct 14 09:55:39 np0005486759.ooo.test podman[310932]: 2025-10-14 09:55:39.452930224 +0000 UTC m=+0.081423738 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Oct 14 09:55:39 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:55:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:41.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:55:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:55:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:55:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 126003 "" "Go-http-client/1.1"
Oct 14 09:55:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:55:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15247 "" "Go-http-client/1.1"
Oct 14 09:55:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:42.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:55:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:55:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:55:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:55:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:55:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:55:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:55:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:55:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:55:46 np0005486759.ooo.test podman[310952]: 2025-10-14 09:55:46.443937932 +0000 UTC m=+0.075915438 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 09:55:46 np0005486759.ooo.test podman[310952]: 2025-10-14 09:55:46.452383531 +0000 UTC m=+0.084361047 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009)
Oct 14 09:55:46 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:55:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:46.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:47.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:48.369 2 DEBUG nova.compute.manager [None req-1eed7b81-662d-4baf-b71c-5c5239d9d7ba 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:55:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:48.373 2 INFO nova.compute.manager [None req-1eed7b81-662d-4baf-b71c-5c5239d9d7ba 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Retrieving diagnostics
Oct 14 09:55:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:55:49 np0005486759.ooo.test podman[310971]: 2025-10-14 09:55:49.448388322 +0000 UTC m=+0.081512740 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:55:49 np0005486759.ooo.test podman[310971]: 2025-10-14 09:55:49.461289738 +0000 UTC m=+0.094414146 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:55:49 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:55:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30559 DF PROTO=TCP SPT=42900 DPT=9102 SEQ=3179294618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C80770000000001030307) 
Oct 14 09:55:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30560 DF PROTO=TCP SPT=42900 DPT=9102 SEQ=3179294618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C84820000000001030307) 
Oct 14 09:55:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:52.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:52.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30561 DF PROTO=TCP SPT=42900 DPT=9102 SEQ=3179294618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C8C810000000001030307) 
Oct 14 09:55:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:54.161 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:55:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:54.162 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:55:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:54.163 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:55:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:56.393 2 DEBUG oslo_concurrency.lockutils [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:55:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:56.393 2 DEBUG oslo_concurrency.lockutils [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:55:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:56.394 2 DEBUG nova.compute.manager [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:55:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:56.398 2 DEBUG nova.compute.manager [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Oct 14 09:55:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:56.403 2 DEBUG nova.objects.instance [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'flavor' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:55:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:56.461 2 DEBUG nova.virt.libvirt.driver [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Oct 14 09:55:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30562 DF PROTO=TCP SPT=42900 DPT=9102 SEQ=3179294618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9C9C410000000001030307) 
Oct 14 09:55:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:57.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:57.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:58 np0005486759.ooo.test kernel: device tapeee08de8-f9 left promiscuous mode
Oct 14 09:55:58 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760435758.8455] device (tapeee08de8-f9): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Oct 14 09:55:58 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:55:58Z|00040|binding|INFO|Releasing lport eee08de8-f983-4ebe-a654-f67f48659e50 from this chassis (sb_readonly=0)
Oct 14 09:55:58 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:55:58Z|00041|binding|INFO|Setting lport eee08de8-f983-4ebe-a654-f67f48659e50 down in Southbound
Oct 14 09:55:58 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:55:58Z|00042|binding|INFO|Removing iface tapeee08de8-f9 ovn-installed in OVS
Oct 14 09:55:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:58.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:58.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:58 np0005486759.ooo.test systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Oct 14 09:55:58 np0005486759.ooo.test systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 3min 34.088s CPU time.
Oct 14 09:55:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:58.891 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:cf:16 192.168.0.173'], port_security=['fa:16:3e:8e:cf:16 192.168.0.173'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.173/24', 'neutron:device_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005486759.ooo.test', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9197abc5-07db-4abf-9578-9360b49aea49', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'faabc66a-aada-4f6b-bec3-989808c74b8e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62cfbaba-fb96-4812-8b41-6ad8964122a3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=eee08de8-f983-4ebe-a654-f67f48659e50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:55:58 np0005486759.ooo.test systemd-machined[93972]: Machine qemu-1-instance-00000001 terminated.
Oct 14 09:55:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:58.893 183328 INFO neutron.agent.ovn.metadata.agent [-] Port eee08de8-f983-4ebe-a654-f67f48659e50 in datapath 9197abc5-07db-4abf-9578-9360b49aea49 unbound from our chassis
Oct 14 09:55:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:58.896 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9197abc5-07db-4abf-9578-9360b49aea49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 09:55:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:58.900 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[dde8b4eb-8b5a-4ac5-adfb-221bd813f1d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:55:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:58.901 183328 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49 namespace which is not needed anymore
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.479 2 INFO nova.virt.libvirt.driver [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Instance shutdown successfully after 3 seconds.
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.486 2 INFO nova.virt.libvirt.driver [-] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Instance destroyed successfully.
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.487 2 DEBUG nova.objects.instance [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.506 2 DEBUG nova.compute.manager [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:55:59 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:55:59.565 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.580 2 DEBUG oslo_concurrency.lockutils [None req-4954a120-143b-48c2-8a86-2d341e1ad55b 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.707 2 DEBUG nova.compute.manager [req-74ec37d8-95d8-4eee-91d1-18dc4407f9ff req-8cbc6173-5eec-4bd0-9eb0-612e9864b2ac f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received event network-vif-unplugged-eee08de8-f983-4ebe-a654-f67f48659e50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.708 2 DEBUG oslo_concurrency.lockutils [req-74ec37d8-95d8-4eee-91d1-18dc4407f9ff req-8cbc6173-5eec-4bd0-9eb0-612e9864b2ac f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.708 2 DEBUG oslo_concurrency.lockutils [req-74ec37d8-95d8-4eee-91d1-18dc4407f9ff req-8cbc6173-5eec-4bd0-9eb0-612e9864b2ac f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.708 2 DEBUG oslo_concurrency.lockutils [req-74ec37d8-95d8-4eee-91d1-18dc4407f9ff req-8cbc6173-5eec-4bd0-9eb0-612e9864b2ac f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.709 2 DEBUG nova.compute.manager [req-74ec37d8-95d8-4eee-91d1-18dc4407f9ff req-8cbc6173-5eec-4bd0-9eb0-612e9864b2ac f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] No waiting events found dispatching network-vif-unplugged-eee08de8-f983-4ebe-a654-f67f48659e50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 14 09:55:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:55:59.709 2 WARNING nova.compute.manager [req-74ec37d8-95d8-4eee-91d1-18dc4407f9ff req-8cbc6173-5eec-4bd0-9eb0-612e9864b2ac f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received unexpected event network-vif-unplugged-eee08de8-f983-4ebe-a654-f67f48659e50 for instance with vm_state stopped and task_state None.
Oct 14 09:56:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:56:00 np0005486759.ooo.test systemd[1]: tmp-crun.qSj5q6.mount: Deactivated successfully.
Oct 14 09:56:00 np0005486759.ooo.test podman[311047]: 2025-10-14 09:56:00.467332101 +0000 UTC m=+0.090904019 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:56:00 np0005486759.ooo.test podman[311047]: 2025-10-14 09:56:00.506710868 +0000 UTC m=+0.130282746 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct 14 09:56:00 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:56:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:01.761 2 DEBUG nova.compute.manager [req-ad309367-39dd-4f0d-abcb-05bb92bf0f36 req-0fac9fbc-3de4-4a31-b591-718e6f9f1a29 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received event network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 09:56:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:01.761 2 DEBUG oslo_concurrency.lockutils [req-ad309367-39dd-4f0d-abcb-05bb92bf0f36 req-0fac9fbc-3de4-4a31-b591-718e6f9f1a29 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:56:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:01.762 2 DEBUG oslo_concurrency.lockutils [req-ad309367-39dd-4f0d-abcb-05bb92bf0f36 req-0fac9fbc-3de4-4a31-b591-718e6f9f1a29 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:56:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:01.762 2 DEBUG oslo_concurrency.lockutils [req-ad309367-39dd-4f0d-abcb-05bb92bf0f36 req-0fac9fbc-3de4-4a31-b591-718e6f9f1a29 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:56:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:01.762 2 DEBUG nova.compute.manager [req-ad309367-39dd-4f0d-abcb-05bb92bf0f36 req-0fac9fbc-3de4-4a31-b591-718e6f9f1a29 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] No waiting events found dispatching network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 14 09:56:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:01.763 2 WARNING nova.compute.manager [req-ad309367-39dd-4f0d-abcb-05bb92bf0f36 req-0fac9fbc-3de4-4a31-b591-718e6f9f1a29 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received unexpected event network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 for instance with vm_state stopped and task_state None.
Oct 14 09:56:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:56:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:56:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:56:01 np0005486759.ooo.test podman[311066]: 2025-10-14 09:56:01.871026901 +0000 UTC m=+0.085838244 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:56:01 np0005486759.ooo.test podman[311066]: 2025-10-14 09:56:01.905413965 +0000 UTC m=+0.120225298 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:56:01 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:56:01 np0005486759.ooo.test podman[311068]: 2025-10-14 09:56:01.921768096 +0000 UTC m=+0.130894264 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 09:56:01 np0005486759.ooo.test podman[311068]: 2025-10-14 09:56:01.934731923 +0000 UTC m=+0.143858151 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:56:01 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:56:01 np0005486759.ooo.test podman[311067]: 2025-10-14 09:56:01.982895471 +0000 UTC m=+0.192543205 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:56:01 np0005486759.ooo.test podman[311067]: 2025-10-14 09:56:01.992140984 +0000 UTC m=+0.201788688 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:56:02 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:56:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:02.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:02.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.815 2 DEBUG nova.compute.manager [None req-97a0351d-4110-41e1-867c-079029f76529 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server [None req-97a0351d-4110-41e1-867c-079029f76529 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     self.force_reraise()
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     raise self.value
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     self.force_reraise()
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     raise self.value
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Oct 14 09:56:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:03.842 2 ERROR oslo_messaging.rpc.server 
Oct 14 09:56:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:07.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:07.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:56:08 np0005486759.ooo.test podman[311126]: 2025-10-14 09:56:08.461292969 +0000 UTC m=+0.087086911 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 09:56:08 np0005486759.ooo.test podman[311126]: 2025-10-14 09:56:08.571308162 +0000 UTC m=+0.197102024 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 09:56:08 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:56:09 np0005486759.ooo.test systemd[1]: tmp-crun.2dU2Jz.mount: Deactivated successfully.
Oct 14 09:56:09 np0005486759.ooo.test systemd[1]: libpod-465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777.scope: Deactivated successfully.
Oct 14 09:56:09 np0005486759.ooo.test podman[311019]: 2025-10-14 09:56:09.063840305 +0000 UTC m=+10.081340021 container died 465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, release=1, maintainer=OpenStack TripleO Team, version=17.1.9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.33.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 09:56:09 np0005486759.ooo.test podman[311019]: 2025-10-14 09:56:09.185842766 +0000 UTC m=+10.203342462 container cleanup 465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, vcs-type=git, io.openshift.expose-services=, release=1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/agreements, version=17.1.9, tcib_managed=true, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, build-date=2025-07-21T16:28:53, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20250721.1, io.openshift.tags=rhosp osp openstack osp-17.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Oct 14 09:56:09 np0005486759.ooo.test podman[311153]: 2025-10-14 09:56:09.205310723 +0000 UTC m=+0.136307021 container cleanup 465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, release=1, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, batch=17.1_20250721.1, io.buildah.version=1.33.12, maintainer=OpenStack TripleO Team, version=17.1.9, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1, build-date=2025-07-21T16:28:53, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Oct 14 09:56:09 np0005486759.ooo.test systemd[1]: libpod-conmon-465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777.scope: Deactivated successfully.
Oct 14 09:56:09 np0005486759.ooo.test podman[311170]: 2025-10-14 09:56:09.296433026 +0000 UTC m=+0.084577114 container remove 465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=6abf7c351fd73f1a4e60437aa721e00f9a9d02d3, build-date=2025-07-21T16:28:53, io.openshift.tags=rhosp osp openstack osp-17.1, release=1, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, version=17.1.9, batch=17.1_20250721.1, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhosp17/openstack-neutron-metadata-agent-ovn/images/17.1.9-1)
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.301 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[ded53003-286c-4341-9521-87087a41b0b0]: (4, ('Tue Oct 14 09:55:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49 (465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777)\n465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777\nTue Oct 14 09:56:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49 (465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777)\n465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777\n', 'time="2025-10-14T09:56:09Z" level=warning msg="StopSignal SIGTERM failed to stop container neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49 in 10 seconds, resorting to SIGKILL"\n', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.303 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[f7dbe2f0-1630-4e32-80c5-a1b6c2e4ca09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.304 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9197abc5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:09.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:09 np0005486759.ooo.test kernel: device tap9197abc5-00 left promiscuous mode
Oct 14 09:56:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:09.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.324 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[84fda202-d439-438e-993c-ea25964f4c7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.338 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[2d720bb7-6b99-4ad3-8d36-3d6cd569a22f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.339 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[36797d69-f2e7-4b1b-a078-e9979a1dad79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.352 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce58bf6-4b2a-4f1e-aa1b-c754378d0708]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754096, 'reachable_time': 32117, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311193, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.361 183464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.362 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[edf637da-2896-40aa-8aac-af06cf1c92a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.363 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 09:56:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:09.364 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-34874d58ed9ea56da0acf8f63c3cf2d1b1b4035503c3f2994c6596fd2507826e-merged.mount: Deactivated successfully.
Oct 14 09:56:09 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-465b5104787f13c0185cf9cdbd8eaa456808dab5da5f280fd2fc8aafd0d0b777-userdata-shm.mount: Deactivated successfully.
Oct 14 09:56:09 np0005486759.ooo.test systemd[1]: run-netns-ovnmeta\x2d9197abc5\x2d07db\x2d4abf\x2d9578\x2d9360b49aea49.mount: Deactivated successfully.
Oct 14 09:56:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:56:10 np0005486759.ooo.test podman[311195]: 2025-10-14 09:56:10.462295663 +0000 UTC m=+0.089630108 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, managed_by=edpm_ansible)
Oct 14 09:56:10 np0005486759.ooo.test podman[311195]: 2025-10-14 09:56:10.479319356 +0000 UTC m=+0.106653731 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, 
container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal)
Oct 14 09:56:10 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:56:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:12.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:56:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:56:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:56:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 124011 "" "Go-http-client/1.1"
Oct 14 09:56:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:56:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 14757 "" "Go-http-client/1.1"
Oct 14 09:56:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:12.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:56:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:56:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:56:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:56:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:56:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:56:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:56:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:14.134 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760435759.1334612, 4408214d-dae5-4452-92e9-eb4abd6589d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 14 09:56:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:14.135 2 INFO nova.compute.manager [-] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] VM Stopped (Lifecycle Event)
Oct 14 09:56:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:14.166 2 DEBUG nova.compute.manager [None req-06cab358-1f2f-424b-89dc-a2ae9123a320 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:56:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:14.170 2 DEBUG nova.compute.manager [None req-06cab358-1f2f-424b-89dc-a2ae9123a320 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 14 09:56:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:15.185 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:15.186 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:15.187 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:56:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:15.187 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:56:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:17.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:17.148 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:56:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:17.149 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:56:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:17.149 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:56:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:17.149 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:56:17 np0005486759.ooo.test systemd[1]: tmp-crun.DAGM6z.mount: Deactivated successfully.
Oct 14 09:56:17 np0005486759.ooo.test podman[311215]: 2025-10-14 09:56:17.449584746 +0000 UTC m=+0.081939484 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 14 09:56:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:17.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:17 np0005486759.ooo.test podman[311215]: 2025-10-14 09:56:17.4582204 +0000 UTC m=+0.090575148 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 09:56:17 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.213 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.238 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.239 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.239 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.240 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.240 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.241 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.241 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.242 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.258 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Triggering sync for uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.259 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.259 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.260 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.260 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.261 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.298 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.304 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.304 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.305 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.305 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.360 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.416 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.417 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.489 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.490 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.556 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.557 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.632 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.825 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.826 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12887MB free_disk=386.7154769897461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.826 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.827 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.899 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.899 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.899 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.948 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.963 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.986 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.986 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:56:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:18.987 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:56:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46632 DF PROTO=TCP SPT=60546 DPT=9102 SEQ=2920774483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9CF5A70000000001030307) 
Oct 14 09:56:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:56:20 np0005486759.ooo.test podman[311246]: 2025-10-14 09:56:20.453543131 +0000 UTC m=+0.083970985 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:56:20 np0005486759.ooo.test podman[311246]: 2025-10-14 09:56:20.459612137 +0000 UTC m=+0.090040021 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:56:20 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:56:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46633 DF PROTO=TCP SPT=60546 DPT=9102 SEQ=2920774483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9CF9C10000000001030307) 
Oct 14 09:56:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:22.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:22.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46634 DF PROTO=TCP SPT=60546 DPT=9102 SEQ=2920774483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9D01C20000000001030307) 
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.441 2 DEBUG nova.compute.manager [None req-1d701eb9-353e-46c7-981f-eb3600374169 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server [None req-1d701eb9-353e-46c7-981f-eb3600374169 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     self.force_reraise()
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     raise self.value
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     self.force_reraise()
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     raise self.value
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Oct 14 09:56:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:23.466 2 ERROR oslo_messaging.rpc.server 
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.449 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.451 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.453 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.454 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.454 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.455 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.456 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.458 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.459 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.459 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.460 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.460 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.461 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.461 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.462 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.463 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.463 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.464 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.465 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.465 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.467 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.467 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.468 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.468 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.469 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.469 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.469 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.470 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.470 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.471 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.471 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.472 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.472 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.473 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.474 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:56:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:56:24.474 12 DEBUG ceilometer.compute.pollsters [-] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000001, id=4408214d-dae5-4452-92e9-eb4abd6589d4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct 14 09:56:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46635 DF PROTO=TCP SPT=60546 DPT=9102 SEQ=2920774483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9D11810000000001030307) 
Oct 14 09:56:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:27.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:27.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:28 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:56:28Z|00043|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.298 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'flavor' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.331 2 DEBUG oslo_concurrency.lockutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.332 2 DEBUG oslo_concurrency.lockutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.332 2 DEBUG nova.network.neutron [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.333 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:56:31 np0005486759.ooo.test podman[311269]: 2025-10-14 09:56:31.449088153 +0000 UTC m=+0.072593697 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:56:31 np0005486759.ooo.test podman[311269]: 2025-10-14 09:56:31.462356561 +0000 UTC m=+0.085862075 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:56:31 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.784 2 DEBUG nova.network.neutron [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.808 2 DEBUG oslo_concurrency.lockutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.852 2 INFO nova.virt.libvirt.driver [-] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Instance destroyed successfully.
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.853 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.875 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'resources' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.901 2 DEBUG nova.virt.libvirt.vif [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='np0005486759.ooo.test',hostname='test',id=1,image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-14T08:45:21Z,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005486759.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='8bf64e81a4214f9490d231a2e79ab3d8',ramdisk_id='',reservation_id='r-8vq1axpu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-14T09:55:59Z,user_data=None,user_id='2aff2e6f927a42b1b822d05cd9349762',uuid=4408214d-dae5-4452-92e9-eb4abd6589d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.902 2 DEBUG nova.network.os_vif_util [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Converting VIF {"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.903 2 DEBUG nova.network.os_vif_util [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.904 2 DEBUG os_vif [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.907 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee08de8-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.914 2 INFO os_vif [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9')
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.917 2 DEBUG nova.virt.libvirt.host [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.917 2 INFO nova.virt.libvirt.host [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] UEFI support detected
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.922 2 DEBUG nova.virt.libvirt.driver [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Start _get_guest_xml network_info=[{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=d8afae20-8860-4649-9226-11ff3fdf8072,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}], 'ephemerals': [{'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vdb', 'size': 1}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.925 2 WARNING nova.virt.libvirt.driver [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.926 2 DEBUG nova.virt.libvirt.host [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Searching host: 'np0005486759.ooo.test' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.927 2 DEBUG nova.virt.libvirt.host [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.928 2 DEBUG nova.virt.libvirt.host [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Searching host: 'np0005486759.ooo.test' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.929 2 DEBUG nova.virt.libvirt.host [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.929 2 DEBUG nova.virt.libvirt.driver [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.929 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-14T08:44:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='0bdb7446-7a7f-4e51-8a88-180de2e09857',id=1,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=d8afae20-8860-4649-9226-11ff3fdf8072,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.930 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.930 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.930 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.930 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.930 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.930 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.931 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.931 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.931 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.931 2 DEBUG nova.virt.hardware [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.931 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.947 2 DEBUG nova.privsep.utils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.948 2 DEBUG nova.virt.libvirt.vif [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='np0005486759.ooo.test',hostname='test',id=1,image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-14T08:45:21Z,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005486759.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='8bf64e81a4214f9490d231a2e79ab3d8',ramdisk_id='',reservation_id='r-8vq1axpu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project
_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-14T09:55:59Z,user_data=None,user_id='2aff2e6f927a42b1b822d05cd9349762',uuid=4408214d-dae5-4452-92e9-eb4abd6589d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.948 2 DEBUG nova.network.os_vif_util [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Converting VIF {"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.949 2 DEBUG nova.network.os_vif_util [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.949 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.963 2 DEBUG nova.virt.libvirt.driver [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] End _get_guest_xml xml=<domain type="kvm">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <uuid>4408214d-dae5-4452-92e9-eb4abd6589d4</uuid>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <name>instance-00000001</name>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <memory>524288</memory>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <vcpu>1</vcpu>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <metadata>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <nova:name>test</nova:name>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <nova:creationTime>2025-10-14 09:56:31</nova:creationTime>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <nova:flavor name="m1.small">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:memory>512</nova:memory>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:disk>1</nova:disk>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:swap>0</nova:swap>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:ephemeral>1</nova:ephemeral>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:vcpus>1</nova:vcpus>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       </nova:flavor>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <nova:owner>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:user uuid="2aff2e6f927a42b1b822d05cd9349762">admin</nova:user>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:project uuid="8bf64e81a4214f9490d231a2e79ab3d8">admin</nova:project>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       </nova:owner>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <nova:root type="image" uuid="d8afae20-8860-4649-9226-11ff3fdf8072"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <nova:ports>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         <nova:port uuid="eee08de8-f983-4ebe-a654-f67f48659e50">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:           <nova:ip type="fixed" address="192.168.0.173" ipVersion="4"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:         </nova:port>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       </nova:ports>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </nova:instance>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   </metadata>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <sysinfo type="smbios">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <system>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <entry name="manufacturer">RDO</entry>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <entry name="product">OpenStack Compute</entry>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <entry name="serial">4408214d-dae5-4452-92e9-eb4abd6589d4</entry>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <entry name="uuid">4408214d-dae5-4452-92e9-eb4abd6589d4</entry>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <entry name="family">Virtual Machine</entry>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </system>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   </sysinfo>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <os>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <type arch="x86_64" machine="pc-q35-rhel9.0.0">hvm</type>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <boot dev="hd"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <smbios mode="sysinfo"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   </os>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <features>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <acpi/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <apic/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <vmcoreinfo/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   </features>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <clock offset="utc">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <timer name="pit" tickpolicy="delay"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <timer name="hpet" present="no"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   </clock>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <cpu mode="host-model" match="exact">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <topology sockets="1" cores="1" threads="1"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   </cpu>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   <devices>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <disk type="file" device="disk">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <source file="/var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <target dev="vda" bus="virtio"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <disk type="file" device="disk">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <source file="/var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <target dev="vdb" bus="virtio"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <interface type="ethernet">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <mac address="fa:16:3e:8e:cf:16"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <model type="virtio"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <driver name="vhost" rx_queue_size="512"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <mtu size="1292"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <target dev="tapeee08de8-f9"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </interface>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <serial type="pty">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <log file="/var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/console.log" append="off"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </serial>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <video>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <model type="virtio"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </video>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <input type="tablet" bus="usb"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <input type="keyboard" bus="usb"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <rng model="virtio">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <backend model="random">/dev/urandom</backend>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </rng>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <controller type="usb" index="0"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     <memballoon model="virtio">
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:       <stats period="10"/>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:     </memballoon>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:   </devices>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: </domain>
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 14 09:56:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:31.964 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.040 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.041 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.098 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.100 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.176 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.177 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.234 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.236 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.249 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d4dee7ea20c47bbf691f78ae3efd9dd29eccd913 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.290 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d4dee7ea20c47bbf691f78ae3efd9dd29eccd913 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.291 2 DEBUG nova.virt.disk.api [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Checking if we can resize image /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.292 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.335 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.336 2 DEBUG nova.virt.disk.api [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Cannot resize image /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.337 2 DEBUG nova.objects.instance [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.362 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:56:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:56:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:56:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.419 2 DEBUG oslo_concurrency.processutils [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.422 2 DEBUG nova.virt.libvirt.vif [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T08:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='np0005486759.ooo.test',hostname='test',id=1,image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-14T08:45:21Z,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='np0005486759.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='8bf64e81a4214f9490d231a2e79ab3d8',ramdisk_id='',reservation_id='r-8vq1axpu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='d8afae20-8860-4649-9226-11ff3fdf8072',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-10-14T09:55:59Z,user_data=None,user_id='2aff2e6f927a42b1b822d05cd9349762',uuid=4408214d-dae5-4452-92e9-eb4abd6589d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.423 2 DEBUG nova.network.os_vif_util [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Converting VIF {"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.424 2 DEBUG nova.network.os_vif_util [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.425 2 DEBUG os_vif [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.431 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.432 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.435 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee08de8-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeee08de8-f9, col_values=(('external_ids', {'iface-id': 'eee08de8-f983-4ebe-a654-f67f48659e50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:cf:16', 'vm-uuid': '4408214d-dae5-4452-92e9-eb4abd6589d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.489 2 INFO os_vif [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:cf:16,bridge_name='br-int',has_traffic_filtering=True,id=eee08de8-f983-4ebe-a654-f67f48659e50,network=Network(9197abc5-07db-4abf-9578-9360b49aea49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee08de8-f9')
Oct 14 09:56:32 np0005486759.ooo.test podman[311308]: 2025-10-14 09:56:32.521750533 +0000 UTC m=+0.139749946 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:56:32 np0005486759.ooo.test kernel: device tapeee08de8-f9 entered promiscuous mode
Oct 14 09:56:32 np0005486759.ooo.test podman[311307]: 2025-10-14 09:56:32.504572507 +0000 UTC m=+0.125211111 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 09:56:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:56:32Z|00044|binding|INFO|Claiming lport eee08de8-f983-4ebe-a654-f67f48659e50 for this chassis.
Oct 14 09:56:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:56:32Z|00045|binding|INFO|eee08de8-f983-4ebe-a654-f67f48659e50: Claiming fa:16:3e:8e:cf:16 192.168.0.173
Oct 14 09:56:32 np0005486759.ooo.test systemd-udevd[311378]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 09:56:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760435792.5669] manager: (tapeee08de8-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/15)
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.575 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:cf:16 192.168.0.173'], port_security=['fa:16:3e:8e:cf:16 192.168.0.173'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.173/24', 'neutron:device_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9197abc5-07db-4abf-9578-9360b49aea49', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'faabc66a-aada-4f6b-bec3-989808c74b8e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62cfbaba-fb96-4812-8b41-6ad8964122a3, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=eee08de8-f983-4ebe-a654-f67f48659e50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.577 183328 INFO neutron.agent.ovn.metadata.agent [-] Port eee08de8-f983-4ebe-a654-f67f48659e50 in datapath 9197abc5-07db-4abf-9578-9360b49aea49 bound to our chassis
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.579 183328 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 09:56:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:56:32Z|00046|binding|INFO|Setting lport eee08de8-f983-4ebe-a654-f67f48659e50 ovn-installed in OVS
Oct 14 09:56:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:56:32Z|00047|binding|INFO|Setting lport eee08de8-f983-4ebe-a654-f67f48659e50 up in Southbound
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.587 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[ee58e28a-bddc-4b6a-8735-625f9e31c810]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.589 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9197abc5-01 in ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 14 09:56:32 np0005486759.ooo.test podman[311309]: 2025-10-14 09:56:32.589396077 +0000 UTC m=+0.202152149 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251009, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3)
Oct 14 09:56:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760435792.5942] device (tapeee08de8-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 09:56:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760435792.5952] device (tapeee08de8-f9): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.597 183433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9197abc5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.597 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[33095e48-c995-4c81-9017-0a46d947135d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.598 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[bd156316-9e39-402b-b35b-b85465e467dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test podman[311309]: 2025-10-14 09:56:32.600534129 +0000 UTC m=+0.213290201 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.607 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[eb89800b-a705-4932-933c-c7006b5ff395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:56:32 np0005486759.ooo.test systemd-machined[93972]: New machine qemu-2-instance-00000001.
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.629 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[1b38c2cd-1fe8-4cba-b2af-8326f8d0c121]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test podman[311307]: 2025-10-14 09:56:32.640908276 +0000 UTC m=+0.261546910 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:56:32 np0005486759.ooo.test systemd[1]: Started Virtual Machine qemu-2-instance-00000001.
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.646 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[d7291e20-25aa-4c44-8434-a13689911333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.650 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[484c2b10-b81d-4a62-ba49-18f7fa9f608a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test systemd-udevd[311380]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 09:56:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760435792.6524] manager: (tap9197abc5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/16)
Oct 14 09:56:32 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:56:32 np0005486759.ooo.test podman[311308]: 2025-10-14 09:56:32.656025371 +0000 UTC m=+0.274024774 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:56:32 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.678 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[549f89fe-b94d-4f95-976d-ba7b53db79c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.681 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8d3ae6-7328-4144-8138-04a7b7e3c127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9197abc5-01: link becomes ready
Oct 14 09:56:32 np0005486759.ooo.test kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9197abc5-00: link becomes ready
Oct 14 09:56:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760435792.6989] device (tap9197abc5-00): carrier: link connected
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.703 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[bc573248-9589-4149-bc09-c2092838aebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.715 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd2a345-fd1a-4d46-8871-3455dc11c1de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9197abc5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d6:b0:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1181282, 'reachable_time': 16492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311419, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.729 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfa259f-3c40-4acd-8b0a-683522bc22c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:b0ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1181282, 'tstamp': 1181282}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 311424, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.741 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[77357a03-2e35-4fa5-b147-bbf043cbed13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9197abc5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d6:b0:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1181282, 'reachable_time': 16492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 311427, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.764 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[beefcf59-4629-411d-a048-01a6d1ec9c1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.816 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[3dceae26-ff50-4129-9e0f-40f73cdc51ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.817 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9197abc5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.817 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.818 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9197abc5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test kernel: device tap9197abc5-00 entered promiscuous mode
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.824 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9197abc5-00, col_values=(('external_ids', {'iface-id': '25844137-067c-4137-b11d-9fc6e75f59fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:56:32Z|00048|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 09:56:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:32.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.832 183328 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.833 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[f27bdcbb-5cdb-4e3c-9a37-7b30144f24b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.834 183328 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: global
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     log         /dev/log local0 debug
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     log-tag     haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     user        root
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     group       root
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     maxconn     1024
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     pidfile     /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     daemon
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: defaults
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     log global
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     mode http
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     option httplog
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     option dontlognull
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     option http-server-close
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     option forwardfor
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     retries                 3
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-request    30s
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout connect         30s
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout client          32s
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout server          32s
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-keep-alive 30s
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: listen listener
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     bind 169.254.169.254:80
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     server metadata /var/lib/neutron/metadata_proxy
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:     http-request add-header X-OVN-Network-ID 9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 14 09:56:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:32.835 183328 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'env', 'PROCESS_TAG=haproxy-9197abc5-07db-4abf-9578-9360b49aea49', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 14 09:56:32 np0005486759.ooo.test snmpd[52493]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB.
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.097 2 DEBUG nova.virt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Emitting event <LifecycleEvent: 1760435793.0967178, 4408214d-dae5-4452-92e9-eb4abd6589d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.097 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] VM Resumed (Lifecycle Event)
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.101 2 DEBUG nova.compute.manager [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.124 2 INFO nova.virt.libvirt.driver [-] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Instance rebooted successfully.
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.124 2 DEBUG nova.compute.manager [None req-152e7780-7a40-4d50-b5d4-4cf963829da6 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.129 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.137 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.170 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] During sync_power_state the instance has a pending task (powering-on). Skip.
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.170 2 DEBUG nova.virt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Emitting event <LifecycleEvent: 1760435793.1020024, 4408214d-dae5-4452-92e9-eb4abd6589d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.171 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] VM Started (Lifecycle Event)
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.211 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.214 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 14 09:56:33 np0005486759.ooo.test podman[311457]: 
Oct 14 09:56:33 np0005486759.ooo.test podman[311457]: 2025-10-14 09:56:33.245317289 +0000 UTC m=+0.084438370 container create a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 09:56:33 np0005486759.ooo.test systemd[1]: Started libpod-conmon-a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc.scope.
Oct 14 09:56:33 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:56:33 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7df68e89a5a3091b7ccd00355d773dc81bad90001ad9c4fbaf3a8bf1ac010bfb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 09:56:33 np0005486759.ooo.test podman[311457]: 2025-10-14 09:56:33.309717494 +0000 UTC m=+0.148838585 container init a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 09:56:33 np0005486759.ooo.test podman[311457]: 2025-10-14 09:56:33.209761799 +0000 UTC m=+0.048882870 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 09:56:33 np0005486759.ooo.test podman[311457]: 2025-10-14 09:56:33.316715109 +0000 UTC m=+0.155836190 container start a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.316 2 DEBUG nova.compute.manager [req-b7b4007c-c989-4193-a27b-6000dafda895 req-8e5d733d-c980-48f9-877b-368141b92565 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received event network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.317 2 DEBUG oslo_concurrency.lockutils [req-b7b4007c-c989-4193-a27b-6000dafda895 req-8e5d733d-c980-48f9-877b-368141b92565 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.317 2 DEBUG oslo_concurrency.lockutils [req-b7b4007c-c989-4193-a27b-6000dafda895 req-8e5d733d-c980-48f9-877b-368141b92565 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.318 2 DEBUG oslo_concurrency.lockutils [req-b7b4007c-c989-4193-a27b-6000dafda895 req-8e5d733d-c980-48f9-877b-368141b92565 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.318 2 DEBUG nova.compute.manager [req-b7b4007c-c989-4193-a27b-6000dafda895 req-8e5d733d-c980-48f9-877b-368141b92565 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] No waiting events found dispatching network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 14 09:56:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:33.319 2 WARNING nova.compute.manager [req-b7b4007c-c989-4193-a27b-6000dafda895 req-8e5d733d-c980-48f9-877b-368141b92565 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received unexpected event network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 for instance with vm_state active and task_state None.
Oct 14 09:56:33 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[311470]: [NOTICE]   (311474) : New worker (311476) forked
Oct 14 09:56:33 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[311470]: [NOTICE]   (311474) : Loading success.
Oct 14 09:56:33 np0005486759.ooo.test systemd[1]: tmp-crun.5iPnRH.mount: Deactivated successfully.
Oct 14 09:56:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:35.368 2 DEBUG nova.compute.manager [req-c3b8285b-ec36-4d91-b894-4a01f08d11cc req-304caad7-6fd0-446b-91bb-bc12ff080636 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received event network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 09:56:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:35.368 2 DEBUG oslo_concurrency.lockutils [req-c3b8285b-ec36-4d91-b894-4a01f08d11cc req-304caad7-6fd0-446b-91bb-bc12ff080636 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:56:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:35.369 2 DEBUG oslo_concurrency.lockutils [req-c3b8285b-ec36-4d91-b894-4a01f08d11cc req-304caad7-6fd0-446b-91bb-bc12ff080636 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:56:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:35.369 2 DEBUG oslo_concurrency.lockutils [req-c3b8285b-ec36-4d91-b894-4a01f08d11cc req-304caad7-6fd0-446b-91bb-bc12ff080636 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:56:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:35.369 2 DEBUG nova.compute.manager [req-c3b8285b-ec36-4d91-b894-4a01f08d11cc req-304caad7-6fd0-446b-91bb-bc12ff080636 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] No waiting events found dispatching network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 14 09:56:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:35.369 2 WARNING nova.compute.manager [req-c3b8285b-ec36-4d91-b894-4a01f08d11cc req-304caad7-6fd0-446b-91bb-bc12ff080636 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Received unexpected event network-vif-plugged-eee08de8-f983-4ebe-a654-f67f48659e50 for instance with vm_state active and task_state None.
Oct 14 09:56:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:37.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:37.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:38.254 2 DEBUG nova.compute.manager [None req-f323739d-f2a4-4755-9767-391f1e6a62af 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 09:56:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:38.258 2 INFO nova.compute.manager [None req-f323739d-f2a4-4755-9767-391f1e6a62af 2aff2e6f927a42b1b822d05cd9349762 8bf64e81a4214f9490d231a2e79ab3d8 - - default default] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Retrieving diagnostics
Oct 14 09:56:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:56:39 np0005486759.ooo.test podman[311486]: 2025-10-14 09:56:39.464811529 +0000 UTC m=+0.089470594 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251009)
Oct 14 09:56:39 np0005486759.ooo.test podman[311486]: 2025-10-14 09:56:39.531009469 +0000 UTC m=+0.155668504 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 14 09:56:39 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:56:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:56:41 np0005486759.ooo.test podman[311509]: 2025-10-14 09:56:41.452490655 +0000 UTC m=+0.079208609 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Oct 14 09:56:41 np0005486759.ooo.test podman[311509]: 2025-10-14 09:56:41.470390124 +0000 UTC m=+0.097108078 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Oct 14 09:56:41 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:56:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:42.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:42.155 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:56:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:42.158 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 09:56:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:42.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:56:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:56:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:56:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 125198 "" "Go-http-client/1.1"
Oct 14 09:56:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:56:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15240 "" "Go-http-client/1.1"
Oct 14 09:56:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:42.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:56:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:56:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:56:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:56:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:56:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:56:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:56:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:56:44 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:56:44Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:cf:16 192.168.0.173
Oct 14 09:56:45 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:45.161 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:56:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:47.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:47.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:56:48 np0005486759.ooo.test podman[311540]: 2025-10-14 09:56:48.475316627 +0000 UTC m=+0.095231281 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:56:48 np0005486759.ooo.test podman[311540]: 2025-10-14 09:56:48.507657168 +0000 UTC m=+0.127571802 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Oct 14 09:56:48 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:56:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42494 DF PROTO=TCP SPT=57968 DPT=9102 SEQ=2614531953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9D6AD80000000001030307) 
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:49.823 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:49.825 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:49 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42495 DF PROTO=TCP SPT=57968 DPT=9102 SEQ=2614531953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9D6EC20000000001030307) 
Oct 14 09:56:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:56:51 np0005486759.ooo.test systemd[1]: tmp-crun.zUg9Qd.mount: Deactivated successfully.
Oct 14 09:56:51 np0005486759.ooo.test podman[311558]: 2025-10-14 09:56:51.468584756 +0000 UTC m=+0.094519438 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:56:51 np0005486759.ooo.test podman[311558]: 2025-10-14 09:56:51.481467201 +0000 UTC m=+0.107401933 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:56:51 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:52.210 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:52.211 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 2.3864408
Oct 14 09:56:52 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32894 [14/Oct/2025:09:56:49.821] listener listener/metadata 0/0/0/2389/2389 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:52.226 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:52.227 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:52 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:52.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:52.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42496 DF PROTO=TCP SPT=57968 DPT=9102 SEQ=2614531953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9D76C10000000001030307) 
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.162 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.162 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.163 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.185 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404  len: 297 time: 1.9582269
Oct 14 09:56:54 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32906 [14/Oct/2025:09:56:52.225] listener listener/metadata 0/0/0/1959/1959 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.204 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.205 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.453 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.455 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.2507286
Oct 14 09:56:54 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32908 [14/Oct/2025:09:56:54.203] listener listener/metadata 0/0/0/252/252 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.465 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:54.466 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:54 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.157 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:55 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32914 [14/Oct/2025:09:56:54.464] listener listener/metadata 0/0/0/693/693 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.158 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.6921799
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.169 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.169 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.307 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.308 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 143 time: 0.1389675
Oct 14 09:56:55 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32928 [14/Oct/2025:09:56:55.168] listener listener/metadata 0/0/0/139/139 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.318 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.318 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.485 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:55 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32942 [14/Oct/2025:09:56:55.317] listener listener/metadata 0/0/0/168/168 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.485 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 149 time: 0.1669350
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.493 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.494 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.613 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:55 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32958 [14/Oct/2025:09:56:55.493] listener listener/metadata 0/0/0/121/121 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.615 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 150 time: 0.1202049
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.623 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:55.624 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:55 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.231 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.232 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.6081641
Oct 14 09:56:56 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32968 [14/Oct/2025:09:56:55.622] listener listener/metadata 0/0/0/609/609 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.240 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.242 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.442 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.443 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.2009718
Oct 14 09:56:56 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32974 [14/Oct/2025:09:56:56.240] listener listener/metadata 0/0/0/202/202 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.452 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.453 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:56 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32986 [14/Oct/2025:09:56:56.451] listener listener/metadata 0/0/0/127/127 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.578 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.1253347
Oct 14 09:56:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42497 DF PROTO=TCP SPT=57968 DPT=9102 SEQ=2614531953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9D86810000000001030307) 
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.594 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:56.595 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:56 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.150 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:57 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32996 [14/Oct/2025:09:56:56.593] listener listener/metadata 0/0/0/558/558 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.151 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 155 time: 0.5564115
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.158 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.159 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:57.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.330 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:57 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:33006 [14/Oct/2025:09:56:57.157] listener listener/metadata 0/0/0/173/173 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.331 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.1726613
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.337 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.338 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:56:57.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.492 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:57 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:33018 [14/Oct/2025:09:56:57.336] listener listener/metadata 0/0/0/156/156 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.493 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200  len: 143 time: 0.1548734
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.500 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.501 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.700 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.701 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.2002294
Oct 14 09:56:57 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:33030 [14/Oct/2025:09:56:57.499] listener listener/metadata 0/0/0/201/201 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.708 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:57.709 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:57 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:58.182 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:58 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32856 [14/Oct/2025:09:56:57.708] listener listener/metadata 0/0/0/475/475 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:58.183 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.4739299
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:58.191 183428 DEBUG eventlet.wsgi.server [-] (183428) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:58.192 183428 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: Accept: */*
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: Connection: close
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: Content-Type: text/plain
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: Host: 169.254.169.254
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: User-Agent: curl/7.84.0
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Forwarded-For: 192.168.0.173
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: X-Ovn-Network-Id: 9197abc5-07db-4abf-9578-9360b49aea49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:58.370 183428 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Oct 14 09:56:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:56:58.371 183428 INFO eventlet.wsgi.server [-] 192.168.0.173,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.1789107
Oct 14 09:56:58 np0005486759.ooo.test haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49[311476]: 192.168.0.173:32858 [14/Oct/2025:09:56:58.190] listener listener/metadata 0/0/0/180/180 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Oct 14 09:57:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:02.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:57:02 np0005486759.ooo.test podman[311582]: 2025-10-14 09:57:02.446921323 +0000 UTC m=+0.074581117 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:57:02 np0005486759.ooo.test podman[311582]: 2025-10-14 09:57:02.484357622 +0000 UTC m=+0.112017416 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 09:57:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:02.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:02 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:57:02 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:57:02Z|00049|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Oct 14 09:57:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:57:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:57:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:57:03 np0005486759.ooo.test podman[311603]: 2025-10-14 09:57:03.437724673 +0000 UTC m=+0.059813654 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009)
Oct 14 09:57:03 np0005486759.ooo.test podman[311603]: 2025-10-14 09:57:03.44937188 +0000 UTC m=+0.071460931 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 09:57:03 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:57:03 np0005486759.ooo.test podman[311602]: 2025-10-14 09:57:03.50055459 +0000 UTC m=+0.122191097 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:57:03 np0005486759.ooo.test systemd[1]: tmp-crun.f0CalZ.mount: Deactivated successfully.
Oct 14 09:57:03 np0005486759.ooo.test podman[311604]: 2025-10-14 09:57:03.515841709 +0000 UTC m=+0.131145392 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3)
Oct 14 09:57:03 np0005486759.ooo.test podman[311604]: 2025-10-14 09:57:03.521470151 +0000 UTC m=+0.136773854 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:57:03 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:57:03 np0005486759.ooo.test podman[311602]: 2025-10-14 09:57:03.535481851 +0000 UTC m=+0.157118328 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:57:03 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:57:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:07.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:07.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:57:10 np0005486759.ooo.test podman[311664]: 2025-10-14 09:57:10.444170944 +0000 UTC m=+0.072714471 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:57:10 np0005486759.ooo.test podman[311664]: 2025-10-14 09:57:10.478202757 +0000 UTC m=+0.106746244 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 09:57:10 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:57:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:57:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:57:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:12.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:57:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 125198 "" "Go-http-client/1.1"
Oct 14 09:57:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:57:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:57:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15236 "" "Go-http-client/1.1"
Oct 14 09:57:12 np0005486759.ooo.test systemd[1]: tmp-crun.1AqTfv.mount: Deactivated successfully.
Oct 14 09:57:12 np0005486759.ooo.test podman[311690]: 2025-10-14 09:57:12.442662501 +0000 UTC m=+0.068825261 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 09:57:12 np0005486759.ooo.test podman[311690]: 2025-10-14 09:57:12.449985626 +0000 UTC m=+0.076148406 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, release=1755695350, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=)
Oct 14 09:57:12 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:57:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:12.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:57:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:57:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:57:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:57:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:57:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:57:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.011 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.012 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.032 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.032 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.033 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.161 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.162 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.162 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:57:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:15.162 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.294 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.318 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.319 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.319 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.320 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.320 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.321 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.321 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.322 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.322 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.322 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.345 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.345 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.345 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.346 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.413 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.487 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.488 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.543 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.544 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.619 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.620 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.695 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.918 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.920 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12665MB free_disk=386.71138763427734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.920 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:57:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:16.921 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.016 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.017 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.017 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.065 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.085 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.119 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.120 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:17.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:57:19 np0005486759.ooo.test podman[311724]: 2025-10-14 09:57:19.454270889 +0000 UTC m=+0.081689835 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:57:19 np0005486759.ooo.test podman[311724]: 2025-10-14 09:57:19.464354099 +0000 UTC m=+0.091773045 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:57:19 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:57:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21888 DF PROTO=TCP SPT=60542 DPT=9102 SEQ=2772892384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9DE0090000000001030307) 
Oct 14 09:57:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21889 DF PROTO=TCP SPT=60542 DPT=9102 SEQ=2772892384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9DE4010000000001030307) 
Oct 14 09:57:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:22.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:57:22 np0005486759.ooo.test podman[311743]: 2025-10-14 09:57:22.49523435 +0000 UTC m=+0.077491417 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:57:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:22.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:22 np0005486759.ooo.test podman[311743]: 2025-10-14 09:57:22.507367452 +0000 UTC m=+0.089624539 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:57:22 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:57:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21890 DF PROTO=TCP SPT=60542 DPT=9102 SEQ=2772892384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9DEC010000000001030307) 
Oct 14 09:57:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 09:57:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 09:57:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21891 DF PROTO=TCP SPT=60542 DPT=9102 SEQ=2772892384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9DFBC10000000001030307) 
Oct 14 09:57:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:27.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:27.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:32.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:32.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:57:33 np0005486759.ooo.test podman[311766]: 2025-10-14 09:57:33.459318808 +0000 UTC m=+0.084935296 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:57:33 np0005486759.ooo.test podman[311766]: 2025-10-14 09:57:33.49853614 +0000 UTC m=+0.124152658 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: tmp-crun.vHFY90.mount: Deactivated successfully.
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:57:33 np0005486759.ooo.test podman[311786]: 2025-10-14 09:57:33.582601158 +0000 UTC m=+0.088778273 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:57:33 np0005486759.ooo.test podman[311786]: 2025-10-14 09:57:33.623432939 +0000 UTC m=+0.129610014 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:57:33 np0005486759.ooo.test podman[311807]: 2025-10-14 09:57:33.672826514 +0000 UTC m=+0.078016033 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:57:33 np0005486759.ooo.test podman[311806]: 2025-10-14 09:57:33.727705026 +0000 UTC m=+0.133415371 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:57:33 np0005486759.ooo.test podman[311806]: 2025-10-14 09:57:33.738221989 +0000 UTC m=+0.143932334 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:57:33 np0005486759.ooo.test podman[311807]: 2025-10-14 09:57:33.760046938 +0000 UTC m=+0.165236477 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 09:57:33 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:57:34 np0005486759.ooo.test systemd[1]: tmp-crun.hDVFaK.mount: Deactivated successfully.
Oct 14 09:57:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:37.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:37.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:57:41 np0005486759.ooo.test systemd[1]: tmp-crun.he9bJK.mount: Deactivated successfully.
Oct 14 09:57:41 np0005486759.ooo.test podman[311848]: 2025-10-14 09:57:41.445707955 +0000 UTC m=+0.074406712 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:57:41 np0005486759.ooo.test podman[311848]: 2025-10-14 09:57:41.513538966 +0000 UTC m=+0.142237753 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 14 09:57:41 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:57:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:57:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:57:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:57:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 125198 "" "Go-http-client/1.1"
Oct 14 09:57:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:57:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15240 "" "Go-http-client/1.1"
Oct 14 09:57:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:42.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:42.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:57:43 np0005486759.ooo.test podman[311873]: 2025-10-14 09:57:43.442261233 +0000 UTC m=+0.070407660 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct 14 09:57:43 np0005486759.ooo.test podman[311873]: 2025-10-14 09:57:43.478387261 +0000 UTC m=+0.106533708 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Oct 14 09:57:43 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:57:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:57:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:57:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:57:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:57:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:57:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:57:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:57:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:57:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:47.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:47.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14734 DF PROTO=TCP SPT=43974 DPT=9102 SEQ=1211380428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9E55380000000001030307) 
Oct 14 09:57:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:57:50 np0005486759.ooo.test systemd[1]: tmp-crun.5dXkiF.mount: Deactivated successfully.
Oct 14 09:57:50 np0005486759.ooo.test podman[311905]: 2025-10-14 09:57:50.448471601 +0000 UTC m=+0.081087169 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:57:50 np0005486759.ooo.test podman[311905]: 2025-10-14 09:57:50.477618614 +0000 UTC m=+0.110234233 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:57:50 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:57:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14735 DF PROTO=TCP SPT=43974 DPT=9102 SEQ=1211380428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9E59420000000001030307) 
Oct 14 09:57:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:52.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:52.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14736 DF PROTO=TCP SPT=43974 DPT=9102 SEQ=1211380428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9E61410000000001030307) 
Oct 14 09:57:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:57:53 np0005486759.ooo.test podman[311923]: 2025-10-14 09:57:53.448637797 +0000 UTC m=+0.078361246 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:57:53 np0005486759.ooo.test podman[311923]: 2025-10-14 09:57:53.457329831 +0000 UTC m=+0.087053310 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:57:53 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:57:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:57:54.162 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:57:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:57:54.163 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:57:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:57:54.163 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:57:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14737 DF PROTO=TCP SPT=43974 DPT=9102 SEQ=1211380428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9E71010000000001030307) 
Oct 14 09:57:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:57.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:57:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:57.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:57:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:57.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:57:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:57.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:57:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:57.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:57:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:57:57.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:57:58 np0005486759.ooo.test sshd[311946]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 09:57:58 np0005486759.ooo.test sshd[311946]: Accepted publickey for zuul from 38.102.83.114 port 45090 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 09:57:58 np0005486759.ooo.test systemd-logind[759]: New session 45 of user zuul.
Oct 14 09:57:58 np0005486759.ooo.test systemd[1]: Started Session 45 of User zuul.
Oct 14 09:57:58 np0005486759.ooo.test sshd[311946]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 09:57:58 np0005486759.ooo.test sudo[311966]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rojosjbnlqdwebmgzielxlqphclnehlq ; /usr/bin/python3
Oct 14 09:57:58 np0005486759.ooo.test sudo[311966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 09:57:58 np0005486759.ooo.test python3[311968]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 09:57:58 np0005486759.ooo.test subscription-manager[311969]: Unregistered machine with identity: 00d3e29c-79e6-406a-a1db-b33eef9df3e4
Oct 14 09:57:59 np0005486759.ooo.test sudo[311966]: pam_unix(sudo:session): session closed for user root
Oct 14 09:58:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:02.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:02.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:02.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:02.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:02.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:02.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:58:04 np0005486759.ooo.test podman[311972]: 2025-10-14 09:58:04.432731921 +0000 UTC m=+0.082199813 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid)
Oct 14 09:58:04 np0005486759.ooo.test podman[311971]: 2025-10-14 09:58:04.487668967 +0000 UTC m=+0.136382266 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:58:04 np0005486759.ooo.test podman[311971]: 2025-10-14 09:58:04.501293489 +0000 UTC m=+0.150006788 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:58:04 np0005486759.ooo.test podman[311974]: 2025-10-14 09:58:04.458182862 +0000 UTC m=+0.095677951 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:58:04 np0005486759.ooo.test podman[311972]: 2025-10-14 09:58:04.516251923 +0000 UTC m=+0.165719795 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:58:04 np0005486759.ooo.test podman[311974]: 2025-10-14 09:58:04.543350004 +0000 UTC m=+0.180845053 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:58:04 np0005486759.ooo.test podman[311973]: 2025-10-14 09:58:04.588428382 +0000 UTC m=+0.230691816 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible)
Oct 14 09:58:04 np0005486759.ooo.test podman[311973]: 2025-10-14 09:58:04.605501209 +0000 UTC m=+0.247764683 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct 14 09:58:04 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:58:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:07.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:07.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:07.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:07.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:07.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:07.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:58:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:58:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:58:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 125198 "" "Go-http-client/1.1"
Oct 14 09:58:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:58:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15239 "" "Go-http-client/1.1"
Oct 14 09:58:12 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:58:12 np0005486759.ooo.test podman[312051]: 2025-10-14 09:58:12.460546171 +0000 UTC m=+0.088294119 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 14 09:58:12 np0005486759.ooo.test podman[312051]: 2025-10-14 09:58:12.552371194 +0000 UTC m=+0.180119122 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:58:12 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:58:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:12.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:12.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:58:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:58:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:58:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:58:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:58:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:58:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:58:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:58:14 np0005486759.ooo.test podman[312077]: 2025-10-14 09:58:14.454694085 +0000 UTC m=+0.078833211 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git)
Oct 14 09:58:14 np0005486759.ooo.test podman[312077]: 2025-10-14 09:58:14.466941037 +0000 UTC m=+0.091080173 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, release=1755695350, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 14 09:58:14 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.122 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.123 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.123 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.123 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:17.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:18.197 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:58:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:18.197 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:58:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:18.198 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:58:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:18.198 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:58:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11440 DF PROTO=TCP SPT=47002 DPT=9102 SEQ=1528705622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9ECA680000000001030307) 
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.738 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.759 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.760 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.761 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.762 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.763 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.763 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.764 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.765 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.765 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.765 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.794 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.794 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.794 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.794 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.853 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.928 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:58:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.930 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:19.999 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.001 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.080 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.081 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.137 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.349 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.351 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12674MB free_disk=386.7125701904297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.351 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.352 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.428 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.429 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.429 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.471 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.484 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.486 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:58:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:20.486 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:58:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11441 DF PROTO=TCP SPT=47002 DPT=9102 SEQ=1528705622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9ECE810000000001030307) 
Oct 14 09:58:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:58:21 np0005486759.ooo.test podman[312110]: 2025-10-14 09:58:21.449006501 +0000 UTC m=+0.077884642 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Oct 14 09:58:21 np0005486759.ooo.test podman[312110]: 2025-10-14 09:58:21.483448745 +0000 UTC m=+0.112326846 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 09:58:21 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:58:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11442 DF PROTO=TCP SPT=47002 DPT=9102 SEQ=1528705622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9ED6810000000001030307) 
Oct 14 09:58:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:22.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:22.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:22.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:22.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:22.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:22.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:58:24 np0005486759.ooo.test podman[312130]: 2025-10-14 09:58:24.449591199 +0000 UTC m=+0.082174092 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.450 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test podman[312130]: 2025-10-14 09:58:24.456409796 +0000 UTC m=+0.088992659 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:58:24 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.485 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.486 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f24d2d2-803e-4e21-8b29-3ea1c7f1ff4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.451014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52df600c-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '2e90009fd936d408ff71b15133e4605f922b69a3128d5f20715c5a099538cdfe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.451014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52df76be-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '92ce4923b9d43cc8e4c8e9204d0d5d4ab26f7ca19d8f3dd1b5ef3d3e93576c08'}]}, 'timestamp': '2025-10-14 09:58:24.487413', '_unique_id': 'b36b7095827c46acb980c148b041c836'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.490 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.509 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31326208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.509 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bfc3d03-b309-4b03-970a-dc4e723369fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31326208, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.490544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52e2d7f0-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.685645307, 'message_signature': 'f3f0fa0506196f121eaef64459d6042436e20ef9052f437532017f1437f2c50a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 
'timestamp': '2025-10-14T09:58:24.490544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52e2ebaa-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.685645307, 'message_signature': '10a017a386d96f069a8e95d5875aa3acef31591819cdc29f6709640f1c627c45'}]}, 'timestamp': '2025-10-14 09:58:24.510107', '_unique_id': 'bf3c7643fd7142f5b01f6c2f53298a11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.511 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.512 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.513 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '594548bb-f398-4ad1-8c06-300b2ccc3f04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.512767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52e368be-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': 'b4b6c78b46e1787802f4949891d6569aa49e1fe06c4feb6fe899092ae0902052'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.512767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52e379c6-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '297368e7e98142cdc2800e87b85ea83de35254a781b363dda1c179591141484c'}]}, 'timestamp': '2025-10-14 09:58:24.513683', '_unique_id': 'bc7e04ec39e24f849b9b7ad24693693d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.514 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.515 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.519 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2b0d4f2-5656-4919-b7b6-88af733ebfcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.516017', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e481ae-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': 'b3654e7be1f7bd693363bf9c86771f3601ff2564c634378a69858c0f447babd7'}]}, 'timestamp': '2025-10-14 09:58:24.520482', '_unique_id': '2af81b92e5a04c36819e203fca06debc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.521 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6b6115a-1e7b-426c-a9db-c102f01b4214', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.522126', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e4d050-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': '8715a765d378ec02a9eb30762819ac1a0e570a5da6b709f480bd3ba92888e5fd'}]}, 'timestamp': '2025-10-14 09:58:24.522371', '_unique_id': '26ae84cd8bc544749729ec2c90f407ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.522 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a299631-5563-4535-bb16-26030809f08f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.523369', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e50016-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': 'ac2ca81f823fb6077c3dd174157c505228e2d5c6cb4a9eb2e60777249370ad68'}]}, 'timestamp': '2025-10-14 09:58:24.523576', '_unique_id': '234f198dd268406c8749ac1f42365b70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.524 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.524 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1fd26f2-5b4c-4e30-8056-0b7239012663', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.524513', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52e52cb2-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '51875d126c96b1fa6e66d225683386667e640f0c2c4963c9ccead64c92d79fb0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.524513', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52e533a6-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '211fe38303721a8336baa2bdcf5577c954713fd3c6066ee7600bd8b0ddc468d7'}]}, 'timestamp': '2025-10-14 09:58:24.524881', '_unique_id': 'b7cf5dfe9bed43f5aad521d29d406efa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.525 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57e30a2a-4a31-43f4-8c9b-bf42c2302880', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.525842', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e560a6-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': '77b35c041eaf5ea72585c96dd4af233cb193e3f880dedd9cd8101541861e2a04'}]}, 'timestamp': '2025-10-14 09:58:24.526062', '_unique_id': '8a8a1cdf61e84645bab4f226834a2da1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.526 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdbccf77-eced-4ae5-ae71-b03f3f1e6407', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.527033', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52e59044-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.685645307, 'message_signature': '261248e25b4b645df06cbaccc95229764452dca43bfe3d3791aa590cbebcbe01'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.527033', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52e59756-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.685645307, 'message_signature': '9cc0e8cc9b5a3777e297726a7d34f6f1a93cf1a29b2ed63fe102a8de8861c962'}]}, 'timestamp': '2025-10-14 09:58:24.527434', '_unique_id': 'ca7d9fe1ff1b47d297ee4f2622fc4e0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.527 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.528 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.528 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.528 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 8191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f2af42b-2121-4137-b356-be10653f5bfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 8191, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.528469', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e5c744-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': '77d53884b8696146d9b45a0516bed9c14a51f020427865b0add626f6c9633b6c'}]}, 'timestamp': '2025-10-14 09:58:24.528674', '_unique_id': '279f1a3041c4490e9f161cc60c1074f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.529 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a63614b4-e4bf-494b-bffe-b085db51e15a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.529663', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e5f5c0-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': 'd1af53a18dac9f4ac3ff976b5d990638ce50aa1277dafdf0fe50a1f5dc6b0b5b'}]}, 'timestamp': '2025-10-14 09:58:24.529863', '_unique_id': '1959372f05d746469ca2daef6e1b5715'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.530 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 8191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '531a3d1a-bc3f-4a0a-8c99-edf4293c97d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8191, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.530802', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e62266-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': '9b5e30a2b84799fd8576925a170e0a6e0aa1bb134f25514325f5f1e377627e3b'}]}, 'timestamp': '2025-10-14 09:58:24.531025', '_unique_id': '82f6a9c7182a4a8aa107cbf0ad4655cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.531 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 51.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '922d56cc-a270-4cb0-9b01-23c63d5afa87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.8203125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:58:24.531987', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '52e8c520-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.743050007, 'message_signature': '842fa0b8129a4465e564d3c2193b8f1e43efcad416a3ea98b4a488cf05e581fc'}]}, 'timestamp': '2025-10-14 09:58:24.548279', '_unique_id': '0f435ee439914fe7a1025a1d4a9f0411'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97e2ab29-fad5-4900-bd54-0fcc10a93dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.549236', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e8f270-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': '51e2de4d6d0d0c323d599a5efa5f637325e6089e6564c6d1df417bf8993cfe47'}]}, 'timestamp': '2025-10-14 09:58:24.549439', '_unique_id': 'ad9b2b30e2fb48e49962fd2eb7ac5bfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.550 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1402557-bd87-4563-bd27-7cd1d2754061', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.550515', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e92470-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': '263eddd8a8fa0c02e43e4d9a1cbeb2b5c1376336d88f475cc2b17b74ba7559d9'}]}, 'timestamp': '2025-10-14 09:58:24.550720', '_unique_id': '2564ed45a2444c1ba41a966f4cf6ebea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.551 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 10710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '638ed815-b4c3-433b-a8b8-9f569f78a9c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10710000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T09:58:24.551721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '52e9535a-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.743050007, 'message_signature': 'f93c005c1e2d8a56d2bc16f853e82afc911a97ad1a7a189d89b5627767db1896'}]}, 'timestamp': '2025-10-14 09:58:24.551913', '_unique_id': '8dcd66049ee4430fb78056afa036224d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.552 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd29e3e5-edbe-446b-80fe-da39d1d4c51b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T09:58:24.552882', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '52e9814a-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.711125579, 'message_signature': '89af2fad6b48ace4514d8dd3a13c9cef672f9c6a348a14c74a896fff2ed2b553'}]}, 'timestamp': '2025-10-14 09:58:24.553106', '_unique_id': '276760fef3584bbbae6e7a1c60205478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.554 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.554 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.554 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37fe17e1-6592-4b7c-b0be-019dee85d201', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.554068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52e9af6c-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': 'f1827fe19f8c5c3061284d87528c18e119ba215cfa1b333bb94eab755e6d0e6a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.554068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52e9bf3e-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '3bd6c4c3c954d386bf2f60754193ca5bb61283e76d120920b3cd3b0d161e696f'}]}, 'timestamp': '2025-10-14 09:58:24.554791', '_unique_id': '39d5e76d3b2a4b3fb6d99fb365f348bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.556 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.556 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 739626512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.557 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 60612298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53dafe17-d57a-4194-bb71-06123c197e63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739626512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.556879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52ea2000-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '3c56fdb4c48afc4129759615cc987c07d0f63bb0eace8a3c6475c024585236c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60612298, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.556879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52ea2adc-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': 'd5a6823e0a79d6b2923abf40f97258e4c1e8e550a3627e0b0e381326952a4bc8'}]}, 'timestamp': '2025-10-14 09:58:24.557471', '_unique_id': 'c95d70eda2064a159257a57d3b4e6ff6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.558 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.559 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb4857f1-b70f-4319-8463-5687196f51e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.558873', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52ea6d8a-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.685645307, 'message_signature': '2ba3d02df603bd6a9bc92bb34b9197ae0a56ce705ed5fe3b297ffa5bb15febd7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.558873', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52ea78c0-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.685645307, 'message_signature': '7b7c946ea10cda120ecbfee95a734b0413029f9a42f510a5184fb2f6f07be5de'}]}, 'timestamp': '2025-10-14 09:58:24.559475', '_unique_id': 'bd78b862cb1946098211efce2b6f77b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.560 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 67767064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.561 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 492064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da11d87c-287f-4066-9164-0494536e5015', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67767064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T09:58:24.560854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52eabac4-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': '596cc8deab241350f6602f00b0874d2fe3881bc0ec3ed310e9360d6639b006a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T09:58:24.560854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52eac4e2-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 11924.646096228, 'message_signature': 'f1bf2494d95d66779f651c904deaa9f631fda9b1ecbce7a1f60f19d701053906'}]}, 'timestamp': '2025-10-14 09:58:24.561409', '_unique_id': '1507085a9fba4bed9915696fa673d989'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 09:58:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 09:58:24.562 12 ERROR oslo_messaging.notify.messaging 
Oct 14 09:58:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11443 DF PROTO=TCP SPT=47002 DPT=9102 SEQ=1528705622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9EE6420000000001030307) 
Oct 14 09:58:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:27.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:27.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:27.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:27.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:27.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:27.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:28 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=167.94.146.57 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=51 ID=47372 DF PROTO=TCP SPT=48480 DPT=19885 SEQ=935033332 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8FA76629000000000103030A) 
Oct 14 09:58:29 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=167.94.146.57 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=51 ID=47373 DF PROTO=TCP SPT=48480 DPT=19885 SEQ=935033332 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8FA76A3C000000000103030A) 
Oct 14 09:58:29 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=167.94.146.57 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=51 ID=40890 DF PROTO=TCP SPT=48510 DPT=19885 SEQ=525804276 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8FA76A90000000000103030A) 
Oct 14 09:58:30 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=167.94.146.57 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=51 ID=40891 DF PROTO=TCP SPT=48510 DPT=19885 SEQ=525804276 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8FA76E7C000000000103030A) 
Oct 14 09:58:30 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=167.94.146.57 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=51 ID=51862 DF PROTO=TCP SPT=46924 DPT=19885 SEQ=2579109665 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8FA76E9C000000000103030A) 
Oct 14 09:58:31 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=167.94.146.57 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=51 ID=51863 DF PROTO=TCP SPT=46924 DPT=19885 SEQ=2579109665 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8FA772BD000000000103030A) 
Oct 14 09:58:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:32.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:32.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:32.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:32.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:32.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:32.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:58:35 np0005486759.ooo.test podman[312162]: 2025-10-14 09:58:35.465417205 +0000 UTC m=+0.081153281 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Oct 14 09:58:35 np0005486759.ooo.test podman[312162]: 2025-10-14 09:58:35.477250673 +0000 UTC m=+0.092986739 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm)
Oct 14 09:58:35 np0005486759.ooo.test podman[312153]: 2025-10-14 09:58:35.435351243 +0000 UTC m=+0.066103815 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:58:35 np0005486759.ooo.test podman[312153]: 2025-10-14 09:58:35.515348668 +0000 UTC m=+0.146101280 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:58:35 np0005486759.ooo.test podman[312161]: 2025-10-14 09:58:35.561474967 +0000 UTC m=+0.178635957 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, org.label-schema.vendor=CentOS)
Oct 14 09:58:35 np0005486759.ooo.test podman[312161]: 2025-10-14 09:58:35.574425679 +0000 UTC m=+0.191586699 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:58:35 np0005486759.ooo.test podman[312154]: 2025-10-14 09:58:35.620203547 +0000 UTC m=+0.242154893 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:58:35 np0005486759.ooo.test podman[312154]: 2025-10-14 09:58:35.625993003 +0000 UTC m=+0.247944449 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3)
Oct 14 09:58:35 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:58:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:37.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:37.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:37.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:37.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:37.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:37.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:58:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:58:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:58:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 125198 "" "Go-http-client/1.1"
Oct 14 09:58:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:58:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15245 "" "Go-http-client/1.1"
Oct 14 09:58:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:42.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:43 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:58:43 np0005486759.ooo.test podman[312230]: 2025-10-14 09:58:43.429913596 +0000 UTC m=+0.061365022 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:58:43 np0005486759.ooo.test podman[312230]: 2025-10-14 09:58:43.502259368 +0000 UTC m=+0.133710774 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Oct 14 09:58:43 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:58:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:58:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:58:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:58:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:58:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:58:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:58:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:58:45 np0005486759.ooo.test podman[312255]: 2025-10-14 09:58:45.44740475 +0000 UTC m=+0.074103988 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public)
Oct 14 09:58:45 np0005486759.ooo.test podman[312255]: 2025-10-14 09:58:45.456852566 +0000 UTC m=+0.083551734 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal 
Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Oct 14 09:58:45 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:58:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:47.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13999 DF PROTO=TCP SPT=57830 DPT=9102 SEQ=3649660340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9F3F980000000001030307) 
Oct 14 09:58:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14000 DF PROTO=TCP SPT=57830 DPT=9102 SEQ=3649660340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9F43810000000001030307) 
Oct 14 09:58:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:58:52 np0005486759.ooo.test podman[312276]: 2025-10-14 09:58:52.451541153 +0000 UTC m=+0.076430178 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 14 09:58:52 np0005486759.ooo.test podman[312276]: 2025-10-14 09:58:52.456262246 +0000 UTC m=+0.081151221 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 09:58:52 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:58:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14001 DF PROTO=TCP SPT=57830 DPT=9102 SEQ=3649660340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9F4B820000000001030307) 
Oct 14 09:58:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:52.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:52.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:52.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:52.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:52.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:52.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:58:54.164 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:58:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:58:54.164 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:58:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:58:54.165 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:58:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:58:55 np0005486759.ooo.test podman[312294]: 2025-10-14 09:58:55.424879094 +0000 UTC m=+0.056562135 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:58:55 np0005486759.ooo.test podman[312294]: 2025-10-14 09:58:55.434466306 +0000 UTC m=+0.066149367 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 09:58:55 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:58:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14002 DF PROTO=TCP SPT=57830 DPT=9102 SEQ=3649660340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9F5B420000000001030307) 
Oct 14 09:58:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:57.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:57.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:58:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:57.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5048 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:58:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:57.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:57.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:58:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:58:57.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:58:59 np0005486759.ooo.test sshd[311949]: Received disconnect from 38.102.83.114 port 45090:11: disconnected by user
Oct 14 09:58:59 np0005486759.ooo.test sshd[311949]: Disconnected from user zuul 38.102.83.114 port 45090
Oct 14 09:58:59 np0005486759.ooo.test sshd[311946]: pam_unix(sshd:session): session closed for user zuul
Oct 14 09:58:59 np0005486759.ooo.test systemd[1]: session-45.scope: Deactivated successfully.
Oct 14 09:58:59 np0005486759.ooo.test systemd-logind[759]: Session 45 logged out. Waiting for processes to exit.
Oct 14 09:58:59 np0005486759.ooo.test systemd-logind[759]: Removed session 45.
Oct 14 09:59:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:02.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:02.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:02.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5008 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:02.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:02.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:02.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:59:06 np0005486759.ooo.test podman[312317]: 2025-10-14 09:59:06.469217292 +0000 UTC m=+0.101854220 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 09:59:06 np0005486759.ooo.test podman[312317]: 2025-10-14 09:59:06.476346098 +0000 UTC m=+0.108983046 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:59:06 np0005486759.ooo.test podman[312320]: 2025-10-14 09:59:06.448979479 +0000 UTC m=+0.069872300 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 14 09:59:06 np0005486759.ooo.test podman[312318]: 2025-10-14 09:59:06.429739365 +0000 UTC m=+0.060826155 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:59:06 np0005486759.ooo.test podman[312319]: 2025-10-14 09:59:06.537018487 +0000 UTC m=+0.164179169 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 09:59:06 np0005486759.ooo.test podman[312319]: 2025-10-14 09:59:06.547260368 +0000 UTC m=+0.174421060 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:59:06 np0005486759.ooo.test podman[312318]: 2025-10-14 09:59:06.566249423 +0000 UTC m=+0.197336193 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 09:59:06 np0005486759.ooo.test podman[312320]: 2025-10-14 09:59:06.583463116 +0000 UTC m=+0.204355987 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:59:06 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:59:07 np0005486759.ooo.test systemd[1]: tmp-crun.mBPnNE.mount: Deactivated successfully.
Oct 14 09:59:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:07.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:07.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:07.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:07.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:07.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:07.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:59:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:59:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:59:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 125198 "" "Go-http-client/1.1"
Oct 14 09:59:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:59:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15243 "" "Go-http-client/1.1"
Oct 14 09:59:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:12.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:12.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:12.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:12.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:13.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:13.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:59:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:59:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:59:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:59:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:59:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:59:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:59:14 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:59:14 np0005486759.ooo.test podman[312394]: 2025-10-14 09:59:14.448861206 +0000 UTC m=+0.079003086 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 14 09:59:14 np0005486759.ooo.test podman[312394]: 2025-10-14 09:59:14.489444207 +0000 UTC m=+0.119586127 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 09:59:14 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:59:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:59:16 np0005486759.ooo.test systemd[1]: tmp-crun.hBkxMG.mount: Deactivated successfully.
Oct 14 09:59:16 np0005486759.ooo.test podman[312419]: 2025-10-14 09:59:16.449575881 +0000 UTC m=+0.082235724 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 09:59:16 np0005486759.ooo.test podman[312419]: 2025-10-14 09:59:16.491499842 +0000 UTC m=+0.124159675 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Oct 14 09:59:16 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:59:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:17.549 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:17.550 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:17.576 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:17.576 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 09:59:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:17.576 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.224 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.224 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.225 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.225 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.576 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.594 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.594 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.595 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.595 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.596 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.596 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.596 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.597 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.597 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.597 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.619 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.619 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.619 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.620 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.694 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.774 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.775 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.830 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.832 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.878 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.879 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 09:59:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:18.956 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.143 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.145 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12673MB free_disk=386.7118339538574GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.145 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.145 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.329 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.330 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.330 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.388 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.481 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.483 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 09:59:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:19.484 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:59:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19968 DF PROTO=TCP SPT=35940 DPT=9102 SEQ=577497466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9FB4C80000000001030307) 
Oct 14 09:59:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19969 DF PROTO=TCP SPT=35940 DPT=9102 SEQ=577497466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9FB8C10000000001030307) 
Oct 14 09:59:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19970 DF PROTO=TCP SPT=35940 DPT=9102 SEQ=577497466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9FC0C20000000001030307) 
Oct 14 09:59:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:23.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:23.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:23.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:23.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:23.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:23.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:59:23 np0005486759.ooo.test podman[312451]: 2025-10-14 09:59:23.452765596 +0000 UTC m=+0.081850592 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Oct 14 09:59:23 np0005486759.ooo.test podman[312451]: 2025-10-14 09:59:23.4581504 +0000 UTC m=+0.087235376 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 09:59:23 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:59:26 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:59:26 np0005486759.ooo.test podman[312470]: 2025-10-14 09:59:26.437652749 +0000 UTC m=+0.072477318 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 09:59:26 np0005486759.ooo.test podman[312470]: 2025-10-14 09:59:26.443427014 +0000 UTC m=+0.078251613 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:59:26 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:59:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19971 DF PROTO=TCP SPT=35940 DPT=9102 SEQ=577497466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3F9FD0810000000001030307) 
Oct 14 09:59:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:28.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:28.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:28.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:28.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:28.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:28.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:33.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:36.344 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:59:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:36.345 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 09:59:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:36.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: tmp-crun.1G2RK9.mount: Deactivated successfully.
Oct 14 09:59:37 np0005486759.ooo.test podman[312493]: 2025-10-14 09:59:37.471111198 +0000 UTC m=+0.095642650 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 09:59:37 np0005486759.ooo.test podman[312493]: 2025-10-14 09:59:37.47708914 +0000 UTC m=+0.101620552 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 09:59:37 np0005486759.ooo.test podman[312502]: 2025-10-14 09:59:37.531782768 +0000 UTC m=+0.140229142 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251009)
Oct 14 09:59:37 np0005486759.ooo.test podman[312502]: 2025-10-14 09:59:37.543321508 +0000 UTC m=+0.151767862 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 09:59:37 np0005486759.ooo.test podman[312494]: 2025-10-14 09:59:37.609397051 +0000 UTC m=+0.227158837 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 09:59:37 np0005486759.ooo.test podman[312494]: 2025-10-14 09:59:37.622379824 +0000 UTC m=+0.240141660 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 09:59:37 np0005486759.ooo.test podman[312501]: 2025-10-14 09:59:37.664242764 +0000 UTC m=+0.278331789 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 09:59:37 np0005486759.ooo.test podman[312501]: 2025-10-14 09:59:37.678258678 +0000 UTC m=+0.292347693 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Oct 14 09:59:37 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 09:59:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:38.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:40 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:40.576 287366 INFO oslo.privsep.daemon [None req-a9c15020-6e3d-4992-89a4-1826d7a0e4d0 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpcufgi9qx/privsep.sock']
Oct 14 09:59:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:41.167 287366 INFO oslo.privsep.daemon [None req-a9c15020-6e3d-4992-89a4-1826d7a0e4d0 - - - - - -] Spawned new privsep daemon via rootwrap
Oct 14 09:59:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:41.071 312573 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:59:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:41.074 312573 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:59:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:41.077 312573 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 14 09:59:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:41.077 312573 INFO oslo.privsep.daemon [-] privsep daemon running as pid 312573
Oct 14 09:59:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:41.348 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 09:59:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:41.706 287366 INFO oslo.privsep.daemon [None req-a9c15020-6e3d-4992-89a4-1826d7a0e4d0 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpftm0qdj6/privsep.sock']
Oct 14 09:59:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T09:59:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 09:59:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:59:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 125198 "" "Go-http-client/1.1"
Oct 14 09:59:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:42.314 287366 INFO oslo.privsep.daemon [None req-a9c15020-6e3d-4992-89a4-1826d7a0e4d0 - - - - - -] Spawned new privsep daemon via rootwrap
Oct 14 09:59:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:42.205 312582 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:59:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:42.208 312582 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:59:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:42.210 312582 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 14 09:59:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:42.210 312582 INFO oslo.privsep.daemon [-] privsep daemon running as pid 312582
Oct 14 09:59:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:09:59:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15245 "" "Go-http-client/1.1"
Oct 14 09:59:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:43.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:43.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:43.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:43.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:43.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:43.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:43.310 287366 INFO oslo.privsep.daemon [None req-a9c15020-6e3d-4992-89a4-1826d7a0e4d0 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpvhtyjbw5/privsep.sock']
Oct 14 09:59:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:43.919 287366 INFO oslo.privsep.daemon [None req-a9c15020-6e3d-4992-89a4-1826d7a0e4d0 - - - - - -] Spawned new privsep daemon via rootwrap
Oct 14 09:59:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:43.802 312594 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 09:59:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:43.807 312594 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 09:59:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:43.810 312594 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 14 09:59:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:43.811 312594 INFO oslo.privsep.daemon [-] privsep daemon running as pid 312594
Oct 14 09:59:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:59:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 09:59:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 09:59:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 09:59:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:59:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   09:59:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 09:59:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 09:59:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 09:59:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:45.394 287366 INFO neutron.agent.linux.ip_lib [None req-a9c15020-6e3d-4992-89a4-1826d7a0e4d0 - - - - - -] Device tap23a54c67-67 cannot be used as it has no MAC address
Oct 14 09:59:45 np0005486759.ooo.test podman[312604]: 2025-10-14 09:59:45.460036329 +0000 UTC m=+0.084721009 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 14 09:59:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:45.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:45 np0005486759.ooo.test kernel: device tap23a54c67-67 entered promiscuous mode
Oct 14 09:59:45 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:59:45Z|00050|binding|INFO|Claiming lport 23a54c67-6769-44ab-9ecf-154bedc639fc for this chassis.
Oct 14 09:59:45 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:59:45Z|00051|binding|INFO|23a54c67-6769-44ab-9ecf-154bedc639fc: Claiming unknown
Oct 14 09:59:45 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760435985.4834] manager: (tap23a54c67-67): new Generic device (/org/freedesktop/NetworkManager/Devices/17)
Oct 14 09:59:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:45.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:45 np0005486759.ooo.test systemd-udevd[312628]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 09:59:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:45.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:45 np0005486759.ooo.test podman[312604]: 2025-10-14 09:59:45.516058838 +0000 UTC m=+0.140743588 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 09:59:45 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:59:45Z|00052|binding|INFO|Setting lport 23a54c67-6769-44ab-9ecf-154bedc639fc ovn-installed in OVS
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: hostname: np0005486759.ooo.test
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:45.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T09:59:45Z|00053|binding|INFO|Setting lport 23a54c67-6769-44ab-9ecf-154bedc639fc up in Southbound
Oct 14 09:59:45 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:45.525 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-50f16c59-a767-4425-9416-3dac1282e38f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50f16c59-a767-4425-9416-3dac1282e38f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f515bdf62a094aee9928b50d1bd2b6e0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=25e99509-80c7-498b-9cb4-09904ec15ad0, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=23a54c67-6769-44ab-9ecf-154bedc639fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:45.527 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 23a54c67-6769-44ab-9ecf-154bedc639fc in datapath 50f16c59-a767-4425-9416-3dac1282e38f bound to our chassis
Oct 14 09:59:45 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:45.531 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port a34ab7f7-d329-47a1-b1a1-673f85e4b4b4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:45.532 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50f16c59-a767-4425-9416-3dac1282e38f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 09:59:45 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:45.535 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1af24b-6f32-4817-a53c-c68dc46f16f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap23a54c67-67: No such device
Oct 14 09:59:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:45.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:45.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:45 np0005486759.ooo.test kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 14 09:59:46 np0005486759.ooo.test podman[312706]: 
Oct 14 09:59:46 np0005486759.ooo.test podman[312706]: 2025-10-14 09:59:46.968734749 +0000 UTC m=+0.081232554 container create 7a5277be7d673f693b956c6084115b9ef0db8dc629463992d20549bfbf131112 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f16c59-a767-4425-9416-3dac1282e38f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS)
Oct 14 09:59:46 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 09:59:46 np0005486759.ooo.test systemd[1]: Started libpod-conmon-7a5277be7d673f693b956c6084115b9ef0db8dc629463992d20549bfbf131112.scope.
Oct 14 09:59:47 np0005486759.ooo.test systemd[1]: tmp-crun.JFzoDD.mount: Deactivated successfully.
Oct 14 09:59:47 np0005486759.ooo.test podman[312706]: 2025-10-14 09:59:46.926359924 +0000 UTC m=+0.038857749 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 09:59:47 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 09:59:47 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/372639d3a05837588fb956525f3d670d8d12266ae04eda819bdc1d18b93b67a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 09:59:47 np0005486759.ooo.test podman[312706]: 2025-10-14 09:59:47.052172809 +0000 UTC m=+0.164670614 container init 7a5277be7d673f693b956c6084115b9ef0db8dc629463992d20549bfbf131112 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f16c59-a767-4425-9416-3dac1282e38f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 09:59:47 np0005486759.ooo.test podman[312706]: 2025-10-14 09:59:47.06247477 +0000 UTC m=+0.174972565 container start 7a5277be7d673f693b956c6084115b9ef0db8dc629463992d20549bfbf131112 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f16c59-a767-4425-9416-3dac1282e38f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq[312735]: started, version 2.85 cachesize 150
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq[312735]: DNS service limited to local subnets
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq[312735]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq[312735]: warning: no upstream servers configured
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq-dhcp[312735]: DHCP, static leases only on 192.168.199.0, lease time 1d
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq[312735]: read /var/lib/neutron/dhcp/50f16c59-a767-4425-9416-3dac1282e38f/addn_hosts - 0 addresses
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq-dhcp[312735]: read /var/lib/neutron/dhcp/50f16c59-a767-4425-9416-3dac1282e38f/host
Oct 14 09:59:47 np0005486759.ooo.test dnsmasq-dhcp[312735]: read /var/lib/neutron/dhcp/50f16c59-a767-4425-9416-3dac1282e38f/opts
Oct 14 09:59:47 np0005486759.ooo.test podman[312721]: 2025-10-14 09:59:47.139362222 +0000 UTC m=+0.135275163 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct 14 09:59:47 np0005486759.ooo.test podman[312721]: 2025-10-14 09:59:47.154731497 +0000 UTC m=+0.150644478 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 09:59:47 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 09:59:47 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 09:59:47.296 287366 INFO neutron.agent.dhcp.agent [None req-dfde1b2f-49df-4ca6-adae-b44cb04a121c - - - - - -] DHCP configuration for ports {'0966a35b-b360-4b68-9d41-6abcd74f66fe'} is completed
Oct 14 09:59:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:48.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63229 DF PROTO=TCP SPT=41642 DPT=9102 SEQ=814401225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA029F70000000001030307) 
Oct 14 09:59:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63230 DF PROTO=TCP SPT=41642 DPT=9102 SEQ=814401225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA02E010000000001030307) 
Oct 14 09:59:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63231 DF PROTO=TCP SPT=41642 DPT=9102 SEQ=814401225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA036010000000001030307) 
Oct 14 09:59:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:53.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:53.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:53.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:53.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:53.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:53.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:54.165 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 09:59:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:54.166 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 09:59:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 09:59:54.167 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 09:59:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 09:59:54 np0005486759.ooo.test podman[312747]: 2025-10-14 09:59:54.452121174 +0000 UTC m=+0.080224314 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 09:59:54 np0005486759.ooo.test podman[312747]: 2025-10-14 09:59:54.483275108 +0000 UTC m=+0.111378218 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 14 09:59:54 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 09:59:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63232 DF PROTO=TCP SPT=41642 DPT=9102 SEQ=814401225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA045C10000000001030307) 
Oct 14 09:59:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 09:59:57 np0005486759.ooo.test podman[312765]: 2025-10-14 09:59:57.422636439 +0000 UTC m=+0.055592025 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 09:59:57 np0005486759.ooo.test podman[312765]: 2025-10-14 09:59:57.454767054 +0000 UTC m=+0.087722620 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 09:59:57 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 09:59:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:58.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:58.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 09:59:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:58.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 09:59:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:58.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 09:59:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:58.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 09:59:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 09:59:58.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:03.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:08.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:00:08 np0005486759.ooo.test podman[312791]: 2025-10-14 10:00:08.528085931 +0000 UTC m=+0.153597798 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:00:08 np0005486759.ooo.test podman[312791]: 2025-10-14 10:00:08.53830743 +0000 UTC m=+0.163819327 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:00:08 np0005486759.ooo.test podman[312790]: 2025-10-14 10:00:08.438398782 +0000 UTC m=+0.071199500 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:00:08 np0005486759.ooo.test podman[312790]: 2025-10-14 10:00:08.58939125 +0000 UTC m=+0.222191978 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:00:08 np0005486759.ooo.test podman[312796]: 2025-10-14 10:00:08.509188048 +0000 UTC m=+0.131979142 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Oct 14 10:00:08 np0005486759.ooo.test podman[312796]: 2025-10-14 10:00:08.642433897 +0000 UTC m=+0.265224981 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:00:08 np0005486759.ooo.test podman[312798]: 2025-10-14 10:00:08.695083963 +0000 UTC m=+0.313363411 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Oct 14 10:00:08 np0005486759.ooo.test podman[312798]: 2025-10-14 10:00:08.731047914 +0000 UTC m=+0.349327382 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:00:08 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:00:09 np0005486759.ooo.test systemd[1]: tmp-crun.4hbkBR.mount: Deactivated successfully.
Oct 14 10:00:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:11.192 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:11.192 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:11.193 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 14 10:00:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:11.209 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 14 10:00:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:11.209 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:11.209 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 14 10:00:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:11.221 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:12.227 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:00:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:00:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:00:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 127024 "" "Go-http-client/1.1"
Oct 14 10:00:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:00:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15718 "" "Go-http-client/1.1"
Oct 14 10:00:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:13.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:13.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:13.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:13.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:13.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:13.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:13.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:00:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:00:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:00:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:00:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:00:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:00:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:00:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:14.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:14.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.187 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.189 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.227 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.228 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.229 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.229 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.341 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.413 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.415 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.469 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.472 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:00:15 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:00:15Z|00054|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.533 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.535 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.578 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.798 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.800 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12388MB free_disk=386.71166229248047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.801 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:00:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:15.801 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:00:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:00:16 np0005486759.ooo.test podman[312883]: 2025-10-14 10:00:16.454638809 +0000 UTC m=+0.078602934 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:00:16 np0005486759.ooo.test podman[312883]: 2025-10-14 10:00:16.491048923 +0000 UTC m=+0.115013008 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:00:16 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:00:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:16.519 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:00:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:16.520 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:00:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:16.520 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.151 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing inventories for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 14 10:00:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:00:17 np0005486759.ooo.test podman[312908]: 2025-10-14 10:00:17.444043104 +0000 UTC m=+0.075504930 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=)
Oct 14 10:00:17 np0005486759.ooo.test systemd[1]: tmp-crun.tVVlmy.mount: Deactivated successfully.
Oct 14 10:00:17 np0005486759.ooo.test podman[312908]: 2025-10-14 10:00:17.477731395 +0000 UTC m=+0.109193201 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Oct 14 10:00:17 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.602 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Updating ProviderTree inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.603 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.654 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing aggregate associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.682 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing trait associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.734 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.762 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.764 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:00:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:17.764 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.765 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.765 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:00:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:18.765 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:00:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:19.292 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:00:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:19.292 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:00:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:19.292 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:00:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:19.293 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:00:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=482 DF PROTO=TCP SPT=34048 DPT=9102 SEQ=971701511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA09F280000000001030307) 
Oct 14 10:00:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:19.795 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:00:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:19.813 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:00:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:19.814 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:00:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=483 DF PROTO=TCP SPT=34048 DPT=9102 SEQ=971701511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA0A3410000000001030307) 
Oct 14 10:00:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=484 DF PROTO=TCP SPT=34048 DPT=9102 SEQ=971701511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA0AB420000000001030307) 
Oct 14 10:00:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:23.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:23.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:23.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:23.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:23.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:23.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.450 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.451 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.479 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 67767064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.479 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 492064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5f6ab94-d65c-454f-9000-b3b30b5779cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67767064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.451587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a64ddb2-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': 'f4fa5800847313defcd9aa94793f8e0ee71d30a8b6c9c65ea495c629bde8657e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.451587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a64f306-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': 'eace33e9a7306580553c9ef1383702f7cb26654f8526aaf4686863ed611e7dbc'}]}, 'timestamp': '2025-10-14 10:00:24.480456', '_unique_id': '857fd1c9c6104d3abd0e0946365d3d62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.482 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.483 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.487 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01ec66bf-ca84-4185-8641-f66383add073', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.483561', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a661e5c-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': 'd728f40c05e16a412d07a0a1cd2aa4baf43d5e6c038e4c0cc0fbaf3390df4ffb'}]}, 'timestamp': '2025-10-14 10:00:24.488197', '_unique_id': '1527670183b04673b15c1a024005b6a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.490 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.490 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '611325ad-ddd7-4617-86e5-e9aa979cd2a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.490516', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a668d92-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': 'aa7e6107792001ae538866d0c7443de05cb56272a88b20f2820c025f20930a3b'}]}, 'timestamp': '2025-10-14 10:00:24.491014', '_unique_id': '936d93d68785453e87fde545de79510b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.491 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.493 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e37283a1-8406-49f6-997c-5d25847b67ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.493198', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a66f64c-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': '4933bf417da9cff422e1db01df95abf697204013750264ff551538dcd9822633'}]}, 'timestamp': '2025-10-14 10:00:24.493658', '_unique_id': '4b4cbc6f5239476d8b2ad412a57d9c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.494 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.495 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.496 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f989ab32-1f0e-4956-bb2b-1aba3c93f61d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.495756', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a675b14-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '2b58bbb64f8dcfc0e79e101ff36b2e9ad3d704264b408a72dca8213cbbd54ff8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.495756', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a676b54-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '58119608c6c5874e3982b295f3918ca19c4aebf9e7337feed4c180dccd4671fb'}]}, 'timestamp': '2025-10-14 10:00:24.496626', '_unique_id': 'b0a6f9068ea24c01ad87204c54eb9af3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.497 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.498 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.515 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 11300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1030fb4a-9c66-43bb-92a1-ed582912817a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11300000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:00:24.498810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9a6a68e0-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.710652375, 'message_signature': '17930901301c3a3096974e66ace8258de38a07058216eeb329113de89a73438c'}]}, 'timestamp': '2025-10-14 10:00:24.516267', '_unique_id': '1f582c98ea5b431d8b21c572fba958cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.517 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.518 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.518 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 51.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6675c2fb-067d-4562-ba8f-908b192c8766', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.8203125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:00:24.518631', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9a6ad7bc-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.710652375, 'message_signature': '9cd305c42983a0ff84c6583bff26c169971a09056b1b61bf3f0ac426a9f65c08'}]}, 'timestamp': '2025-10-14 10:00:24.519106', '_unique_id': '5e328fb73d8247978bf35b559a336ecd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.520 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.521 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.521 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.521 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33361396-3bf3-44d5-98a7-83bf49cff20f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.521421', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a6b44c2-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '611aad2e9ac794fe468ebb908e44ec2efcf80d4f1b809cffed1f54cbb0ede162'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.521421', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a6b56f6-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '7d8da26a515b0a7d15c3b9e7f03ebe83971f02440262122ccc7d0e1a1ec49f36'}]}, 'timestamp': '2025-10-14 10:00:24.522321', '_unique_id': 'acc35759f257457fb8afb45327cd6ad8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.524 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.524 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '350e3c77-91a3-486c-baf8-42c922e23e73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.524764', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a6bcb22-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': 'e15b0505042b062c831e7494e3ac4442f752f7ccb4ee4e5da3b9463656e63e74'}]}, 'timestamp': '2025-10-14 10:00:24.525318', '_unique_id': 'ea61237506264d739aa07bd878485767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8536bb2b-1f16-4080-adf5-22469282c5d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.527452', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a6c3030-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': '530e02806166352238b34f427ac8516740d43f9b876301494a5bc19a54cc1f13'}]}, 'timestamp': '2025-10-14 10:00:24.527905', '_unique_id': '9a35a9e81e604bfdb55c91e75751e616'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.528 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.530 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.530 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63f52498-a23b-48a5-8a58-5876f2274730', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.530109', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a6c9822-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '917900aed9749bb6c67ec9f8d2b742c79a21f59ad26901e73443bad973402101'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.530109', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a6ca826-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': 'e57df00f9b5f98fd946d40af022907b4c3da36676365a72df47b64d70e8c2946'}]}, 'timestamp': '2025-10-14 10:00:24.530950', '_unique_id': 'c3ff35677bba4cf8b72d829e1a8f3bfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.531 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.546 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.546 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88eb2a27-3486-4f5f-b35c-6bcb02931bf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.533405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a6f0e36-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.728494606, 'message_signature': '7bff251de6382890cbd2dc335d018cba059f2f8909989141da02620fe1ac6cb6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.533405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a6f1e80-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.728494606, 'message_signature': '0ce1f4b951d5e7e335bb15a2439f1a2d3355ff8443890ed923f32cd78368a9ea'}]}, 'timestamp': '2025-10-14 10:00:24.547114', '_unique_id': '28154ab1a45143eca1dac469fed69dd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.549 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.549 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30466bc1-5d9a-462a-86f8-7c890b77f53c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.549365', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a6f88ca-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': '164fdd9691e0800bee56256a2967307117d58a6161a3477331c9831e5dad6cdd'}]}, 'timestamp': '2025-10-14 10:00:24.549834', '_unique_id': 'ea7ce1866aa7429da9eff870b278b1c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.550 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.551 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.552 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.552 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.552 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e06f694b-763f-48c8-a25b-43985ba7805e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.552362', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a6ffd78-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': 'aa8ba3827d79b1243afa4928b2e957f118406f5df64142d4ffcdc752658acebe'}]}, 'timestamp': '2025-10-14 10:00:24.552822', '_unique_id': 'cdccc966cb774aa482a1ea56530b8c62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.553 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.555 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.555 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.555 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efdc17a7-64a6-4fb7-9149-1043b71aaa1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.555220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a706c90-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.728494606, 'message_signature': '6f11fad19da5c62ad288e25a5be9a911ffb0d9030fbcd2ac6b00f3ee3ee3dece'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.555220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a707c8a-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.728494606, 'message_signature': '05aed8be453d5ddca454aa00fbd741c2342ad0e7e3c563868ae4707df543ab19'}]}, 'timestamp': '2025-10-14 10:00:24.556076', '_unique_id': 'f30755e6861f4a79b77717e9dd092025'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.558 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.558 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.558 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc4747bc-0435-4fd4-b482-6fd9b944884f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.558275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a70e418-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': 'a7ebbf069e74c5b1bd38b2e1694994a5f307bfd996a01fc923e8b161a38acb31'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.558275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a70f3e0-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '433e8ebb9e0da8a70382136fd0729cd13b12ff40730d3fe0bca5b855cdd0ace8'}]}, 'timestamp': '2025-10-14 10:00:24.559156', '_unique_id': 'a482de5ab5b849df9ea27f6945e5c3fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.561 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.561 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31326208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.561 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8830734a-f194-459b-b486-8d67ac328c27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31326208, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.561347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a715c68-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.728494606, 'message_signature': '51b89311c4a7f8c3920be30348f204a4709c2dde69fb5c319a8e199f2984eb46'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.561347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a717518-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.728494606, 'message_signature': '386657e1256b8d727feae99ba8069b589c11a4d67b135405fe322bce5cc03367'}]}, 'timestamp': '2025-10-14 10:00:24.562357', '_unique_id': '68dd97a903a34a2bbecd70d54c573458'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.563 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 739626512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 60612298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1645d94-2fe1-45c2-8c9c-2bd4da780bd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739626512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:00:24.563789', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a71b8ac-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '52730febd51f223e076dd4f99570388d7414fdb1e94c03b071ffe7709dc58f4b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60612298, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:00:24.563789', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a71c482-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.646671945, 'message_signature': '43aa3b753984b6090caf33e64e06d5497c20b07bc81bcc3f5cb12b3e08390412'}]}, 'timestamp': '2025-10-14 10:00:24.564367', '_unique_id': '763a4ea2264e40cd9cb039266dc776a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.564 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.565 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aae8e5df-cfd4-46e8-9995-f57696bae607', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.565794', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a720726-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': '07334e426935b5d7ce116261907d0963ce44a1910947d64b605d3fba933a2e87'}]}, 'timestamp': '2025-10-14 10:00:24.566120', '_unique_id': '40bcd00c9b154fe3852112929acd83ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.567 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.567 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 8191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2ad3a84-bb35-4398-ad38-2cb75b7bdde8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8191, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.567506', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a7249fc-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': '66cf903813e59a619476bbe73a10b7c6b20bbf3c302eb1f17b42738af28b5a10'}]}, 'timestamp': '2025-10-14 10:00:24.567811', '_unique_id': 'b749607a84754238a44ff07b7a71f977'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.569 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.569 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f207ae1-ddcd-4031-9530-04c2cb08493e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:00:24.569251', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '9a728e08-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12044.678682175, 'message_signature': '9f4d1d6045ffcaa5c4eabcfca9d22547972c9f188116cb39012cbc9b5d8ac7ed'}]}, 'timestamp': '2025-10-14 10:00:24.569548', '_unique_id': '1aebe605eb06431c88a3eba8b0c27669'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:00:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:00:24.570 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:00:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:00:25 np0005486759.ooo.test podman[312928]: 2025-10-14 10:00:25.453857916 +0000 UTC m=+0.078927515 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 10:00:25 np0005486759.ooo.test podman[312928]: 2025-10-14 10:00:25.488431534 +0000 UTC m=+0.113501163 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:00:25 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:00:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=485 DF PROTO=TCP SPT=34048 DPT=9102 SEQ=971701511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA0BB010000000001030307) 
Oct 14 10:00:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:00:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:28.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:28.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:28.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:28.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:28.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:28.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:28 np0005486759.ooo.test podman[312946]: 2025-10-14 10:00:28.48699537 +0000 UTC m=+0.072913622 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:00:28 np0005486759.ooo.test podman[312946]: 2025-10-14 10:00:28.522536567 +0000 UTC m=+0.108454789 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:00:28 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:00:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:33.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:00:36.455 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:00:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:36.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:00:36.458 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:00:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:38.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:00:39 np0005486759.ooo.test podman[312970]: 2025-10-14 10:00:39.485759016 +0000 UTC m=+0.095439764 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: tmp-crun.zHRihD.mount: Deactivated successfully.
Oct 14 10:00:39 np0005486759.ooo.test podman[312969]: 2025-10-14 10:00:39.463519043 +0000 UTC m=+0.080362278 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 14 10:00:39 np0005486759.ooo.test podman[312968]: 2025-10-14 10:00:39.53043281 +0000 UTC m=+0.148888844 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:00:39 np0005486759.ooo.test podman[312968]: 2025-10-14 10:00:39.536521666 +0000 UTC m=+0.154977730 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:00:39 np0005486759.ooo.test podman[312969]: 2025-10-14 10:00:39.542414484 +0000 UTC m=+0.159257669 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:00:39 np0005486759.ooo.test podman[312970]: 2025-10-14 10:00:39.552888391 +0000 UTC m=+0.162569179 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, org.label-schema.build-date=20251009)
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:00:39 np0005486759.ooo.test podman[312981]: 2025-10-14 10:00:39.643723086 +0000 UTC m=+0.247103043 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Oct 14 10:00:39 np0005486759.ooo.test podman[312981]: 2025-10-14 10:00:39.65839462 +0000 UTC m=+0.261774617 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 10:00:39 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:00:40 np0005486759.ooo.test systemd[1]: tmp-crun.9NMCgY.mount: Deactivated successfully.
Oct 14 10:00:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:00:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:00:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:00:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 127024 "" "Go-http-client/1.1"
Oct 14 10:00:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:00:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15720 "" "Go-http-client/1.1"
Oct 14 10:00:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:43.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:43.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:43.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:43.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:43.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:43.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:00:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:00:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:00:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:00:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:00:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:00:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:00:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:00:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:00:44.460 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:00:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:00:47 np0005486759.ooo.test podman[313046]: 2025-10-14 10:00:47.443022557 +0000 UTC m=+0.068636612 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:00:47 np0005486759.ooo.test podman[313046]: 2025-10-14 10:00:47.482422902 +0000 UTC m=+0.108036927 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 10:00:47 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:00:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:00:47 np0005486759.ooo.test podman[313071]: 2025-10-14 10:00:47.600768089 +0000 UTC m=+0.078050437 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct 14 10:00:47 np0005486759.ooo.test podman[313071]: 2025-10-14 10:00:47.615343671 +0000 UTC m=+0.092626069 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Oct 14 10:00:47 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:00:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:48.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:48.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:48.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:48.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:48.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:48.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24297 DF PROTO=TCP SPT=47862 DPT=9102 SEQ=2223107533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA114580000000001030307) 
Oct 14 10:00:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24298 DF PROTO=TCP SPT=47862 DPT=9102 SEQ=2223107533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA118410000000001030307) 
Oct 14 10:00:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24299 DF PROTO=TCP SPT=47862 DPT=9102 SEQ=2223107533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA120410000000001030307) 
Oct 14 10:00:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:53.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:53.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:53.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:53.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:53.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:53.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:00:54.167 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:00:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:00:54.167 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:00:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:00:54.168 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:00:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:00:56 np0005486759.ooo.test podman[313091]: 2025-10-14 10:00:56.463310308 +0000 UTC m=+0.087700719 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:00:56 np0005486759.ooo.test podman[313091]: 2025-10-14 10:00:56.494153543 +0000 UTC m=+0.118543924 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 14 10:00:56 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:00:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24300 DF PROTO=TCP SPT=47862 DPT=9102 SEQ=2223107533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA130010000000001030307) 
Oct 14 10:00:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:58.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:58.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:00:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:58.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:00:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:58.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:58.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:00:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:58.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:00:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:00:59 np0005486759.ooo.test podman[313109]: 2025-10-14 10:00:59.44818858 +0000 UTC m=+0.069884749 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 10:00:59 np0005486759.ooo.test podman[313109]: 2025-10-14 10:00:59.48445081 +0000 UTC m=+0.106147009 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:00:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:59.483 2 DEBUG oslo_concurrency.processutils [None req-786e53d2-68ba-4d61-8ec2-1e4d76e31325 47f47db377ab430392f8911c8c093a60 72769ccafab2440d89c54473f5b2299b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:00:59 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:00:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:00:59.511 2 DEBUG oslo_concurrency.processutils [None req-786e53d2-68ba-4d61-8ec2-1e4d76e31325 47f47db377ab430392f8911c8c093a60 72769ccafab2440d89c54473f5b2299b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:01:01 np0005486759.ooo.test CROND[313135]: (root) CMD (run-parts /etc/cron.hourly)
Oct 14 10:01:01 np0005486759.ooo.test run-parts[313138]: (/etc/cron.hourly) starting 0anacron
Oct 14 10:01:01 np0005486759.ooo.test run-parts[313144]: (/etc/cron.hourly) finished 0anacron
Oct 14 10:01:01 np0005486759.ooo.test CROND[313134]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 14 10:01:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:03.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:01:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:03.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:03.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:01:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:03.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:01:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:03.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:01:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:03.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:08.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:01:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:08.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:08.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 14 10:01:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:08.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:01:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:08.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 14 10:01:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:08.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:01:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:01:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:01:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:01:09 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:09.870 287366 INFO neutron.agent.linux.ip_lib [None req-76ff7af7-440e-4bcb-9d8a-7812b915be80 - - - - - -] Device tap3ac5f212-9e cannot be used as it has no MAC address
Oct 14 10:01:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:09.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:09 np0005486759.ooo.test kernel: device tap3ac5f212-9e entered promiscuous mode
Oct 14 10:01:09 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:09Z|00055|binding|INFO|Claiming lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a for this chassis.
Oct 14 10:01:09 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:09Z|00056|binding|INFO|3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a: Claiming unknown
Oct 14 10:01:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:09.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:09 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436069.9062] manager: (tap3ac5f212-9e): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct 14 10:01:09 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:09Z|00057|binding|INFO|Setting lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a ovn-installed in OVS
Oct 14 10:01:09 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:09Z|00058|binding|INFO|Setting lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a up in Southbound
Oct 14 10:01:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:09.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:09.909 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-a6b02595-ce43-43c7-aca8-531937571464', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6b02595-ce43-43c7-aca8-531937571464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51305e49-a7d0-486f-a42f-28329bac1f69, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:01:09 np0005486759.ooo.test podman[313148]: 2025-10-14 10:01:09.913496398 +0000 UTC m=+0.098611140 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:01:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:09.913 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a in datapath a6b02595-ce43-43c7-aca8-531937571464 bound to our chassis
Oct 14 10:01:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:09.916 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port aa18e873-000f-4f86-969f-374cd84a0bc2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:01:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:09.916 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6b02595-ce43-43c7-aca8-531937571464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:01:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:09.920 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[802fa47f-7826-4967-8be3-82d2d1eb1bff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:01:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:09.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:09 np0005486759.ooo.test podman[313157]: 2025-10-14 10:01:09.953055227 +0000 UTC m=+0.127014611 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3)
Oct 14 10:01:09 np0005486759.ooo.test podman[313150]: 2025-10-14 10:01:09.972710223 +0000 UTC m=+0.149640987 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3)
Oct 14 10:01:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:09.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:10 np0005486759.ooo.test podman[313148]: 2025-10-14 10:01:10.000501586 +0000 UTC m=+0.185616298 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:01:10 np0005486759.ooo.test podman[313150]: 2025-10-14 10:01:10.010412797 +0000 UTC m=+0.187343561 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 10:01:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:10.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:10 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:01:10 np0005486759.ooo.test podman[313159]: 2025-10-14 10:01:10.0161464 +0000 UTC m=+0.181883025 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 10:01:10 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:01:10 np0005486759.ooo.test podman[313157]: 2025-10-14 10:01:10.038872989 +0000 UTC m=+0.212832363 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:01:10 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:01:10 np0005486759.ooo.test podman[313159]: 2025-10-14 10:01:10.055592326 +0000 UTC m=+0.221328931 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true)
Oct 14 10:01:10 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:01:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:10.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:10 np0005486759.ooo.test podman[313292]: 
Oct 14 10:01:10 np0005486759.ooo.test podman[313292]: 2025-10-14 10:01:10.864477859 +0000 UTC m=+0.087597217 container create e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:01:10 np0005486759.ooo.test systemd[1]: Started libpod-conmon-e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617.scope.
Oct 14 10:01:10 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:01:10 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e52fe15b6a921871dc2ec02d06dadbf52a1fbb50c7c6460b53149eb2ba6a139/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:01:10 np0005486759.ooo.test podman[313292]: 2025-10-14 10:01:10.824193307 +0000 UTC m=+0.047312725 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:01:10 np0005486759.ooo.test podman[313292]: 2025-10-14 10:01:10.937097391 +0000 UTC m=+0.160216799 container init e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:01:10 np0005486759.ooo.test podman[313292]: 2025-10-14 10:01:10.946825355 +0000 UTC m=+0.169944743 container start e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq[313311]: started, version 2.85 cachesize 150
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq[313311]: DNS service limited to local subnets
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq[313311]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq[313311]: warning: no upstream servers configured
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq-dhcp[313311]: DHCP, static leases only on 192.168.122.0, lease time 1d
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:10 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:11 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:11.007 287366 INFO neutron.agent.dhcp.agent [None req-f37a69f2-b590-4ed1-9336-c6b6c89228e2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:08Z, description=, device_id=188b8064-86cb-4f22-a10d-31fb9fcda59a, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f2100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f2520>], id=90fcef7f-673c-43e1-bc03-f3e3a72a5108, ip_allocation=immediate, mac_address=fa:16:3e:1a:ad:82, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=269, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:08Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:11 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:01:11 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:11 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:11 np0005486759.ooo.test podman[313329]: 2025-10-14 10:01:11.168524507 +0000 UTC m=+0.042145339 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:01:11 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:11.428 287366 INFO neutron.agent.dhcp.agent [None req-d746a627-68a5-46fb-8288-5ee329bf72cd - - - - - -] DHCP configuration for ports {'b60156a8-258f-4e97-bbcb-db7630f22c8b', '90fcef7f-673c-43e1-bc03-f3e3a72a5108', '20e0600d-00d1-463c-a458-e3c8e895ea5f', '0e7d62dc-0cdc-4b6f-b4eb-7db2250fa9f3'} is completed
Oct 14 10:01:11 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:11.608 287366 INFO neutron.agent.dhcp.agent [None req-43594a49-b133-47b5-b32d-fe43fda5dced - - - - - -] DHCP configuration for ports {'90fcef7f-673c-43e1-bc03-f3e3a72a5108'} is completed
Oct 14 10:01:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:12.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:01:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:01:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:01:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 128850 "" "Go-http-client/1.1"
Oct 14 10:01:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:01:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16203 "" "Go-http-client/1.1"
Oct 14 10:01:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:12.366 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:12Z, description=, device_id=baa03708-d09b-4a0a-b4df-83a4fe3da122, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7a8d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7a86d0>], id=554eef83-508b-48cf-9925-82ed6a16ca2f, ip_allocation=immediate, mac_address=fa:16:3e:ef:f3:26, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=310, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:12Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:12 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:01:12 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:12 np0005486759.ooo.test podman[313365]: 2025-10-14 10:01:12.571990915 +0000 UTC m=+0.053743990 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:01:12 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:12.774 287366 INFO neutron.agent.dhcp.agent [None req-0f4a36d1-fe5e-4c9c-b9db-c8ed3bbf5ea5 - - - - - -] DHCP configuration for ports {'554eef83-508b-48cf-9925-82ed6a16ca2f'} is completed
Oct 14 10:01:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:13.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:13.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:01:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:01:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:01:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:01:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:01:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:01:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:01:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:14.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.185 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.188 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.189 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.205 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.206 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.207 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.207 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.262 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.340 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.341 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.401 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.402 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.459 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.461 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.506 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.730 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.732 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12382MB free_disk=386.7115669250488GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.732 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.732 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.810 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.811 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.811 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.851 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.865 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.867 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:01:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:15.867 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:01:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:16.867 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:16.868 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:16.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:17.186 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:17.205 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:01:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:17.205 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:01:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:17.206 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:01:18 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:18.120 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:17Z, description=, device_id=04676fff-820a-495d-aeed-3df998072404, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7b9490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7b92b0>], id=4ab940b4-c018-4a50-9182-70b9a0b83ab2, ip_allocation=immediate, mac_address=fa:16:3e:d8:cd:0a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=358, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:17Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:01:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:01:18 np0005486759.ooo.test systemd[1]: tmp-crun.j3aHvb.mount: Deactivated successfully.
Oct 14 10:01:18 np0005486759.ooo.test podman[313403]: 2025-10-14 10:01:18.454116903 +0000 UTC m=+0.080949635 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 10:01:18 np0005486759.ooo.test podman[313403]: 2025-10-14 10:01:18.52359318 +0000 UTC m=+0.150425812 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251009)
Oct 14 10:01:18 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:01:18 np0005486759.ooo.test systemd[1]: tmp-crun.2iUbdX.mount: Deactivated successfully.
Oct 14 10:01:18 np0005486759.ooo.test podman[313404]: 2025-10-14 10:01:18.596638084 +0000 UTC m=+0.214568996 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 14 10:01:18 np0005486759.ooo.test podman[313404]: 2025-10-14 10:01:18.604229374 +0000 UTC m=+0.222160286 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7)
Oct 14 10:01:18 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:01:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:18.639 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:01:18 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 4 addresses
Oct 14 10:01:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:18.639 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:01:18 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:18.640 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:01:18 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:18.640 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:01:18 np0005486759.ooo.test podman[313447]: 2025-10-14 10:01:18.642822874 +0000 UTC m=+0.109087968 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:01:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:18.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:18 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:18.839 287366 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpolqqru7v/privsep.sock']
Oct 14 10:01:18 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:18.855 287366 INFO neutron.agent.dhcp.agent [None req-68508723-642b-47ce-8418-e82f5480d0fc - - - - - -] DHCP configuration for ports {'4ab940b4-c018-4a50-9182-70b9a0b83ab2'} is completed
Oct 14 10:01:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:19Z|00059|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:01:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:19.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:19.270 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:01:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:19.296 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:01:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:19.297 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:01:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:19.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:19 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:19.475 287366 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 14 10:01:19 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:19.382 313483 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 10:01:19 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:19.385 313483 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 10:01:19 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:19.387 313483 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 14 10:01:19 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:19.387 313483 INFO oslo.privsep.daemon [-] privsep daemon running as pid 313483
Oct 14 10:01:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4 DF PROTO=TCP SPT=56642 DPT=9102 SEQ=2402070829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA189880000000001030307) 
Oct 14 10:01:19 np0005486759.ooo.test dnsmasq-dhcp[313311]: DHCPRELEASE(tap3ac5f212-9e) 192.168.122.218 fa:16:3e:1a:ad:82
Oct 14 10:01:20 np0005486759.ooo.test podman[313505]: 2025-10-14 10:01:20.205871541 +0000 UTC m=+0.052637477 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:01:20 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:01:20 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:20 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:20 np0005486759.ooo.test systemd[1]: tmp-crun.8N4wfA.mount: Deactivated successfully.
Oct 14 10:01:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5 DF PROTO=TCP SPT=56642 DPT=9102 SEQ=2402070829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA18D810000000001030307) 
Oct 14 10:01:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6 DF PROTO=TCP SPT=56642 DPT=9102 SEQ=2402070829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA195810000000001030307) 
Oct 14 10:01:22 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:22.699 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:22Z, description=, device_id=f316ceef-29c6-4078-af01-ce504caa4a05, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d5ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7aef40>], id=ba781a82-10cd-419f-8352-93103be1dd99, ip_allocation=immediate, mac_address=fa:16:3e:c1:96:bd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=385, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:22Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:22 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 4 addresses
Oct 14 10:01:22 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:22 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:22 np0005486759.ooo.test podman[313543]: 2025-10-14 10:01:22.936514646 +0000 UTC m=+0.051652467 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:01:22 np0005486759.ooo.test systemd[1]: tmp-crun.vp5ddt.mount: Deactivated successfully.
Oct 14 10:01:23 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:23.132 287366 INFO neutron.agent.dhcp.agent [None req-155bd5ed-05b2-40ba-b375-701ed82c004b - - - - - -] DHCP configuration for ports {'ba781a82-10cd-419f-8352-93103be1dd99'} is completed
Oct 14 10:01:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:23.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:24 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:24.327 287366 INFO neutron.agent.linux.ip_lib [None req-82a61f83-9e34-4334-a107-41af8307a891 - - - - - -] Device tap768c8abc-dd cannot be used as it has no MAC address
Oct 14 10:01:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:24.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:24 np0005486759.ooo.test kernel: device tap768c8abc-dd entered promiscuous mode
Oct 14 10:01:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:24.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:24 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436084.3584] manager: (tap768c8abc-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Oct 14 10:01:24 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:24Z|00060|binding|INFO|Claiming lport 768c8abc-dd70-4c99-aa8b-a56fb54ba2e7 for this chassis.
Oct 14 10:01:24 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:24Z|00061|binding|INFO|768c8abc-dd70-4c99-aa8b-a56fb54ba2e7: Claiming unknown
Oct 14 10:01:24 np0005486759.ooo.test systemd-udevd[313575]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:01:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:24.372 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-09e16ed7-f945-41a6-8272-927947e04a34', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e16ed7-f945-41a6-8272-927947e04a34', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d76666f6fdc42e2bc8cc94f577392fc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5707afc1-8cb1-498a-ae11-72c1640f3fbb, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=768c8abc-dd70-4c99-aa8b-a56fb54ba2e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:01:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:24.374 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 768c8abc-dd70-4c99-aa8b-a56fb54ba2e7 in datapath 09e16ed7-f945-41a6-8272-927947e04a34 bound to our chassis
Oct 14 10:01:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:24.377 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 09e16ed7-f945-41a6-8272-927947e04a34 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:01:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:24.378 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[25a9b683-dea2-4859-86ff-3c6c95041dd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:24Z|00062|binding|INFO|Setting lport 768c8abc-dd70-4c99-aa8b-a56fb54ba2e7 ovn-installed in OVS
Oct 14 10:01:24 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:24Z|00063|binding|INFO|Setting lport 768c8abc-dd70-4c99-aa8b-a56fb54ba2e7 up in Southbound
Oct 14 10:01:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:24.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap768c8abc-dd: No such device
Oct 14 10:01:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:24.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:24.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:24.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:25.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:25 np0005486759.ooo.test podman[313646]: 
Oct 14 10:01:25 np0005486759.ooo.test podman[313646]: 2025-10-14 10:01:25.261021888 +0000 UTC m=+0.083157273 container create 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:01:25 np0005486759.ooo.test systemd[1]: Started libpod-conmon-318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4.scope.
Oct 14 10:01:25 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:01:25 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1553636b6c66d7c9bcf4d828e312c70ff1c925d001df03c50b938a645a3188e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:01:25 np0005486759.ooo.test podman[313646]: 2025-10-14 10:01:25.322642886 +0000 UTC m=+0.144778281 container init 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:01:25 np0005486759.ooo.test podman[313646]: 2025-10-14 10:01:25.224721157 +0000 UTC m=+0.046856652 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:01:25 np0005486759.ooo.test podman[313646]: 2025-10-14 10:01:25.334615199 +0000 UTC m=+0.156750584 container start 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq[313665]: started, version 2.85 cachesize 150
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq[313665]: DNS service limited to local subnets
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq[313665]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq[313665]: warning: no upstream servers configured
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq-dhcp[313665]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 0 addresses
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:01:25 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:01:25 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:25.495 287366 INFO neutron.agent.dhcp.agent [None req-66a2b021-5117-4d5f-8ce9-77e962736488 - - - - - -] DHCP configuration for ports {'342cf9cd-72d7-444c-a483-046dfbb42d73'} is completed
Oct 14 10:01:26 np0005486759.ooo.test systemd[1]: tmp-crun.798KdL.mount: Deactivated successfully.
Oct 14 10:01:26 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:26.291 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:26Z, description=, device_id=32e4d5dd-06db-4897-bc8d-076756fa3e06, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec85c970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec87f070>], id=e54a55af-e0ef-444c-b7ef-ddc25d7bc13d, ip_allocation=immediate, mac_address=fa:16:3e:65:c4:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=410, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:26Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:26 np0005486759.ooo.test podman[313683]: 2025-10-14 10:01:26.451623613 +0000 UTC m=+0.037898550 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:01:26 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 5 addresses
Oct 14 10:01:26 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:26 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7 DF PROTO=TCP SPT=56642 DPT=9102 SEQ=2402070829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA1A5410000000001030307) 
Oct 14 10:01:26 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:26.654 287366 INFO neutron.agent.dhcp.agent [None req-5afcb4e8-7acf-43ef-bae3-2c710170aa9c - - - - - -] DHCP configuration for ports {'e54a55af-e0ef-444c-b7ef-ddc25d7bc13d'} is completed
Oct 14 10:01:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:01:27 np0005486759.ooo.test systemd[1]: tmp-crun.Q9gKDe.mount: Deactivated successfully.
Oct 14 10:01:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:27.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:27 np0005486759.ooo.test podman[313706]: 2025-10-14 10:01:27.489032654 +0000 UTC m=+0.115172412 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent)
Oct 14 10:01:27 np0005486759.ooo.test podman[313706]: 2025-10-14 10:01:27.493315925 +0000 UTC m=+0.119455723 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 10:01:27 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:01:28 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:28.511 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:28Z, description=, device_id=6ec7a3f4-e250-4e33-9f65-05b531382e94, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77ab20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77a190>], id=257bb082-fef3-40b7-9ebe-e81f601ac68f, ip_allocation=immediate, mac_address=fa:16:3e:95:87:f0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=422, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:28Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:28.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:28 np0005486759.ooo.test podman[313741]: 2025-10-14 10:01:28.77497248 +0000 UTC m=+0.104812789 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:01:28 np0005486759.ooo.test systemd[1]: tmp-crun.woEqgm.mount: Deactivated successfully.
Oct 14 10:01:28 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 6 addresses
Oct 14 10:01:28 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:28 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:29.019 287366 INFO neutron.agent.dhcp.agent [None req-86fd93b1-45cc-4840-969e-11f16c0cfae9 - - - - - -] DHCP configuration for ports {'257bb082-fef3-40b7-9ebe-e81f601ac68f'} is completed
Oct 14 10:01:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:29.026 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:28Z, description=, device_id=32e4d5dd-06db-4897-bc8d-076756fa3e06, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77adf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77adc0>], id=b5ec12b0-7130-4e8d-a811-6519dd71da49, ip_allocation=immediate, mac_address=fa:16:3e:8f:4d:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:21Z, description=, dns_domain=, id=09e16ed7-f945-41a6-8272-927947e04a34, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-644143256-network, port_security_enabled=True, project_id=8d76666f6fdc42e2bc8cc94f577392fc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58969, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=375, status=ACTIVE, subnets=['8aef6a1c-5082-41bf-8b62-97759a28d29c'], tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, updated_at=2025-10-14T10:01:23Z, vlan_transparent=None, network_id=09e16ed7-f945-41a6-8272-927947e04a34, port_security_enabled=False, project_id=8d76666f6fdc42e2bc8cc94f577392fc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=423, status=DOWN, tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, updated_at=2025-10-14T10:01:28Z on network 09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:01:29 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 1 addresses
Oct 14 10:01:29 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:01:29 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:01:29 np0005486759.ooo.test podman[313779]: 2025-10-14 10:01:29.243315038 +0000 UTC m=+0.057635978 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:01:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:29.495 287366 INFO neutron.agent.dhcp.agent [None req-6b083bc8-c9e5-42c7-b86e-b0b3c6e49782 - - - - - -] DHCP configuration for ports {'b5ec12b0-7130-4e8d-a811-6519dd71da49'} is completed
Oct 14 10:01:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:01:30 np0005486759.ooo.test podman[313799]: 2025-10-14 10:01:30.440480422 +0000 UTC m=+0.065567769 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 10:01:30 np0005486759.ooo.test podman[313799]: 2025-10-14 10:01:30.451456294 +0000 UTC m=+0.076543621 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 10:01:30 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:01:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:30.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:31.655 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:28Z, description=, device_id=32e4d5dd-06db-4897-bc8d-076756fa3e06, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7de730>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7de250>], id=b5ec12b0-7130-4e8d-a811-6519dd71da49, ip_allocation=immediate, mac_address=fa:16:3e:8f:4d:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:21Z, description=, dns_domain=, id=09e16ed7-f945-41a6-8272-927947e04a34, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-644143256-network, port_security_enabled=True, project_id=8d76666f6fdc42e2bc8cc94f577392fc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58969, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=375, status=ACTIVE, subnets=['8aef6a1c-5082-41bf-8b62-97759a28d29c'], tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, updated_at=2025-10-14T10:01:23Z, vlan_transparent=None, network_id=09e16ed7-f945-41a6-8272-927947e04a34, port_security_enabled=False, project_id=8d76666f6fdc42e2bc8cc94f577392fc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=423, status=DOWN, tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, updated_at=2025-10-14T10:01:28Z on network 09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:01:31 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 1 addresses
Oct 14 10:01:31 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:01:31 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:01:31 np0005486759.ooo.test podman[313838]: 2025-10-14 10:01:31.868638729 +0000 UTC m=+0.057743801 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:01:32 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 5 addresses
Oct 14 10:01:32 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:32 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:32 np0005486759.ooo.test podman[313871]: 2025-10-14 10:01:32.037098876 +0000 UTC m=+0.075304203 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:01:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:32.150 287366 INFO neutron.agent.dhcp.agent [None req-768798d3-ee01-4829-a336-f75fac764fbf - - - - - -] DHCP configuration for ports {'b5ec12b0-7130-4e8d-a811-6519dd71da49'} is completed
Oct 14 10:01:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:32Z|00064|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:01:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:32.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:33.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:34.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:36 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:01:36.511 2 INFO neutron.agent.securitygroups_rpc [None req-cf04f1ed-0e4f-4856-a109-ae7bc103b195 a8cfa1237415425aa7bde67e7d7ade73 834a916e71314172ade05e4d78237e31 - - default default] Security group member updated ['65d5e2a8-e8f0-4181-9526-0c3b76f5ca4f']
Oct 14 10:01:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:36.604 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:01:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:36.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:36.606 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:01:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:36.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:38.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:39.483 287366 INFO neutron.agent.linux.ip_lib [None req-bf676373-5e69-446c-b90a-1ac0ed12833a - - - - - -] Device tapdb9766f7-4d cannot be used as it has no MAC address
Oct 14 10:01:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:39.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:39 np0005486759.ooo.test kernel: device tapdb9766f7-4d entered promiscuous mode
Oct 14 10:01:39 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436099.5103] manager: (tapdb9766f7-4d): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Oct 14 10:01:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:39Z|00065|binding|INFO|Claiming lport db9766f7-4d15-4531-97fa-1ef591fa6dbb for this chassis.
Oct 14 10:01:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:39Z|00066|binding|INFO|db9766f7-4d15-4531-97fa-1ef591fa6dbb: Claiming unknown
Oct 14 10:01:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:39.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:39 np0005486759.ooo.test systemd-udevd[313906]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:01:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:39.523 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '834a916e71314172ade05e4d78237e31', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8635bf4-8c57-47fa-b4c8-6f2e10d84435, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=db9766f7-4d15-4531-97fa-1ef591fa6dbb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:01:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:39.525 183328 INFO neutron.agent.ovn.metadata.agent [-] Port db9766f7-4d15-4531-97fa-1ef591fa6dbb in datapath 16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad bound to our chassis
Oct 14 10:01:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:39.528 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:01:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:39.529 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[6c1dc423-1ed5-4a38-a330-188686aa6292]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:01:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:39Z|00067|binding|INFO|Setting lport db9766f7-4d15-4531-97fa-1ef591fa6dbb ovn-installed in OVS
Oct 14 10:01:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:39Z|00068|binding|INFO|Setting lport db9766f7-4d15-4531-97fa-1ef591fa6dbb up in Southbound
Oct 14 10:01:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:39.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:39.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:39.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:40 np0005486759.ooo.test podman[313961]: 
Oct 14 10:01:40 np0005486759.ooo.test podman[313961]: 2025-10-14 10:01:40.338267472 +0000 UTC m=+0.079947245 container create 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: Started libpod-conmon-37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72.scope.
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:01:40 np0005486759.ooo.test podman[313961]: 2025-10-14 10:01:40.291273827 +0000 UTC m=+0.032953660 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:01:40 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8b903af9346c0ad2357d7e5b36250d7ddf24bc1e6f3a0a0e86abf61f9dc4b93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:01:40 np0005486759.ooo.test podman[313961]: 2025-10-14 10:01:40.405996745 +0000 UTC m=+0.147676528 container init 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: tmp-crun.ewxwIu.mount: Deactivated successfully.
Oct 14 10:01:40 np0005486759.ooo.test podman[313961]: 2025-10-14 10:01:40.415015849 +0000 UTC m=+0.156695622 container start 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq[314020]: started, version 2.85 cachesize 150
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq[314020]: DNS service limited to local subnets
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq[314020]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq[314020]: warning: no upstream servers configured
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq-dhcp[314020]: DHCP, static leases only on 19.80.0.0, lease time 1d
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/addn_hosts - 0 addresses
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq-dhcp[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/host
Oct 14 10:01:40 np0005486759.ooo.test dnsmasq-dhcp[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/opts
Oct 14 10:01:40 np0005486759.ooo.test podman[313977]: 2025-10-14 10:01:40.458557829 +0000 UTC m=+0.085162993 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 14 10:01:40 np0005486759.ooo.test podman[313977]: 2025-10-14 10:01:40.469261683 +0000 UTC m=+0.095866817 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:01:40 np0005486759.ooo.test podman[313979]: 2025-10-14 10:01:40.517795575 +0000 UTC m=+0.137141279 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:01:40 np0005486759.ooo.test podman[313979]: 2025-10-14 10:01:40.527988083 +0000 UTC m=+0.147333787 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:01:40 np0005486759.ooo.test podman[313978]: 2025-10-14 10:01:40.623583132 +0000 UTC m=+0.249812145 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:01:40 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:40.640 287366 INFO neutron.agent.dhcp.agent [None req-f75b9973-f25a-4e09-8e7c-b2857888bc2c - - - - - -] DHCP configuration for ports {'a58c89e2-6b58-4f02-b3b4-2a06f1c493f7'} is completed
Oct 14 10:01:40 np0005486759.ooo.test podman[313976]: 2025-10-14 10:01:40.668154332 +0000 UTC m=+0.294717255 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:01:40 np0005486759.ooo.test podman[313978]: 2025-10-14 10:01:40.671546126 +0000 UTC m=+0.297775109 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:01:40 np0005486759.ooo.test podman[313976]: 2025-10-14 10:01:40.703724571 +0000 UTC m=+0.330287484 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:01:40 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:01:42 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:01:42.112 2 INFO neutron.agent.securitygroups_rpc [None req-633227af-8108-4900-8f35-e072a1527d80 a8cfa1237415425aa7bde67e7d7ade73 834a916e71314172ade05e4d78237e31 - - default default] Security group member updated ['65d5e2a8-e8f0-4181-9526-0c3b76f5ca4f']
Oct 14 10:01:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:42.181 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7870a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec787220>], id=9b19c6e4-d989-4a39-847b-4211ce60bfb4, ip_allocation=immediate, mac_address=fa:16:3e:9b:44:be, name=tempest-subport-2066385203, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:36Z, description=, dns_domain=, id=16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-618256394, port_security_enabled=True, project_id=834a916e71314172ade05e4d78237e31, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4725, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=461, status=ACTIVE, subnets=['63ebb50d-fa88-4100-9c4b-7dc8869940cb'], tags=[], tenant_id=834a916e71314172ade05e4d78237e31, updated_at=2025-10-14T10:01:38Z, vlan_transparent=None, network_id=16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, port_security_enabled=True, project_id=834a916e71314172ade05e4d78237e31, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['65d5e2a8-e8f0-4181-9526-0c3b76f5ca4f'], standard_attr_id=487, status=DOWN, tags=[], tenant_id=834a916e71314172ade05e4d78237e31, updated_at=2025-10-14T10:01:41Z on network 16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad
Oct 14 10:01:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:01:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:01:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:01:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 132496 "" "Go-http-client/1.1"
Oct 14 10:01:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:01:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17145 "" "Go-http-client/1.1"
Oct 14 10:01:42 np0005486759.ooo.test podman[314072]: 2025-10-14 10:01:42.449748716 +0000 UTC m=+0.049367857 container kill 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:01:42 np0005486759.ooo.test dnsmasq[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/addn_hosts - 1 addresses
Oct 14 10:01:42 np0005486759.ooo.test dnsmasq-dhcp[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/host
Oct 14 10:01:42 np0005486759.ooo.test dnsmasq-dhcp[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/opts
Oct 14 10:01:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:42.679 287366 INFO neutron.agent.dhcp.agent [None req-6c7925e6-424c-4174-b6f3-d3d7003c4e1d - - - - - -] DHCP configuration for ports {'9b19c6e4-d989-4a39-847b-4211ce60bfb4'} is completed
Oct 14 10:01:42 np0005486759.ooo.test podman[314109]: 2025-10-14 10:01:42.83350892 +0000 UTC m=+0.054060900 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:01:42 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 4 addresses
Oct 14 10:01:42 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:42 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:43Z|00069|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:43.441 287366 INFO neutron.agent.linux.ip_lib [None req-d7890897-0b14-4ccf-a30a-8e825fcf4fd3 - - - - - -] Device tap840ec343-33 cannot be used as it has no MAC address
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test kernel: device tap840ec343-33 entered promiscuous mode
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test systemd-udevd[314140]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:01:43 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436103.4956] manager: (tap840ec343-33): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Oct 14 10:01:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:43Z|00070|binding|INFO|Claiming lport 840ec343-3320-416e-b6a6-ad8ca4eed2e0 for this chassis.
Oct 14 10:01:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:43Z|00071|binding|INFO|840ec343-3320-416e-b6a6-ad8ca4eed2e0: Claiming unknown
Oct 14 10:01:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:43.507 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-6cc793dd-acc7-44e4-bdd4-014701e67c7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6cc793dd-acc7-44e4-bdd4-014701e67c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '955136e6fca841f8878ceb14a59dbd70', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95c36831-e8ae-446f-8ccc-33ee8cef9450, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=840ec343-3320-416e-b6a6-ad8ca4eed2e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:01:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:43.509 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 840ec343-3320-416e-b6a6-ad8ca4eed2e0 in datapath 6cc793dd-acc7-44e4-bdd4-014701e67c7c bound to our chassis
Oct 14 10:01:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:43Z|00072|binding|INFO|Setting lport 840ec343-3320-416e-b6a6-ad8ca4eed2e0 ovn-installed in OVS
Oct 14 10:01:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:43Z|00073|binding|INFO|Setting lport 840ec343-3320-416e-b6a6-ad8ca4eed2e0 up in Southbound
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:43.511 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6cc793dd-acc7-44e4-bdd4-014701e67c7c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:01:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:43.512 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e6a4a6-27d5-488c-b3cb-667ea739474b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:43.608 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:01:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:43.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:01:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:01:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:01:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:01:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:01:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:01:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:01:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:01:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:44.219 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:43Z, description=, device_id=986d5ee1-f198-4704-95cf-94daa12f3ff4, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c8280>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed0037c0>], id=e88e2b1b-3aaf-44f2-85aa-937e3b5fc1ac, ip_allocation=immediate, mac_address=fa:16:3e:6d:6b:a4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=504, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:44Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:44 np0005486759.ooo.test podman[314210]: 
Oct 14 10:01:44 np0005486759.ooo.test podman[314210]: 2025-10-14 10:01:44.441124258 +0000 UTC m=+0.098955530 container create a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:01:44 np0005486759.ooo.test systemd[1]: Started libpod-conmon-a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224.scope.
Oct 14 10:01:44 np0005486759.ooo.test podman[314210]: 2025-10-14 10:01:44.389924346 +0000 UTC m=+0.047755648 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:01:44 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:01:44 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31acb6c5a13602842bf40fe352c324c667e22fa9c5e97c8f0667694958dda71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 5 addresses
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:44 np0005486759.ooo.test podman[314223]: 2025-10-14 10:01:44.505854911 +0000 UTC m=+0.112557644 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 14 10:01:44 np0005486759.ooo.test podman[314210]: 2025-10-14 10:01:44.512390869 +0000 UTC m=+0.170222121 container init a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:01:44 np0005486759.ooo.test podman[314210]: 2025-10-14 10:01:44.522022421 +0000 UTC m=+0.179853673 container start a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq[314241]: started, version 2.85 cachesize 150
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq[314241]: DNS service limited to local subnets
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq[314241]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq[314241]: warning: no upstream servers configured
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq-dhcp[314241]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/addn_hosts - 0 addresses
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/host
Oct 14 10:01:44 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/opts
Oct 14 10:01:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:44.677 287366 INFO neutron.agent.dhcp.agent [None req-9822e5c5-ecd3-488c-947a-a9f63db232d1 - - - - - -] DHCP configuration for ports {'b66a8d88-2c24-4f1a-8001-6cca6fc85914'} is completed
Oct 14 10:01:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:44.830 287366 INFO neutron.agent.dhcp.agent [None req-754eaa13-13a4-4e4e-8072-2efaa5d130bc - - - - - -] DHCP configuration for ports {'e88e2b1b-3aaf-44f2-85aa-937e3b5fc1ac'} is completed
Oct 14 10:01:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:44.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:45.549 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:45Z, description=, device_id=a92fe436-66c6-4f7f-bdf3-96250c645581, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7a8af0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7a87f0>], id=33b4c989-70ad-4ae4-93e3-44fd159ebad0, ip_allocation=immediate, mac_address=fa:16:3e:c6:8b:e2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=511, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:45Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:45 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 6 addresses
Oct 14 10:01:45 np0005486759.ooo.test podman[314264]: 2025-10-14 10:01:45.759006663 +0000 UTC m=+0.056129133 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:01:45 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:45 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:45.981 287366 INFO neutron.agent.dhcp.agent [None req-1de752cd-11ee-4da7-819c-740e5e9080eb - - - - - -] DHCP configuration for ports {'33b4c989-70ad-4ae4-93e3-44fd159ebad0'} is completed
Oct 14 10:01:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:47.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:48.505 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:48Z, description=, device_id=a92fe436-66c6-4f7f-bdf3-96250c645581, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed0089d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed008790>], id=b4ce8a8c-2b78-4941-8039-36ad34cdb4a3, ip_allocation=immediate, mac_address=fa:16:3e:d1:1c:7a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:40Z, description=, dns_domain=, id=6cc793dd-acc7-44e4-bdd4-014701e67c7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-1245469105-network, port_security_enabled=True, project_id=955136e6fca841f8878ceb14a59dbd70, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39142, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=479, status=ACTIVE, subnets=['693e13b5-c857-4084-ade5-a99b0b80ec8d'], tags=[], tenant_id=955136e6fca841f8878ceb14a59dbd70, updated_at=2025-10-14T10:01:42Z, vlan_transparent=None, network_id=6cc793dd-acc7-44e4-bdd4-014701e67c7c, port_security_enabled=False, project_id=955136e6fca841f8878ceb14a59dbd70, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=531, status=DOWN, tags=[], tenant_id=955136e6fca841f8878ceb14a59dbd70, updated_at=2025-10-14T10:01:48Z on network 6cc793dd-acc7-44e4-bdd4-014701e67c7c
Oct 14 10:01:48 np0005486759.ooo.test podman[314303]: 2025-10-14 10:01:48.724681273 +0000 UTC m=+0.044746148 container kill a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:01:48 np0005486759.ooo.test dnsmasq[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/addn_hosts - 1 addresses
Oct 14 10:01:48 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/host
Oct 14 10:01:48 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/opts
Oct 14 10:01:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:01:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:01:48 np0005486759.ooo.test podman[314316]: 2025-10-14 10:01:48.796414028 +0000 UTC m=+0.052896505 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 10:01:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:48.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:48 np0005486759.ooo.test podman[314317]: 2025-10-14 10:01:48.858442188 +0000 UTC m=+0.112411929 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 14 10:01:48 np0005486759.ooo.test podman[314316]: 2025-10-14 10:01:48.863256694 +0000 UTC m=+0.119739191 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:01:48 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:01:48 np0005486759.ooo.test podman[314317]: 2025-10-14 10:01:48.896396099 +0000 UTC m=+0.150365900 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, config_id=edpm, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Oct 14 10:01:48 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:01:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:48.968 287366 INFO neutron.agent.dhcp.agent [None req-eed4b52c-3ab0-4c7f-8b05-070484892f68 - - - - - -] DHCP configuration for ports {'b4ce8a8c-2b78-4941-8039-36ad34cdb4a3'} is completed
Oct 14 10:01:49 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:49.262 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:48Z, description=, device_id=a92fe436-66c6-4f7f-bdf3-96250c645581, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec950c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec8e93d0>], id=b4ce8a8c-2b78-4941-8039-36ad34cdb4a3, ip_allocation=immediate, mac_address=fa:16:3e:d1:1c:7a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:40Z, description=, dns_domain=, id=6cc793dd-acc7-44e4-bdd4-014701e67c7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-1245469105-network, port_security_enabled=True, project_id=955136e6fca841f8878ceb14a59dbd70, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39142, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=479, status=ACTIVE, subnets=['693e13b5-c857-4084-ade5-a99b0b80ec8d'], tags=[], tenant_id=955136e6fca841f8878ceb14a59dbd70, updated_at=2025-10-14T10:01:42Z, vlan_transparent=None, network_id=6cc793dd-acc7-44e4-bdd4-014701e67c7c, port_security_enabled=False, project_id=955136e6fca841f8878ceb14a59dbd70, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=531, status=DOWN, tags=[], tenant_id=955136e6fca841f8878ceb14a59dbd70, updated_at=2025-10-14T10:01:48Z on network 6cc793dd-acc7-44e4-bdd4-014701e67c7c
Oct 14 10:01:49 np0005486759.ooo.test podman[314385]: 2025-10-14 10:01:49.445260569 +0000 UTC m=+0.049902564 container kill a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:01:49 np0005486759.ooo.test dnsmasq[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/addn_hosts - 1 addresses
Oct 14 10:01:49 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/host
Oct 14 10:01:49 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/opts
Oct 14 10:01:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27757 DF PROTO=TCP SPT=41452 DPT=9102 SEQ=2714064176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA1FEB70000000001030307) 
Oct 14 10:01:49 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:49.617 287366 INFO neutron.agent.dhcp.agent [None req-bfbaada7-8ab8-4a08-81a2-0c11a00e8d2c - - - - - -] DHCP configuration for ports {'b4ce8a8c-2b78-4941-8039-36ad34cdb4a3'} is completed
Oct 14 10:01:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27758 DF PROTO=TCP SPT=41452 DPT=9102 SEQ=2714064176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA202C10000000001030307) 
Oct 14 10:01:51 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:51.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:52 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:52.509 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:51Z, description=, device_id=ac268d64-9d27-4edd-9628-7c310d3939ed, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec787b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec787a00>], id=2feefc36-f67a-4e22-b6ab-e33a14077fb5, ip_allocation=immediate, mac_address=fa:16:3e:7f:63:8e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=558, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:52Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27759 DF PROTO=TCP SPT=41452 DPT=9102 SEQ=2714064176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA20AC20000000001030307) 
Oct 14 10:01:52 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 7 addresses
Oct 14 10:01:52 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:52 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:52 np0005486759.ooo.test systemd[1]: tmp-crun.qX4i8E.mount: Deactivated successfully.
Oct 14 10:01:52 np0005486759.ooo.test podman[314425]: 2025-10-14 10:01:52.679747668 +0000 UTC m=+0.042098477 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:01:53 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:53.020 287366 INFO neutron.agent.dhcp.agent [None req-bb143cb5-2b8b-4959-9c9f-217122f0aee5 - - - - - -] DHCP configuration for ports {'2feefc36-f67a-4e22-b6ab-e33a14077fb5'} is completed
Oct 14 10:01:53 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:01:53.583 2 INFO neutron.agent.securitygroups_rpc [None req-8f06fc8b-e08e-472f-9983-ff2c035f2cff fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Security group member updated ['1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec']
Oct 14 10:01:53 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:53.612 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:53Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7874f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7875e0>], id=f0846fb5-52f8-482b-a868-67d618105942, ip_allocation=immediate, mac_address=fa:16:3e:26:62:d5, name=tempest-parent-650561809, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:21Z, description=, dns_domain=, id=09e16ed7-f945-41a6-8272-927947e04a34, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-644143256-network, port_security_enabled=True, project_id=8d76666f6fdc42e2bc8cc94f577392fc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58969, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=375, status=ACTIVE, subnets=['8aef6a1c-5082-41bf-8b62-97759a28d29c'], tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, updated_at=2025-10-14T10:01:23Z, vlan_transparent=None, network_id=09e16ed7-f945-41a6-8272-927947e04a34, port_security_enabled=True, project_id=8d76666f6fdc42e2bc8cc94f577392fc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec'], standard_attr_id=568, status=DOWN, tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, updated_at=2025-10-14T10:01:53Z on network 09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:01:53 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 2 addresses
Oct 14 10:01:53 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:01:53 np0005486759.ooo.test podman[314464]: 2025-10-14 10:01:53.815703797 +0000 UTC m=+0.062938390 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:01:53 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:01:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:53.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:53.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:54 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:54.080 287366 INFO neutron.agent.dhcp.agent [None req-6832fb90-33f0-44f4-98f6-ba941794038f - - - - - -] DHCP configuration for ports {'f0846fb5-52f8-482b-a868-67d618105942'} is completed
Oct 14 10:01:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:54.168 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:01:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:54.169 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:01:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:54.169 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:01:54 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:54.561 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:53Z, description=, device_id=51626a97-bd66-4962-b35e-37f9211b32dd, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d5250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d5730>], id=1d455df5-3441-4587-a827-6f541e2b403b, ip_allocation=immediate, mac_address=fa:16:3e:7a:2b:75, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=572, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:54Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:54 np0005486759.ooo.test podman[314501]: 2025-10-14 10:01:54.750881689 +0000 UTC m=+0.047218813 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:01:54 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 8 addresses
Oct 14 10:01:54 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:54 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:01:54 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:54.936 287366 INFO neutron.agent.dhcp.agent [None req-931ea5fe-4cbe-4b94-865a-cde0166763c0 - - - - - -] DHCP configuration for ports {'1d455df5-3441-4587-a827-6f541e2b403b'} is completed
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:55.473 287366 INFO neutron.agent.linux.ip_lib [None req-5fce8404-0c67-4c73-9e52-ae2b1b948efa - - - - - -] Device tap6d4626d2-57 cannot be used as it has no MAC address
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:55 np0005486759.ooo.test kernel: device tap6d4626d2-57 entered promiscuous mode
Oct 14 10:01:55 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436115.4994] manager: (tap6d4626d2-57): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Oct 14 10:01:55 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:55Z|00074|binding|INFO|Claiming lport 6d4626d2-5777-47c9-8ea9-9420398d14e8 for this chassis.
Oct 14 10:01:55 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:55Z|00075|binding|INFO|6d4626d2-5777-47c9-8ea9-9420398d14e8: Claiming unknown
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:55 np0005486759.ooo.test systemd-udevd[314534]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:01:55 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:55Z|00076|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:01:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:55.513 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-f98d8b09-2b11-47b6-bc40-2162696497fe', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d8b09-2b11-47b6-bc40-2162696497fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26d268655eb841889a5364f4c9458bb1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b31c3823-8631-4731-a22d-331bcdcf2fd1, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=6d4626d2-5777-47c9-8ea9-9420398d14e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:01:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:55.514 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 6d4626d2-5777-47c9-8ea9-9420398d14e8 in datapath f98d8b09-2b11-47b6-bc40-2162696497fe bound to our chassis
Oct 14 10:01:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:55.515 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f98d8b09-2b11-47b6-bc40-2162696497fe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:01:55 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:01:55.516 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a0703c40-9a12-49c0-9dcd-6c3d3838f57c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:55 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:55Z|00077|binding|INFO|Setting lport 6d4626d2-5777-47c9-8ea9-9420398d14e8 ovn-installed in OVS
Oct 14 10:01:55 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:01:55Z|00078|binding|INFO|Setting lport 6d4626d2-5777-47c9-8ea9-9420398d14e8 up in Southbound
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:55.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:56 np0005486759.ooo.test podman[314589]: 
Oct 14 10:01:56 np0005486759.ooo.test podman[314589]: 2025-10-14 10:01:56.374629845 +0000 UTC m=+0.082451291 container create 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 10:01:56 np0005486759.ooo.test systemd[1]: Started libpod-conmon-7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72.scope.
Oct 14 10:01:56 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:01:56 np0005486759.ooo.test podman[314589]: 2025-10-14 10:01:56.335018944 +0000 UTC m=+0.042840340 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:01:56 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23fac9c54707156ef47ec793c7cd376156d9fa611c2e8b67c3d66b88657e590a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:01:56 np0005486759.ooo.test podman[314589]: 2025-10-14 10:01:56.448304648 +0000 UTC m=+0.156126034 container init 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 10:01:56 np0005486759.ooo.test podman[314589]: 2025-10-14 10:01:56.457351023 +0000 UTC m=+0.165172409 container start 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq[314608]: started, version 2.85 cachesize 150
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq[314608]: DNS service limited to local subnets
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq[314608]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq[314608]: warning: no upstream servers configured
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq-dhcp[314608]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/addn_hosts - 0 addresses
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/host
Oct 14 10:01:56 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/opts
Oct 14 10:01:56 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:56.607 287366 INFO neutron.agent.dhcp.agent [None req-0c13d258-1031-461d-9e33-22ea2545da68 - - - - - -] DHCP configuration for ports {'fb0404ef-bfee-46c3-80d8-5d30dad473a7'} is completed
Oct 14 10:01:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27760 DF PROTO=TCP SPT=41452 DPT=9102 SEQ=2714064176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA21A820000000001030307) 
Oct 14 10:01:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:57.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:58 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:01:58 np0005486759.ooo.test podman[314609]: 2025-10-14 10:01:58.423915965 +0000 UTC m=+0.058159885 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:01:58 np0005486759.ooo.test podman[314609]: 2025-10-14 10:01:58.433203246 +0000 UTC m=+0.067447156 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:01:58 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:01:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:01:58.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:01:59 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:01:59.695 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:58Z, description=, device_id=57baa8e3-ec67-48c6-8ee3-398516efea54, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f2250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f26d0>], id=72fdda2f-a488-46d0-bfbd-bb2b0d950b5a, ip_allocation=immediate, mac_address=fa:16:3e:f4:40:eb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=594, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:01:59Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:01:59 np0005486759.ooo.test podman[314643]: 2025-10-14 10:01:59.925283902 +0000 UTC m=+0.057630029 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:01:59 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 9 addresses
Oct 14 10:01:59 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:01:59 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:00 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:00.176 287366 INFO neutron.agent.dhcp.agent [None req-460f2218-79fe-4cfa-8d10-4aa59c9a402f - - - - - -] DHCP configuration for ports {'72fdda2f-a488-46d0-bfbd-bb2b0d950b5a'} is completed
Oct 14 10:02:01 np0005486759.ooo.test dnsmasq[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/addn_hosts - 0 addresses
Oct 14 10:02:01 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/host
Oct 14 10:02:01 np0005486759.ooo.test dnsmasq-dhcp[314241]: read /var/lib/neutron/dhcp/6cc793dd-acc7-44e4-bdd4-014701e67c7c/opts
Oct 14 10:02:01 np0005486759.ooo.test podman[314682]: 2025-10-14 10:02:01.211421153 +0000 UTC m=+0.057493014 container kill a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:02:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:02:01 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:01.301 2 INFO neutron.agent.securitygroups_rpc [None req-21d20b88-9ff8-4a95-aa3a-237a971ae34b fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Security group member updated ['1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec']
Oct 14 10:02:01 np0005486759.ooo.test systemd[1]: tmp-crun.DGZFeA.mount: Deactivated successfully.
Oct 14 10:02:01 np0005486759.ooo.test podman[314696]: 2025-10-14 10:02:01.340674592 +0000 UTC m=+0.096392904 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:02:01 np0005486759.ooo.test podman[314696]: 2025-10-14 10:02:01.347758916 +0000 UTC m=+0.103477318 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:02:01 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:02:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:01Z|00079|binding|INFO|Releasing lport 840ec343-3320-416e-b6a6-ad8ca4eed2e0 from this chassis (sb_readonly=0)
Oct 14 10:02:01 np0005486759.ooo.test kernel: device tap840ec343-33 left promiscuous mode
Oct 14 10:02:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:01Z|00080|binding|INFO|Setting lport 840ec343-3320-416e-b6a6-ad8ca4eed2e0 down in Southbound
Oct 14 10:02:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:01.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:01.396 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-6cc793dd-acc7-44e4-bdd4-014701e67c7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6cc793dd-acc7-44e4-bdd4-014701e67c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '955136e6fca841f8878ceb14a59dbd70', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95c36831-e8ae-446f-8ccc-33ee8cef9450, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=840ec343-3320-416e-b6a6-ad8ca4eed2e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:01.398 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 840ec343-3320-416e-b6a6-ad8ca4eed2e0 in datapath 6cc793dd-acc7-44e4-bdd4-014701e67c7c unbound from our chassis
Oct 14 10:02:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:01.402 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6cc793dd-acc7-44e4-bdd4-014701e67c7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:01.403 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[4ffa12a7-2cb5-493a-a985-51a56d274662]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:01.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:02.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:03.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:03 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 8 addresses
Oct 14 10:02:03 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:03 np0005486759.ooo.test podman[314745]: 2025-10-14 10:02:03.974852732 +0000 UTC m=+0.053866764 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:03 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:03 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:03Z|00081|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:04.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:04.042 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:03Z, description=, device_id=57baa8e3-ec67-48c6-8ee3-398516efea54, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d5fa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d5070>], id=4c4f0385-9b9e-45f2-9699-8d1f66d80ce3, ip_allocation=immediate, mac_address=fa:16:3e:68:9d:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:52Z, description=, dns_domain=, id=f98d8b09-2b11-47b6-bc40-2162696497fe, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-818999075-network, port_security_enabled=True, project_id=26d268655eb841889a5364f4c9458bb1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50569, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=564, status=ACTIVE, subnets=['9e9ab387-59ec-40dd-9f2c-ed8983d33d84'], tags=[], tenant_id=26d268655eb841889a5364f4c9458bb1, updated_at=2025-10-14T10:01:54Z, vlan_transparent=None, network_id=f98d8b09-2b11-47b6-bc40-2162696497fe, port_security_enabled=False, project_id=26d268655eb841889a5364f4c9458bb1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=610, status=DOWN, tags=[], tenant_id=26d268655eb841889a5364f4c9458bb1, updated_at=2025-10-14T10:02:03Z on network f98d8b09-2b11-47b6-bc40-2162696497fe
Oct 14 10:02:04 np0005486759.ooo.test dnsmasq[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/addn_hosts - 1 addresses
Oct 14 10:02:04 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/host
Oct 14 10:02:04 np0005486759.ooo.test podman[314782]: 2025-10-14 10:02:04.247615492 +0000 UTC m=+0.057900137 container kill 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:04 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/opts
Oct 14 10:02:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:04.432 287366 INFO neutron.agent.dhcp.agent [None req-a26899d3-59cd-4b59-b1d3-f0734971085c - - - - - -] DHCP configuration for ports {'4c4f0385-9b9e-45f2-9699-8d1f66d80ce3'} is completed
Oct 14 10:02:04 np0005486759.ooo.test dnsmasq[314241]: exiting on receipt of SIGTERM
Oct 14 10:02:04 np0005486759.ooo.test podman[314820]: 2025-10-14 10:02:04.537163739 +0000 UTC m=+0.059215536 container kill a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:02:04 np0005486759.ooo.test systemd[1]: libpod-a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224.scope: Deactivated successfully.
Oct 14 10:02:04 np0005486759.ooo.test podman[314832]: 2025-10-14 10:02:04.600274962 +0000 UTC m=+0.051790400 container died a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:04 np0005486759.ooo.test podman[314832]: 2025-10-14 10:02:04.634281224 +0000 UTC m=+0.085796652 container cleanup a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:02:04 np0005486759.ooo.test systemd[1]: libpod-conmon-a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224.scope: Deactivated successfully.
Oct 14 10:02:04 np0005486759.ooo.test podman[314834]: 2025-10-14 10:02:04.689274471 +0000 UTC m=+0.132688184 container remove a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6cc793dd-acc7-44e4-bdd4-014701e67c7c, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 10:02:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:04.743 287366 INFO neutron.agent.dhcp.agent [None req-e5d4f391-f8e6-49b8-b7d8-4a99f16528e2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:04.743 287366 INFO neutron.agent.dhcp.agent [None req-e5d4f391-f8e6-49b8-b7d8-4a99f16528e2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c31acb6c5a13602842bf40fe352c324c667e22fa9c5e97c8f0667694958dda71-merged.mount: Deactivated successfully.
Oct 14 10:02:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3de59b2904b3a9f1ba61157409efbb2275d6fd0f6c1fc26bcf5a178b9038224-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:04 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d6cc793dd\x2dacc7\x2d44e4\x2dbdd4\x2d014701e67c7c.mount: Deactivated successfully.
Oct 14 10:02:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:05.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.165 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.166 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.188 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 14 10:02:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:06.271 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:03Z, description=, device_id=57baa8e3-ec67-48c6-8ee3-398516efea54, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec795eb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec9047c0>], id=4c4f0385-9b9e-45f2-9699-8d1f66d80ce3, ip_allocation=immediate, mac_address=fa:16:3e:68:9d:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:01:52Z, description=, dns_domain=, id=f98d8b09-2b11-47b6-bc40-2162696497fe, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-818999075-network, port_security_enabled=True, project_id=26d268655eb841889a5364f4c9458bb1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50569, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=564, status=ACTIVE, subnets=['9e9ab387-59ec-40dd-9f2c-ed8983d33d84'], tags=[], tenant_id=26d268655eb841889a5364f4c9458bb1, updated_at=2025-10-14T10:01:54Z, vlan_transparent=None, network_id=f98d8b09-2b11-47b6-bc40-2162696497fe, port_security_enabled=False, project_id=26d268655eb841889a5364f4c9458bb1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=610, status=DOWN, tags=[], tenant_id=26d268655eb841889a5364f4c9458bb1, updated_at=2025-10-14T10:02:03Z on network f98d8b09-2b11-47b6-bc40-2162696497fe
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.272 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.273 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.278 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.279 2 INFO nova.compute.claims [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Claim successful on node np0005486759.ooo.test
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.394 2 DEBUG nova.compute.provider_tree [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.406 2 DEBUG nova.scheduler.client.report [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.425 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.426 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 14 10:02:06 np0005486759.ooo.test dnsmasq[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/addn_hosts - 1 addresses
Oct 14 10:02:06 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/host
Oct 14 10:02:06 np0005486759.ooo.test podman[314879]: 2025-10-14 10:02:06.457938323 +0000 UTC m=+0.039213097 container kill 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 10:02:06 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/opts
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.464 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.465 2 DEBUG nova.network.neutron [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.485 2 INFO nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.507 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.586 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.588 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.588 2 INFO nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Creating image(s)
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.589 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "/var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.589 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "/var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.590 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "/var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.590 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "000bf511c6a7cacff2419cc4dc1d8f1ec4916605" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.591 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "000bf511c6a7cacff2419cc4dc1d8f1ec4916605" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:06.697 287366 INFO neutron.agent.dhcp.agent [None req-78da3cb8-1055-4782-b7b9-313aa79dba00 - - - - - -] DHCP configuration for ports {'4c4f0385-9b9e-45f2-9699-8d1f66d80ce3'} is completed
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.743 2 WARNING oslo_policy.policy [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.744 2 WARNING oslo_policy.policy [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 14 10:02:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:06.751 2 DEBUG nova.policy [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd4e3504d2bf41249414f9a3763d5c72', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d76666f6fdc42e2bc8cc94f577392fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 14 10:02:07 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:07.460 2 INFO neutron.agent.securitygroups_rpc [req-5da11188-0fc7-49c3-aa49-ac928c21a5b1 req-b8a28444-b119-4805-abed-41094b2d7a2e 7720a7b9f9f54c579f611e42089e782a 09aca07900bb49e594e78b919e107851 - - default default] Security group rule updated ['70bdee88-950d-4694-8c33-af49b86626c0']
Oct 14 10:02:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:07.895 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:07.967 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.part --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:07.968 2 DEBUG nova.virt.images [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] 49de46ee-0386-41b3-98c9-c8f724edf634 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Oct 14 10:02:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:07.970 2 DEBUG nova.privsep.utils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 14 10:02:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:07.970 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.part /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.145 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.part /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.converted" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.149 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.212 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605.converted --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.214 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "000bf511c6a7cacff2419cc4dc1d8f1ec4916605" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.238 2 INFO oslo.privsep.daemon [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpf4txj2vv/privsep.sock']
Oct 14 10:02:08 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:08.310 2 INFO neutron.agent.securitygroups_rpc [None req-f4952ce4-a176-4023-a9f6-c5bd1d7b67d4 ba199bcacf074d76b36d60dc12d13cb5 70d4052729ff41f39eda195c1d7973bb - - default default] Security group rule updated ['905940d5-3dbf-414d-b49e-ee26a374bb24']
Oct 14 10:02:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:08.516 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005486759.ooo.test, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:53Z, description=, device_id=e9b02497-1e5d-4944-95c9-d833637e1968, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec815490>], dns_domain=, dns_name=tempest-liveautoblockmigrationv225test-server-185379272, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec8156d0>], id=f0846fb5-52f8-482b-a868-67d618105942, ip_allocation=immediate, mac_address=fa:16:3e:26:62:d5, name=tempest-parent-650561809, network_id=09e16ed7-f945-41a6-8272-927947e04a34, port_security_enabled=True, project_id=8d76666f6fdc42e2bc8cc94f577392fc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec'], standard_attr_id=568, status=DOWN, tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7ae4f0>], trunk_id=31feb8c2-5f97-44fd-b688-06cf05b67451, updated_at=2025-10-14T10:02:07Z on network 09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:02:08 np0005486759.ooo.test systemd[1]: tmp-crun.ehQ1AN.mount: Deactivated successfully.
Oct 14 10:02:08 np0005486759.ooo.test podman[314934]: 2025-10-14 10:02:08.733155441 +0000 UTC m=+0.064854439 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 14 10:02:08 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 2 addresses
Oct 14 10:02:08 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:02:08 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.923 2 INFO oslo.privsep.daemon [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Spawned new privsep daemon via rootwrap
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.803 169 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.809 169 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.813 169 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 14 10:02:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:08.813 169 INFO oslo.privsep.daemon [-] privsep daemon running as pid 169
Oct 14 10:02:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:08.954 287366 INFO neutron.agent.dhcp.agent [None req-079a538b-885d-4114-9c3d-366102363278 - - - - - -] DHCP configuration for ports {'f0846fb5-52f8-482b-a868-67d618105942'} is completed
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.025 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:09 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:09.039 2 INFO neutron.agent.securitygroups_rpc [None req-d1d60bb8-892a-4387-b63f-e2c7d21f871b ba199bcacf074d76b36d60dc12d13cb5 70d4052729ff41f39eda195c1d7973bb - - default default] Security group rule updated ['905940d5-3dbf-414d-b49e-ee26a374bb24']
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.101 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.102 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "000bf511c6a7cacff2419cc4dc1d8f1ec4916605" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.104 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "000bf511c6a7cacff2419cc4dc1d8f1ec4916605" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.126 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.187 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.188 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605,backing_fmt=raw /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.213 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605,backing_fmt=raw /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.214 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "000bf511c6a7cacff2419cc4dc1d8f1ec4916605" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.215 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.262 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/000bf511c6a7cacff2419cc4dc1d8f1ec4916605 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.263 2 DEBUG nova.virt.disk.api [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Checking if we can resize image /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.264 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:09 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:09.264 2 INFO neutron.agent.securitygroups_rpc [req-d1fc2080-6431-4384-bb4c-45008cbb52f9 req-cf3390b3-1417-42f2-9d61-e3791ecb6002 7720a7b9f9f54c579f611e42089e782a 09aca07900bb49e594e78b919e107851 - - default default] Security group rule updated ['e3feb9c3-7bd8-402e-a05d-d3fd7dacd895']
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.312 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.313 2 DEBUG nova.virt.disk.api [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Cannot resize image /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.314 2 DEBUG nova.objects.instance [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lazy-loading 'migration_context' on Instance uuid e9b02497-1e5d-4944-95c9-d833637e1968 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.329 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.329 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Ensure instance console log exists: /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.329 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.330 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:09.330 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:10.540 2 DEBUG nova.network.neutron [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Successfully updated port: f0846fb5-52f8-482b-a868-67d618105942 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 14 10:02:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:10.561 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "refresh_cache-e9b02497-1e5d-4944-95c9-d833637e1968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:02:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:10.562 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquired lock "refresh_cache-e9b02497-1e5d-4944-95c9-d833637e1968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:02:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:10.562 2 DEBUG nova.network.neutron [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 14 10:02:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:10.696 2 DEBUG nova.network.neutron [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 14 10:02:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:10.999 2 DEBUG nova.compute.manager [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received event network-changed-f0846fb5-52f8-482b-a868-67d618105942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:10.999 2 DEBUG nova.compute.manager [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Refreshing instance network info cache due to event network-changed-f0846fb5-52f8-482b-a868-67d618105942. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.000 2 DEBUG oslo_concurrency.lockutils [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "refresh_cache-e9b02497-1e5d-4944-95c9-d833637e1968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:02:11 np0005486759.ooo.test podman[314982]: 2025-10-14 10:02:11.45684992 +0000 UTC m=+0.068409537 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:11 np0005486759.ooo.test podman[314982]: 2025-10-14 10:02:11.469669662 +0000 UTC m=+0.081229289 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: tmp-crun.TFqmAX.mount: Deactivated successfully.
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.500 2 DEBUG nova.network.neutron [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Updating instance_info_cache with network_info: [{"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:02:11 np0005486759.ooo.test podman[314974]: 2025-10-14 10:02:11.505552026 +0000 UTC m=+0.129559823 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:02:11 np0005486759.ooo.test podman[314975]: 2025-10-14 10:02:11.519378968 +0000 UTC m=+0.136997871 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0)
Oct 14 10:02:11 np0005486759.ooo.test podman[314975]: 2025-10-14 10:02:11.530491147 +0000 UTC m=+0.148110040 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.532 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Releasing lock "refresh_cache-e9b02497-1e5d-4944-95c9-d833637e1968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.532 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Instance network_info: |[{"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.533 2 DEBUG oslo_concurrency.lockutils [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquired lock "refresh_cache-e9b02497-1e5d-4944-95c9-d833637e1968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.533 2 DEBUG nova.network.neutron [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Refreshing network info cache for port f0846fb5-52f8-482b-a868-67d618105942 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.536 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Start _get_guest_xml network_info=[{"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-14T09:59:41Z,direct_url=<?>,disk_format='qcow2',id=49de46ee-0386-41b3-98c9-c8f724edf634,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8bf64e81a4214f9490d231a2e79ab3d8',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-14T09:59:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '49de46ee-0386-41b3-98c9-c8f724edf634'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 14 10:02:11 np0005486759.ooo.test podman[314974]: 2025-10-14 10:02:11.540276526 +0000 UTC m=+0.164284333 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.540 2 WARNING nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.543 2 DEBUG nova.virt.libvirt.host [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Searching host: 'np0005486759.ooo.test' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.543 2 DEBUG nova.virt.libvirt.host [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.545 2 DEBUG nova.virt.libvirt.host [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Searching host: 'np0005486759.ooo.test' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.545 2 DEBUG nova.virt.libvirt.host [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.545 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.546 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-14T09:59:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d980ffd-f0d2-447f-9f57-bb0ac0e49099',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-14T09:59:41Z,direct_url=<?>,disk_format='qcow2',id=49de46ee-0386-41b3-98c9-c8f724edf634,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8bf64e81a4214f9490d231a2e79ab3d8',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-14T09:59:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.546 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.546 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.546 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.546 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.547 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.547 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.547 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.547 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.547 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.547 2 DEBUG nova.virt.hardware [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.550 2 DEBUG nova.virt.libvirt.vif [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T10:02:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-185379272',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005486759.ooo.test',hostname='tempest-liveautoblockmigrationv225test-server-185379272',id=3,image_ref='49de46ee-0386-41b3-98c9-c8f724edf634',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005486759.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d76666f6fdc42e2bc8cc94f577392fc',ramdisk_id='',reservation_id='r-08njpxn1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='49de46ee-0386-41b3-98c9-c8f724edf634',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-505072292',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-505072292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-14T10:02:06Z,user_data=None,user_id='fd4e3504d2bf41249414f9a3763d5c72',uuid=e9b02497-1e5d-4944-95c9-d833637e1968,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.551 2 DEBUG nova.network.os_vif_util [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Converting VIF {"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.551 2 DEBUG nova.network.os_vif_util [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:62:d5,bridge_name='br-int',has_traffic_filtering=True,id=f0846fb5-52f8-482b-a868-67d618105942,network=Network(09e16ed7-f945-41a6-8272-927947e04a34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf0846fb5-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.552 2 DEBUG nova.objects.instance [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lazy-loading 'pci_devices' on Instance uuid e9b02497-1e5d-4944-95c9-d833637e1968 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.564 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] End _get_guest_xml xml=<domain type="kvm">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <uuid>e9b02497-1e5d-4944-95c9-d833637e1968</uuid>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <name>instance-00000003</name>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <memory>131072</memory>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <vcpu>1</vcpu>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <metadata>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-185379272</nova:name>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <nova:creationTime>2025-10-14 10:02:11</nova:creationTime>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <nova:flavor name="m1.nano">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:memory>128</nova:memory>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:disk>1</nova:disk>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:swap>0</nova:swap>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:ephemeral>0</nova:ephemeral>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:vcpus>1</nova:vcpus>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       </nova:flavor>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <nova:owner>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:user uuid="fd4e3504d2bf41249414f9a3763d5c72">tempest-LiveAutoBlockMigrationV225Test-505072292-project-member</nova:user>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:project uuid="8d76666f6fdc42e2bc8cc94f577392fc">tempest-LiveAutoBlockMigrationV225Test-505072292</nova:project>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       </nova:owner>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <nova:root type="image" uuid="49de46ee-0386-41b3-98c9-c8f724edf634"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <nova:ports>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         <nova:port uuid="f0846fb5-52f8-482b-a868-67d618105942">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:         </nova:port>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       </nova:ports>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </nova:instance>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   </metadata>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <sysinfo type="smbios">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <system>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <entry name="manufacturer">RDO</entry>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <entry name="product">OpenStack Compute</entry>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <entry name="serial">e9b02497-1e5d-4944-95c9-d833637e1968</entry>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <entry name="uuid">e9b02497-1e5d-4944-95c9-d833637e1968</entry>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <entry name="family">Virtual Machine</entry>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </system>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   </sysinfo>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <os>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <boot dev="hd"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <smbios mode="sysinfo"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   </os>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <features>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <acpi/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <apic/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <vmcoreinfo/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   </features>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <clock offset="utc">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <timer name="pit" tickpolicy="delay"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <timer name="hpet" present="no"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   </clock>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <cpu mode="host-model" match="exact">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <topology sockets="1" cores="1" threads="1"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   </cpu>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   <devices>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <disk type="file" device="disk">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <source file="/var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <target dev="vda" bus="virtio"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <disk type="file" device="cdrom">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <driver name="qemu" type="raw" cache="none"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <source file="/var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk.config"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <target dev="sda" bus="sata"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </disk>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <interface type="ethernet">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <mac address="fa:16:3e:26:62:d5"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <model type="virtio"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <driver name="vhost" rx_queue_size="512"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <mtu size="1442"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <target dev="tapf0846fb5-52"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </interface>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <serial type="pty">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <log file="/var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/console.log" append="off"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </serial>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <video>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <model type="virtio"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </video>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <input type="tablet" bus="usb"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <rng model="virtio">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <backend model="random">/dev/urandom</backend>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </rng>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="pci" model="pcie-root-port"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <controller type="usb" index="0"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     <memballoon model="virtio">
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:       <stats period="10"/>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:     </memballoon>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:   </devices>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: </domain>
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.565 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Preparing to wait for external event network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.566 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.566 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.566 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.567 2 DEBUG nova.virt.libvirt.vif [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-14T10:02:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-185379272',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005486759.ooo.test',hostname='tempest-liveautoblockmigrationv225test-server-185379272',id=3,image_ref='49de46ee-0386-41b3-98c9-c8f724edf634',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005486759.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d76666f6fdc42e2bc8cc94f577392fc',ramdisk_id='',reservation_id='r-08njpxn1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='49de46ee-0386-41b3-98c9-c8f724edf634',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-505072292',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-505072292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-14T10:02:06Z,user_data=None,user_id='fd4e3504d2bf41249414f9a3763d5c72',uuid=e9b02497-1e5d-4944-95c9-d833637e1968,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.567 2 DEBUG nova.network.os_vif_util [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Converting VIF {"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.567 2 DEBUG nova.network.os_vif_util [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:62:d5,bridge_name='br-int',has_traffic_filtering=True,id=f0846fb5-52f8-482b-a868-67d618105942,network=Network(09e16ed7-f945-41a6-8272-927947e04a34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf0846fb5-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.568 2 DEBUG os_vif [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:62:d5,bridge_name='br-int',has_traffic_filtering=True,id=f0846fb5-52f8-482b-a868-67d618105942,network=Network(09e16ed7-f945-41a6-8272-927947e04a34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf0846fb5-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.572 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0846fb5-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.572 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0846fb5-52, col_values=(('external_ids', {'iface-id': 'f0846fb5-52f8-482b-a868-67d618105942', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:62:d5', 'vm-uuid': 'e9b02497-1e5d-4944-95c9-d833637e1968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:11 np0005486759.ooo.test podman[314980]: 2025-10-14 10:02:11.576086618 +0000 UTC m=+0.186750818 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.581 2 INFO os_vif [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:62:d5,bridge_name='br-int',has_traffic_filtering=True,id=f0846fb5-52f8-482b-a868-67d618105942,network=Network(09e16ed7-f945-41a6-8272-927947e04a34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf0846fb5-52')
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:02:11 np0005486759.ooo.test podman[314980]: 2025-10-14 10:02:11.610271681 +0000 UTC m=+0.220935861 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.620 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.620 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.620 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] No VIF found with MAC fa:16:3e:26:62:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.621 2 INFO nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Using config drive
Oct 14 10:02:11 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.872 2 INFO nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Creating config drive at /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk.config
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.877 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc1cypl7i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:11.996 2 DEBUG oslo_concurrency.processutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc1cypl7i" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:12 np0005486759.ooo.test kernel: device tapf0846fb5-52 entered promiscuous mode
Oct 14 10:02:12 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436132.0577] manager: (tapf0846fb5-52): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00082|binding|INFO|Claiming lport f0846fb5-52f8-482b-a868-67d618105942 for this chassis.
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00083|binding|INFO|f0846fb5-52f8-482b-a868-67d618105942: Claiming fa:16:3e:26:62:d5 10.100.0.14
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00084|binding|INFO|Claiming lport 9eae4210-d420-4e29-b9a5-561483f0559f for this chassis.
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00085|binding|INFO|9eae4210-d420-4e29-b9a5-561483f0559f: Claiming fa:16:3e:eb:54:e8 19.80.0.225
Oct 14 10:02:12 np0005486759.ooo.test systemd-udevd[315066]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:02:12 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436132.0830] device (tapf0846fb5-52): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Oct 14 10:02:12 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436132.0837] device (tapf0846fb5-52): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.075 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:54:e8 19.80.0.225'], port_security=['fa:16:3e:eb:54:e8 19.80.0.225'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['f0846fb5-52f8-482b-a868-67d618105942'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-85842672', 'neutron:cidrs': '19.80.0.225/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-85842672', 'neutron:project_id': '8d76666f6fdc42e2bc8cc94f577392fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b0443e-a40b-4b63-bb20-e38126b7a2fa, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=9eae4210-d420-4e29-b9a5-561483f0559f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.078 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:62:d5 10.100.0.14'], port_security=['fa:16:3e:26:62:d5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-650561809', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e9b02497-1e5d-4944-95c9-d833637e1968', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e16ed7-f945-41a6-8272-927947e04a34', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-650561809', 'neutron:project_id': '8d76666f6fdc42e2bc8cc94f577392fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5707afc1-8cb1-498a-ae11-72c1640f3fbb, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=f0846fb5-52f8-482b-a868-67d618105942) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.080 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 9eae4210-d420-4e29-b9a5-561483f0559f in datapath 5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf bound to our chassis
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.084 183328 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf
Oct 14 10:02:12 np0005486759.ooo.test systemd-machined[93972]: New machine qemu-3-instance-00000003.
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.095 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[f7894dd0-c415-4ad5-9946-e30feb9569a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.096 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5b1c3cdf-d1 in ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.098 183433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5b1c3cdf-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.098 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8ca459-ed26-45b8-a848-94caf6f2203d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.100 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[be98f1af-3699-4f4a-b3cd-d9a3b71a709b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00086|binding|INFO|Setting lport f0846fb5-52f8-482b-a868-67d618105942 ovn-installed in OVS
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00087|binding|INFO|Setting lport f0846fb5-52f8-482b-a868-67d618105942 up in Southbound
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00088|binding|INFO|Setting lport 9eae4210-d420-4e29-b9a5-561483f0559f up in Southbound
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.112 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[47c9d58e-e7f8-4fd2-b6d7-e9d85d475153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.123 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[6b135246-0c30-48ce-9d1b-4c5c92cb14a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:12.147 2 INFO neutron.agent.securitygroups_rpc [req-dd56f953-d40c-4e70-a8f4-9b3f1cd3e9e4 req-8d827125-56a7-48af-96e3-7f56e8990b36 7720a7b9f9f54c579f611e42089e782a 09aca07900bb49e594e78b919e107851 - - default default] Security group rule updated ['35f0f016-b422-4bfe-a2bd-f8ee87d423e2']
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.159 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[744010e6-8757-4a4c-bf42-99910f5738b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.163 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a63c4da5-04a7-4afc-93e0-3b22abd6c9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test systemd-udevd[315070]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:02:12 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436132.1673] manager: (tap5b1c3cdf-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.193 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ef3417-f246-43bb-a8a9-f7d8dd2effbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.196 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[58b18117-0266-4ad7-b212-cb71521e79de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap5b1c3cdf-d1: link becomes ready
Oct 14 10:02:12 np0005486759.ooo.test kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap5b1c3cdf-d0: link becomes ready
Oct 14 10:02:12 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436132.2170] device (tap5b1c3cdf-d0): carrier: link connected
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.222 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[da80c05f-7d80-4e8c-b25a-fd28ef4c0a07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.237 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[879fdbc3-b2ec-455d-9921-f53ff951828e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b1c3cdf-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:0d:88:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1215233, 'reachable_time': 23681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315109, 'error': None, 'target': 'ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.252 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[77af3e0b-a96b-4482-8778-a0897c9262fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0d:889b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1215233, 'tstamp': 1215233}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315110, 'error': None, 'target': 'ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.268 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[5a28bcc9-5478-45cc-a7ce-78caba983a13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5b1c3cdf-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:0d:88:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1215233, 'reachable_time': 23681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315112, 'error': None, 'target': 'ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:02:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.301 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[80a66db0-63f1-4967-8221-63fb9cb7b9ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:02:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 134320 "" "Go-http-client/1.1"
Oct 14 10:02:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:02:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17639 "" "Go-http-client/1.1"
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.373 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[dd32672d-b842-4025-a32a-a0c4f1307f90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.375 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b1c3cdf-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.375 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.376 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b1c3cdf-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test kernel: device tap5b1c3cdf-d0 entered promiscuous mode
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.382 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5b1c3cdf-d0, col_values=(('external_ids', {'iface-id': 'f5793126-1b86-4d21-9ae0-680fbb943ec2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00089|binding|INFO|Releasing lport f5793126-1b86-4d21-9ae0-680fbb943ec2 from this chassis (sb_readonly=0)
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.393 183328 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.394 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3eae0d-a6f6-40be-86aa-0ef209f4f061]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.394 183328 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: global
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     log         /dev/log local0 debug
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     log-tag     haproxy-metadata-proxy-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     user        root
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     group       root
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     maxconn     1024
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     pidfile     /var/lib/neutron/external/pids/5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf.pid.haproxy
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     daemon
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: defaults
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     log global
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     mode http
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     option httplog
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     option dontlognull
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     option http-server-close
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     option forwardfor
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     retries                 3
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-request    30s
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout connect         30s
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout client          32s
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout server          32s
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-keep-alive 30s
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: listen listener
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     bind 169.254.169.254:80
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     server metadata /var/lib/neutron/metadata_proxy
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:     http-request add-header X-OVN-Network-ID 5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.395 183328 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'env', 'PROCESS_TAG=haproxy-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 14 10:02:12 np0005486759.ooo.test systemd[1]: tmp-crun.3XDwy4.mount: Deactivated successfully.
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.616 2 DEBUG nova.network.neutron [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Updated VIF entry in instance network info cache for port f0846fb5-52f8-482b-a868-67d618105942. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.617 2 DEBUG nova.network.neutron [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Updating instance_info_cache with network_info: [{"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.673 2 DEBUG oslo_concurrency.lockutils [req-fd0270f9-2723-45ed-bc7b-b9c6467198be req-c8cc017a-e20e-40a2-a991-0041e462b138 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Releasing lock "refresh_cache-e9b02497-1e5d-4944-95c9-d833637e1968" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.674 2 DEBUG nova.virt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Emitting event <LifecycleEvent: 1760436132.6724136, e9b02497-1e5d-4944-95c9-d833637e1968 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.675 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] VM Started (Lifecycle Event)
Oct 14 10:02:12 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:12.679 2 INFO neutron.agent.securitygroups_rpc [None req-44749805-6692-4077-bfdb-760d301c8570 a8cfa1237415425aa7bde67e7d7ade73 834a916e71314172ade05e4d78237e31 - - default default] Security group member updated ['65d5e2a8-e8f0-4181-9526-0c3b76f5ca4f']
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.746 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.750 2 DEBUG nova.virt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Emitting event <LifecycleEvent: 1760436132.6726127, e9b02497-1e5d-4944-95c9-d833637e1968 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.751 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] VM Paused (Lifecycle Event)
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.773 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.775 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.794 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 14 10:02:12 np0005486759.ooo.test podman[315145]: 2025-10-14 10:02:12.864476031 +0000 UTC m=+0.089390857 container create 2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 10:02:12 np0005486759.ooo.test systemd[1]: Started libpod-conmon-2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214.scope.
Oct 14 10:02:12 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:02:12 np0005486759.ooo.test podman[315145]: 2025-10-14 10:02:12.8247747 +0000 UTC m=+0.049689566 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 10:02:12 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea7dff615121a6c63a84d227b207a5c97352101829ffff1ae6ff64163a4cbd04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:02:12 np0005486759.ooo.test podman[315145]: 2025-10-14 10:02:12.933556959 +0000 UTC m=+0.158471775 container init 2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:12 np0005486759.ooo.test podman[315145]: 2025-10-14 10:02:12.939276114 +0000 UTC m=+0.164190920 container start 2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:12.941 287366 INFO neutron.agent.linux.ip_lib [None req-be8a68c0-02ee-44c9-9598-45ae96a91326 - - - - - -] Device tap1d74e03b-2f cannot be used as it has no MAC address
Oct 14 10:02:12 np0005486759.ooo.test podman[315179]: 2025-10-14 10:02:12.95160769 +0000 UTC m=+0.057556007 container kill 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:02:12 np0005486759.ooo.test dnsmasq[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/addn_hosts - 0 addresses
Oct 14 10:02:12 np0005486759.ooo.test dnsmasq-dhcp[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/host
Oct 14 10:02:12 np0005486759.ooo.test dnsmasq-dhcp[314020]: read /var/lib/neutron/dhcp/16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad/opts
Oct 14 10:02:12 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [NOTICE]   (315197) : New worker (315203) forked
Oct 14 10:02:12 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [NOTICE]   (315197) : Loading success.
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test kernel: device tap1d74e03b-2f entered promiscuous mode
Oct 14 10:02:12 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436132.9717] manager: (tap1d74e03b-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00090|binding|INFO|Claiming lport 1d74e03b-2f19-4709-af07-2da4cb4c3708 for this chassis.
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00091|binding|INFO|1d74e03b-2f19-4709-af07-2da4cb4c3708: Claiming unknown
Oct 14 10:02:12 np0005486759.ooo.test systemd-udevd[315098]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:02:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:12.984 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-0db638ae-88a6-4fa2-853d-a956040be6ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0db638ae-88a6-4fa2-853d-a956040be6ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05152bc18068402cb671831d9378c8e3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c9f951a-7803-40da-9634-699425e89ff8, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=1d74e03b-2f19-4709-af07-2da4cb4c3708) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00092|binding|INFO|Setting lport 1d74e03b-2f19-4709-af07-2da4cb4c3708 ovn-installed in OVS
Oct 14 10:02:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:12Z|00093|binding|INFO|Setting lport 1d74e03b-2f19-4709-af07-2da4cb4c3708 up in Southbound
Oct 14 10:02:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:12.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.002 183328 INFO neutron.agent.ovn.metadata.agent [-] Port f0846fb5-52f8-482b-a868-67d618105942 in datapath 09e16ed7-f945-41a6-8272-927947e04a34 bound to our chassis
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.005 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port ce0d57f5-ba8e-4f4b-b860-4ddff8dc52e0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.005 183328 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.010 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[77372ad5-8a92-46e0-b4af-34ba0188f139]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.012 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09e16ed7-f1 in ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.013 183433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09e16ed7-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.013 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[7be1c893-0104-46ef-bfaa-51c357dc08a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.014 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[6ffd2571-f7c8-4e3f-9d95-f8a79f555487]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.020 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[96155d35-95b3-4763-82fb-195716168910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.029 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[125de2da-e0ae-4e83-be61-fda727c1e519]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.048 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[25a7b54d-62c9-41ac-9df8-08867536c15a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436133.0528] manager: (tap09e16ed7-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.053 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[95b274b4-3ec0-4234-b854-1d364bb5e766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.071 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[357c64a4-0f2c-4595-9a85-c134a09839a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.073 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[3cba5531-7915-4bbe-b8d9-71e12ce9e5d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap09e16ed7-f0: link becomes ready
Oct 14 10:02:13 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436133.0880] device (tap09e16ed7-f0): carrier: link connected
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.091 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[04b947a2-11b6-4988-b445-3d83cbd90467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.104 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[18ed8e06-7d37-4a7f-91b5-e88e25033fe4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09e16ed7-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c7:41:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1215320, 'reachable_time': 35208, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315240, 'error': None, 'target': 'ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.112 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[f084311f-b5da-4dfb-9e15-2639f09fb4d1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:41b7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1215320, 'tstamp': 1215320}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315241, 'error': None, 'target': 'ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.123 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[849f36fe-b4e0-4410-bb87-70eed11c041a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09e16ed7-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c7:41:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1215320, 'reachable_time': 35208, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315242, 'error': None, 'target': 'ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.143 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[bc332cce-30fd-4b76-a0e2-2c53224addbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.184 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[294b0042-a397-4f06-aefa-6f38388fa85f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.185 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09e16ed7-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.186 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.186 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09e16ed7-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test kernel: device tap09e16ed7-f0 entered promiscuous mode
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.190 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09e16ed7-f0, col_values=(('external_ids', {'iface-id': '342cf9cd-72d7-444c-a483-046dfbb42d73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:13Z|00094|binding|INFO|Releasing lport 342cf9cd-72d7-444c-a483-046dfbb42d73 from this chassis (sb_readonly=0)
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.197 183328 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09e16ed7-f945-41a6-8272-927947e04a34.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09e16ed7-f945-41a6-8272-927947e04a34.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.198 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a87f6401-e2ee-43d8-aede-156516d69dc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.198 183328 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: global
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     log         /dev/log local0 debug
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     log-tag     haproxy-metadata-proxy-09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     user        root
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     group       root
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     maxconn     1024
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     pidfile     /var/lib/neutron/external/pids/09e16ed7-f945-41a6-8272-927947e04a34.pid.haproxy
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     daemon
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: defaults
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     log global
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     mode http
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     option httplog
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     option dontlognull
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     option http-server-close
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     option forwardfor
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     retries                 3
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-request    30s
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout connect         30s
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout client          32s
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout server          32s
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-keep-alive 30s
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: listen listener
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     bind 169.254.169.254:80
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     server metadata /var/lib/neutron/metadata_proxy
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:     http-request add-header X-OVN-Network-ID 09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.199 183328 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34', 'env', 'PROCESS_TAG=haproxy-09e16ed7-f945-41a6-8272-927947e04a34', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09e16ed7-f945-41a6-8272-927947e04a34.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 14 10:02:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:13Z|00095|binding|INFO|Removing iface tapdb9766f7-4d ovn-installed in OVS
Oct 14 10:02:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:13Z|00096|binding|INFO|Removing lport db9766f7-4d15-4531-97fa-1ef591fa6dbb ovn-installed in OVS
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.302 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3dd0bba6-216f-4586-aa0a-97246e7d338c with type ""
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.304 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '834a916e71314172ade05e4d78237e31', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8635bf4-8c57-47fa-b4c8-6f2e10d84435, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=db9766f7-4d15-4531-97fa-1ef591fa6dbb) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq[314020]: exiting on receipt of SIGTERM
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: libpod-37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72.scope: Deactivated successfully.
Oct 14 10:02:13 np0005486759.ooo.test podman[315279]: 2025-10-14 10:02:13.316815921 +0000 UTC m=+0.061817167 container kill 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3)
Oct 14 10:02:13 np0005486759.ooo.test podman[315301]: 2025-10-14 10:02:13.359698849 +0000 UTC m=+0.030395238 container died 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:02:13 np0005486759.ooo.test podman[315301]: 2025-10-14 10:02:13.441807195 +0000 UTC m=+0.112503584 container cleanup 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: libpod-conmon-37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72.scope: Deactivated successfully.
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: tmp-crun.NHF8kw.mount: Deactivated successfully.
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d8b903af9346c0ad2357d7e5b36250d7ddf24bc1e6f3a0a0e86abf61f9dc4b93-merged.mount: Deactivated successfully.
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:13 np0005486759.ooo.test podman[315300]: 2025-10-14 10:02:13.471393437 +0000 UTC m=+0.140334542 container remove 37aad6f3cc6e16ff890f753195fe0803bbbc0b06d4d62809ecd7791101f2aa72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test kernel: device tapdb9766f7-4d left promiscuous mode
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d16b6a091\x2d5c3b\x2d4dc5\x2d9ed5\x2dd4a183ed59ad.mount: Deactivated successfully.
Oct 14 10:02:13 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:13.542 287366 INFO neutron.agent.dhcp.agent [None req-aa7cf5fd-5d78-4b41-ab95-4b599ac7e158 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:13 np0005486759.ooo.test podman[315355]: 
Oct 14 10:02:13 np0005486759.ooo.test podman[315355]: 2025-10-14 10:02:13.56624364 +0000 UTC m=+0.057553767 container create 82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: Started libpod-conmon-82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467.scope.
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:02:13 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2622d0773d2a8926ec3d8807055c6680071512ba37bf830e0582b71aa44fbc0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:02:13 np0005486759.ooo.test podman[315355]: 2025-10-14 10:02:13.615061049 +0000 UTC m=+0.106371176 container init 82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:13 np0005486759.ooo.test podman[315355]: 2025-10-14 10:02:13.620529456 +0000 UTC m=+0.111839583 container start 82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 10:02:13 np0005486759.ooo.test podman[315355]: 2025-10-14 10:02:13.535527193 +0000 UTC m=+0.026837340 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 10:02:13 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [NOTICE]   (315374) : New worker (315376) forked
Oct 14 10:02:13 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [NOTICE]   (315374) : Loading success.
Oct 14 10:02:13 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:13.679 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.682 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 1d74e03b-2f19-4709-af07-2da4cb4c3708 in datapath 0db638ae-88a6-4fa2-853d-a956040be6ba unbound from our chassis
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.684 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0db638ae-88a6-4fa2-853d-a956040be6ba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.684 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[3a9bf545-30ea-459c-a83b-28549811a78b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.685 183328 INFO neutron.agent.ovn.metadata.agent [-] Port db9766f7-4d15-4531-97fa-1ef591fa6dbb in datapath 16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad unbound from our chassis
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.686 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16b6a091-5c3b-4dc5-9ed5-d4a183ed59ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:13.687 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[51386868-dec8-465a-91ef-1ce79ce45064]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:13 np0005486759.ooo.test podman[315407]: 
Oct 14 10:02:13 np0005486759.ooo.test podman[315407]: 2025-10-14 10:02:13.85469459 +0000 UTC m=+0.073766491 container create c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: Started libpod-conmon-c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4.scope.
Oct 14 10:02:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:13Z|00097|binding|INFO|Releasing lport 342cf9cd-72d7-444c-a483-046dfbb42d73 from this chassis (sb_readonly=0)
Oct 14 10:02:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:13Z|00098|binding|INFO|Releasing lport f5793126-1b86-4d21-9ae0-680fbb943ec2 from this chassis (sb_readonly=0)
Oct 14 10:02:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:13Z|00099|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:13 np0005486759.ooo.test podman[315407]: 2025-10-14 10:02:13.816188885 +0000 UTC m=+0.035260806 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:02:13 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d630102e5d6d1e2df7d0096f3bb820e6e664af971634dce70b0c61f474c7d83/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:02:13 np0005486759.ooo.test podman[315407]: 2025-10-14 10:02:13.939788175 +0000 UTC m=+0.158860066 container init c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:02:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:13.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:13 np0005486759.ooo.test podman[315407]: 2025-10-14 10:02:13.946758258 +0000 UTC m=+0.165830149 container start c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3)
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq[315426]: started, version 2.85 cachesize 150
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq[315426]: DNS service limited to local subnets
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq[315426]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq[315426]: warning: no upstream servers configured
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq-dhcp[315426]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/addn_hosts - 0 addresses
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/host
Oct 14 10:02:13 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/opts
Oct 14 10:02:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:02:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:02:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:02:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:02:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:02:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:02:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:02:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:14.037 287366 INFO neutron.agent.dhcp.agent [None req-66ac10a8-3b0b-429a-a1e7-85d1dba045c5 - - - - - -] DHCP configuration for ports {'0e1d2cbd-73ec-471d-a917-54f95d45a02e'} is completed
Oct 14 10:02:14 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:14.077 2 INFO neutron.agent.securitygroups_rpc [req-967bd2fb-3b10-4e72-8a5d-1174f34a0e8f req-8bcc4102-9da3-46e7-a4a6-4dfc594c6250 7720a7b9f9f54c579f611e42089e782a 09aca07900bb49e594e78b919e107851 - - default default] Security group rule updated ['ae4c5915-1a8e-4f22-ad7f-2d5dfc84cc18']
Oct 14 10:02:14 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:14.172 2 INFO neutron.agent.securitygroups_rpc [None req-4aca8743-9c7e-4155-ad82-bb44b1b7723c a8cfa1237415425aa7bde67e7d7ade73 834a916e71314172ade05e4d78237e31 - - default default] Security group member updated ['65d5e2a8-e8f0-4181-9526-0c3b76f5ca4f']
Oct 14 10:02:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:14.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:14.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:14 np0005486759.ooo.test systemd[1]: tmp-crun.OfyPzM.mount: Deactivated successfully.
Oct 14 10:02:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:14.583 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:14Z, description=, device_id=ee67851a-dad2-414b-b8fe-b6a2637354ce, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c8970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ecffbbe0>], id=c576aa1c-ea30-45dd-ad99-f205bf71f40a, ip_allocation=immediate, mac_address=fa:16:3e:64:db:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=666, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:02:14Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:02:14 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 9 addresses
Oct 14 10:02:14 np0005486759.ooo.test podman[315444]: 2025-10-14 10:02:14.788484306 +0000 UTC m=+0.043773146 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 10:02:14 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:14 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:14.994 287366 INFO neutron.agent.dhcp.agent [None req-26a3df51-cd73-4e5f-b34a-5d03d6d8ad8e - - - - - -] DHCP configuration for ports {'c576aa1c-ea30-45dd-ad99-f205bf71f40a'} is completed
Oct 14 10:02:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:15.187 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:15 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:15.850 2 INFO neutron.agent.securitygroups_rpc [req-abdcf86a-2071-466f-9118-700f9373608e req-43c1e8e8-8d7c-49b8-bdd3-e67c7a479fe7 7720a7b9f9f54c579f611e42089e782a 09aca07900bb49e594e78b919e107851 - - default default] Security group rule updated ['6f41196a-be52-4bc1-abfc-acd5108b98d9']
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:16 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:16.112 2 INFO neutron.agent.securitygroups_rpc [req-24425bec-c0ba-404e-a5e1-9546bd360534 req-f5d23381-9dbf-497c-9005-7bfb736d6943 ba199bcacf074d76b36d60dc12d13cb5 70d4052729ff41f39eda195c1d7973bb - - default default] Security group member updated ['905940d5-3dbf-414d-b49e-ee26a374bb24']
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.192 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.214 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.215 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.215 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.216 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.305 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:16 np0005486759.ooo.test dnsmasq[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/addn_hosts - 0 addresses
Oct 14 10:02:16 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/host
Oct 14 10:02:16 np0005486759.ooo.test podman[315484]: 2025-10-14 10:02:16.319947965 +0000 UTC m=+0.042648212 container kill 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2)
Oct 14 10:02:16 np0005486759.ooo.test dnsmasq-dhcp[314608]: read /var/lib/neutron/dhcp/f98d8b09-2b11-47b6-bc40-2162696497fe/opts
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.357 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.358 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:16 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:16.416 2 INFO neutron.agent.securitygroups_rpc [req-0c691f75-0fb9-4dab-ace3-975d72bb4657 req-87b5ad67-d11b-4009-9ea6-49926cca414f 7720a7b9f9f54c579f611e42089e782a 09aca07900bb49e594e78b919e107851 - - default default] Security group rule updated ['6f41196a-be52-4bc1-abfc-acd5108b98d9']
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.459 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.461 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:16Z|00100|binding|INFO|Releasing lport 6d4626d2-5777-47c9-8ea9-9420398d14e8 from this chassis (sb_readonly=0)
Oct 14 10:02:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:16Z|00101|binding|INFO|Setting lport 6d4626d2-5777-47c9-8ea9-9420398d14e8 down in Southbound
Oct 14 10:02:16 np0005486759.ooo.test kernel: device tap6d4626d2-57 left promiscuous mode
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:16.492 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-f98d8b09-2b11-47b6-bc40-2162696497fe', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d8b09-2b11-47b6-bc40-2162696497fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26d268655eb841889a5364f4c9458bb1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b31c3823-8631-4731-a22d-331bcdcf2fd1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=6d4626d2-5777-47c9-8ea9-9420398d14e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:16.493 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 6d4626d2-5777-47c9-8ea9-9420398d14e8 in datapath f98d8b09-2b11-47b6-bc40-2162696497fe unbound from our chassis
Oct 14 10:02:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:16.497 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f98d8b09-2b11-47b6-bc40-2162696497fe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:16.497 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[c7bd4342-cd0d-4dee-bb7a-5e9f1aa871d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.510 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.511 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.569 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.574 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.637 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.638 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.679 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.821 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.822 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12184MB free_disk=386.67631912231445GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.823 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.823 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.916 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.916 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance e9b02497-1e5d-4944-95c9-d833637e1968 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.917 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.917 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1152MB phys_disk=399GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:02:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:16.984 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:02:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:17.000 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:02:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:17.025 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:02:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:17.026 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:17 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:17.176 2 INFO neutron.agent.securitygroups_rpc [req-96038753-2469-4691-ad23-202c0a6ec6e0 req-8521299f-e211-47c8-8ade-14986d280a71 7720a7b9f9f54c579f611e42089e782a 09aca07900bb49e594e78b919e107851 - - default default] Security group rule updated ['6f41196a-be52-4bc1-abfc-acd5108b98d9']
Oct 14 10:02:17 np0005486759.ooo.test podman[315541]: 2025-10-14 10:02:17.598609672 +0000 UTC m=+0.042344453 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:02:17 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 8 addresses
Oct 14 10:02:17 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:17 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:17 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:17.614 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:17Z, description=, device_id=ee67851a-dad2-414b-b8fe-b6a2637354ce, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec830790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed07adc0>], id=5622843f-c089-4dbe-94db-9152f782d261, ip_allocation=immediate, mac_address=fa:16:3e:c7:c0:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:02:09Z, description=, dns_domain=, id=0db638ae-88a6-4fa2-853d-a956040be6ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-65642612-network, port_security_enabled=True, project_id=05152bc18068402cb671831d9378c8e3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36089, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=635, status=ACTIVE, subnets=['645615e2-4929-4cdd-9b6e-9825c783d93e'], tags=[], tenant_id=05152bc18068402cb671831d9378c8e3, updated_at=2025-10-14T10:02:11Z, vlan_transparent=None, network_id=0db638ae-88a6-4fa2-853d-a956040be6ba, port_security_enabled=False, project_id=05152bc18068402cb671831d9378c8e3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=673, status=DOWN, tags=[], tenant_id=05152bc18068402cb671831d9378c8e3, updated_at=2025-10-14T10:02:17Z on network 0db638ae-88a6-4fa2-853d-a956040be6ba
Oct 14 10:02:17 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:17Z|00102|binding|INFO|Releasing lport 342cf9cd-72d7-444c-a483-046dfbb42d73 from this chassis (sb_readonly=0)
Oct 14 10:02:17 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:17Z|00103|binding|INFO|Releasing lport f5793126-1b86-4d21-9ae0-680fbb943ec2 from this chassis (sb_readonly=0)
Oct 14 10:02:17 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:17Z|00104|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:17.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:17 np0005486759.ooo.test dnsmasq[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/addn_hosts - 1 addresses
Oct 14 10:02:17 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/host
Oct 14 10:02:17 np0005486759.ooo.test podman[315577]: 2025-10-14 10:02:17.814086775 +0000 UTC m=+0.054721220 container kill c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:02:17 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/opts
Oct 14 10:02:17 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:17.962 287366 INFO neutron.agent.dhcp.agent [None req-a4adb5da-f74e-4367-85fe-f0d35bff9d58 - - - - - -] DHCP configuration for ports {'5622843f-c089-4dbe-94db-9152f782d261'} is completed
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.024 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.025 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.025 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.049 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.088 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.089 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.089 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.090 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.769 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.789 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.789 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.790 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.791 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.792 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:02:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:18.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:02:19 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:02:19 np0005486759.ooo.test podman[315600]: 2025-10-14 10:02:19.467932218 +0000 UTC m=+0.091434651 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 14 10:02:19 np0005486759.ooo.test podman[315601]: 2025-10-14 10:02:19.441746149 +0000 UTC m=+0.065162809 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 14 10:02:19 np0005486759.ooo.test podman[315600]: 2025-10-14 10:02:19.503375819 +0000 UTC m=+0.126878202 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 14 10:02:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36420 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=4176765796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA273E80000000001030307) 
Oct 14 10:02:19 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:02:19 np0005486759.ooo.test podman[315601]: 2025-10-14 10:02:19.524321758 +0000 UTC m=+0.147738428 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Oct 14 10:02:19 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.398 2 DEBUG nova.compute.manager [req-b45982cf-6511-4f27-a3ce-986f247d9bcf req-7eda7fed-c89c-4b3e-a8a8-eddd3ec42cec f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received event network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.400 2 DEBUG oslo_concurrency.lockutils [req-b45982cf-6511-4f27-a3ce-986f247d9bcf req-7eda7fed-c89c-4b3e-a8a8-eddd3ec42cec f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.400 2 DEBUG oslo_concurrency.lockutils [req-b45982cf-6511-4f27-a3ce-986f247d9bcf req-7eda7fed-c89c-4b3e-a8a8-eddd3ec42cec f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.401 2 DEBUG oslo_concurrency.lockutils [req-b45982cf-6511-4f27-a3ce-986f247d9bcf req-7eda7fed-c89c-4b3e-a8a8-eddd3ec42cec f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.401 2 DEBUG nova.compute.manager [req-b45982cf-6511-4f27-a3ce-986f247d9bcf req-7eda7fed-c89c-4b3e-a8a8-eddd3ec42cec f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Processing event network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.403 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.408 2 DEBUG nova.virt.driver [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] Emitting event <LifecycleEvent: 1760436140.4082694, e9b02497-1e5d-4944-95c9-d833637e1968 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.409 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] VM Resumed (Lifecycle Event)
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.411 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.415 2 INFO nova.virt.libvirt.driver [-] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Instance spawned successfully.
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.416 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.433 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.443 2 DEBUG nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.450 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.451 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.452 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.453 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.454 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.455 2 DEBUG nova.virt.libvirt.driver [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.464 2 INFO nova.compute.manager [None req-6f7d86ad-8a16-4bb2-8a33-54e0d2216dbf - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.522 2 INFO nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Took 13.94 seconds to spawn the instance on the hypervisor.
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.524 2 DEBUG nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 10:02:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36421 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=4176765796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA278010000000001030307) 
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.607 2 INFO nova.compute.manager [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Took 14.36 seconds to build instance.
Oct 14 10:02:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:20.622 2 DEBUG oslo_concurrency.lockutils [None req-87c66d07-2c37-4fec-9bec-7a0bbda73985 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:21 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:21.476 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:17Z, description=, device_id=ee67851a-dad2-414b-b8fe-b6a2637354ce, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77ab80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77afd0>], id=5622843f-c089-4dbe-94db-9152f782d261, ip_allocation=immediate, mac_address=fa:16:3e:c7:c0:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:02:09Z, description=, dns_domain=, id=0db638ae-88a6-4fa2-853d-a956040be6ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-65642612-network, port_security_enabled=True, project_id=05152bc18068402cb671831d9378c8e3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36089, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=635, status=ACTIVE, subnets=['645615e2-4929-4cdd-9b6e-9825c783d93e'], tags=[], tenant_id=05152bc18068402cb671831d9378c8e3, updated_at=2025-10-14T10:02:11Z, vlan_transparent=None, network_id=0db638ae-88a6-4fa2-853d-a956040be6ba, port_security_enabled=False, project_id=05152bc18068402cb671831d9378c8e3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=673, status=DOWN, tags=[], tenant_id=05152bc18068402cb671831d9378c8e3, updated_at=2025-10-14T10:02:17Z on network 0db638ae-88a6-4fa2-853d-a956040be6ba
Oct 14 10:02:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:21.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:21 np0005486759.ooo.test dnsmasq[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/addn_hosts - 1 addresses
Oct 14 10:02:21 np0005486759.ooo.test podman[315661]: 2025-10-14 10:02:21.732987536 +0000 UTC m=+0.084399345 container kill c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:02:21 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/host
Oct 14 10:02:21 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/opts
Oct 14 10:02:22 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:22.080 287366 INFO neutron.agent.dhcp.agent [None req-1d179301-474b-47f0-b74b-b3daaf08de3d - - - - - -] DHCP configuration for ports {'5622843f-c089-4dbe-94db-9152f782d261'} is completed
Oct 14 10:02:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:22.471 2 DEBUG nova.compute.manager [req-b8cd59ce-dc95-4ea2-90f5-b472ac37441b req-2f1f98b0-2c42-49bb-9043-deb418e930d6 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received event network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 10:02:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:22.471 2 DEBUG oslo_concurrency.lockutils [req-b8cd59ce-dc95-4ea2-90f5-b472ac37441b req-2f1f98b0-2c42-49bb-9043-deb418e930d6 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:22.471 2 DEBUG oslo_concurrency.lockutils [req-b8cd59ce-dc95-4ea2-90f5-b472ac37441b req-2f1f98b0-2c42-49bb-9043-deb418e930d6 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:22.472 2 DEBUG oslo_concurrency.lockutils [req-b8cd59ce-dc95-4ea2-90f5-b472ac37441b req-2f1f98b0-2c42-49bb-9043-deb418e930d6 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:22.472 2 DEBUG nova.compute.manager [req-b8cd59ce-dc95-4ea2-90f5-b472ac37441b req-2f1f98b0-2c42-49bb-9043-deb418e930d6 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] No waiting events found dispatching network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 14 10:02:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:22.472 2 WARNING nova.compute.manager [req-b8cd59ce-dc95-4ea2-90f5-b472ac37441b req-2f1f98b0-2c42-49bb-9043-deb418e930d6 f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received unexpected event network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 for instance with vm_state active and task_state None.
Oct 14 10:02:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36422 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=4176765796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA280020000000001030307) 
Oct 14 10:02:22 np0005486759.ooo.test podman[315700]: 2025-10-14 10:02:22.608180125 +0000 UTC m=+0.041772255 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 10:02:22 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 7 addresses
Oct 14 10:02:22 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:22 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:22 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:22Z|00105|binding|INFO|Releasing lport 342cf9cd-72d7-444c-a483-046dfbb42d73 from this chassis (sb_readonly=0)
Oct 14 10:02:22 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:22Z|00106|binding|INFO|Releasing lport f5793126-1b86-4d21-9ae0-680fbb943ec2 from this chassis (sb_readonly=0)
Oct 14 10:02:22 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:22Z|00107|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:22.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 10:02:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 10:02:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.355 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.356 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.356 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.357 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.357 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.358 2 INFO nova.compute.manager [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Terminating instance
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.359 2 DEBUG nova.compute.manager [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 14 10:02:23 np0005486759.ooo.test kernel: device tapf0846fb5-52 left promiscuous mode
Oct 14 10:02:23 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436143.3803] device (tapf0846fb5-52): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00108|binding|INFO|Releasing lport f0846fb5-52f8-482b-a868-67d618105942 from this chassis (sb_readonly=0)
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00109|binding|INFO|Setting lport f0846fb5-52f8-482b-a868-67d618105942 down in Southbound
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00110|binding|INFO|Releasing lport 9eae4210-d420-4e29-b9a5-561483f0559f from this chassis (sb_readonly=0)
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00111|binding|INFO|Setting lport 9eae4210-d420-4e29-b9a5-561483f0559f down in Southbound
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00112|binding|INFO|Removing iface tapf0846fb5-52 ovn-installed in OVS
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00113|binding|INFO|Releasing lport 342cf9cd-72d7-444c-a483-046dfbb42d73 from this chassis (sb_readonly=0)
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00114|binding|INFO|Releasing lport f5793126-1b86-4d21-9ae0-680fbb943ec2 from this chassis (sb_readonly=0)
Oct 14 10:02:23 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:23Z|00115|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.404 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:54:e8 19.80.0.225'], port_security=['fa:16:3e:eb:54:e8 19.80.0.225'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['f0846fb5-52f8-482b-a868-67d618105942'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-85842672', 'neutron:cidrs': '19.80.0.225/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-85842672', 'neutron:project_id': '8d76666f6fdc42e2bc8cc94f577392fc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b0443e-a40b-4b63-bb20-e38126b7a2fa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=9eae4210-d420-4e29-b9a5-561483f0559f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.405 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:62:d5 10.100.0.14'], port_security=['fa:16:3e:26:62:d5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-650561809', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e9b02497-1e5d-4944-95c9-d833637e1968', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e16ed7-f945-41a6-8272-927947e04a34', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-650561809', 'neutron:project_id': '8d76666f6fdc42e2bc8cc94f577392fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5707afc1-8cb1-498a-ae11-72c1640f3fbb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=f0846fb5-52f8-482b-a868-67d618105942) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.406 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 9eae4210-d420-4e29-b9a5-561483f0559f in datapath 5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf unbound from our chassis
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.408 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.409 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[686163f7-32be-4320-a373-5c5133f8fb75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.409 183328 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf namespace which is not needed anymore
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 3.454s CPU time.
Oct 14 10:02:23 np0005486759.ooo.test systemd-machined[93972]: Machine qemu-3-instance-00000003 terminated.
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test dnsmasq[314608]: exiting on receipt of SIGTERM
Oct 14 10:02:23 np0005486759.ooo.test podman[315758]: 2025-10-14 10:02:23.530990316 +0000 UTC m=+0.032680808 container kill 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: libpod-7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72.scope: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test podman[315784]: 2025-10-14 10:02:23.581352453 +0000 UTC m=+0.035691090 container died 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-23fac9c54707156ef47ec793c7cd376156d9fa611c2e8b67c3d66b88657e590a-merged.mount: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.614 2 INFO nova.virt.libvirt.driver [-] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Instance destroyed successfully.
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.615 2 DEBUG nova.objects.instance [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lazy-loading 'resources' on Instance uuid e9b02497-1e5d-4944-95c9-d833637e1968 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [NOTICE]   (315197) : haproxy version is 2.8.14-c23fe91
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [NOTICE]   (315197) : path to executable is /usr/sbin/haproxy
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [WARNING]  (315197) : Exiting Master process...
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [WARNING]  (315197) : Exiting Master process...
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [ALERT]    (315197) : Current worker (315203) exited with code 143 (Terminated)
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf[315187]: [WARNING]  (315197) : All workers exited. Exiting... (0)
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: libpod-2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214.scope: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test podman[315784]: 2025-10-14 10:02:23.628544903 +0000 UTC m=+0.082883510 container remove 7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f98d8b09-2b11-47b6-bc40-2162696497fe, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.630 2 DEBUG nova.virt.libvirt.vif [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-14T10:02:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-185379272',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005486759.ooo.test',hostname='tempest-liveautoblockmigrationv225test-server-185379272',id=3,image_ref='49de46ee-0386-41b3-98c9-c8f724edf634',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-14T10:02:20Z,launched_on='np0005486759.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005486759.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8d76666f6fdc42e2bc8cc94f577392fc',ramdisk_id='',reservation_id='r-08njpxn1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='49de46ee-0386-41b3-98c9-c8f724edf634',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-505072292',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-505072292-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-14T10:02:20Z,user_data=None,user_id='fd4e3504d2bf41249414f9a3763d5c72',uuid=e9b02497-1e5d-4944-95c9-d833637e1968,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.630 2 DEBUG nova.network.os_vif_util [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Converting VIF {"id": "f0846fb5-52f8-482b-a868-67d618105942", "address": "fa:16:3e:26:62:d5", "network": {"id": "09e16ed7-f945-41a6-8272-927947e04a34", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-644143256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d76666f6fdc42e2bc8cc94f577392fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0846fb5-52", "ovs_interfaceid": "f0846fb5-52f8-482b-a868-67d618105942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.631 2 DEBUG nova.network.os_vif_util [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:62:d5,bridge_name='br-int',has_traffic_filtering=True,id=f0846fb5-52f8-482b-a868-67d618105942,network=Network(09e16ed7-f945-41a6-8272-927947e04a34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf0846fb5-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.631 2 DEBUG os_vif [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:62:d5,bridge_name='br-int',has_traffic_filtering=True,id=f0846fb5-52f8-482b-a868-67d618105942,network=Network(09e16ed7-f945-41a6-8272-927947e04a34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf0846fb5-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.633 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0846fb5-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test podman[315771]: 2025-10-14 10:02:23.63764521 +0000 UTC m=+0.109955345 container died 2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.638 2 INFO os_vif [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:62:d5,bridge_name='br-int',has_traffic_filtering=True,id=f0846fb5-52f8-482b-a868-67d618105942,network=Network(09e16ed7-f945-41a6-8272-927947e04a34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapf0846fb5-52')
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.639 2 INFO nova.virt.libvirt.driver [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Deleting instance files /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968_del
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.640 2 INFO nova.virt.libvirt.driver [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Deletion of /var/lib/nova/instances/e9b02497-1e5d-4944-95c9-d833637e1968_del complete
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: libpod-conmon-7db82a65f6bc54eecb8c3014e113f9f5ab108e72204f9115ddc673ba2e413d72.scope: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test podman[315771]: 2025-10-14 10:02:23.668614155 +0000 UTC m=+0.140924290 container cleanup 2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:02:23 np0005486759.ooo.test podman[315845]: 2025-10-14 10:02:23.728715969 +0000 UTC m=+0.047898563 container remove 2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: libpod-conmon-2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214.scope: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.733 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[5a75e2c4-f49c-45d5-866f-2be4d2dbb726]: (4, ('Tue Oct 14 10:02:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf (2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214)\n2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214\nTue Oct 14 10:02:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf (2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214)\n2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.735 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[91673974-a0a0-40ea-9dbd-9d1e9017ffff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.735 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b1c3cdf-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test kernel: device tap5b1c3cdf-d0 left promiscuous mode
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.743 2 INFO nova.compute.manager [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.743 2 DEBUG oslo.service.loopingcall [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.744 2 DEBUG nova.compute.manager [-] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.744 2 DEBUG nova.network.neutron [-] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.747 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[0f11e576-d48d-4372-9f7b-5b570bf77457]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:23.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.766 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d74edb33-98a5-45c2-902c-ef7b00bda165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.767 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d37e7bd6-8370-4a89-a964-28ea82defe66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.779 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[2320e893-ce52-4619-b2d5-c4d0cb8c124b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1215227, 'reachable_time': 24362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315862, 'error': None, 'target': 'ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.781 183464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5b1c3cdf-d8c0-4d24-ba4c-9280a2b86abf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.781 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[03b9d38e-23cf-4812-9cec-6fb0ce500a13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.782 183328 INFO neutron.agent.ovn.metadata.agent [-] Port f0846fb5-52f8-482b-a868-67d618105942 in datapath 09e16ed7-f945-41a6-8272-927947e04a34 unbound from our chassis
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.785 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port ce0d57f5-ba8e-4f4b-b860-4ddff8dc52e0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.785 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09e16ed7-f945-41a6-8272-927947e04a34, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.786 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d46f3e81-8477-4ddd-9bcf-f9e096c1ca72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:23.786 183328 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34 namespace which is not needed anymore
Oct 14 10:02:23 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:23.840 287366 INFO neutron.agent.dhcp.agent [None req-52ce9ca5-ea78-4389-830c-0b57d1fc832e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [NOTICE]   (315374) : haproxy version is 2.8.14-c23fe91
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [NOTICE]   (315374) : path to executable is /usr/sbin/haproxy
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [WARNING]  (315374) : Exiting Master process...
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [WARNING]  (315374) : Exiting Master process...
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [ALERT]    (315374) : Current worker (315376) exited with code 143 (Terminated)
Oct 14 10:02:23 np0005486759.ooo.test neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34[315370]: [WARNING]  (315374) : All workers exited. Exiting... (0)
Oct 14 10:02:23 np0005486759.ooo.test systemd[1]: libpod-82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467.scope: Deactivated successfully.
Oct 14 10:02:23 np0005486759.ooo.test podman[315880]: 2025-10-14 10:02:23.911072512 +0000 UTC m=+0.051773121 container died 82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 10:02:24 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:24.000 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:24 np0005486759.ooo.test podman[315880]: 2025-10-14 10:02:24.008640838 +0000 UTC m=+0.149341467 container cleanup 82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:02:24 np0005486759.ooo.test podman[315894]: 2025-10-14 10:02:24.022709187 +0000 UTC m=+0.109571124 container cleanup 82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: libpod-conmon-82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467.scope: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test podman[315910]: 2025-10-14 10:02:24.093106165 +0000 UTC m=+0.063616702 container remove 82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.099 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[aed56d9f-dcca-4997-97ff-2fe65a04ff95]: (4, ('Tue Oct 14 10:02:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34 (82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467)\n82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467\nTue Oct 14 10:02:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34 (82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467)\n82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.104 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d9452b-0767-4df3-8686-ba1fec48170a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.106 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09e16ed7-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:24 np0005486759.ooo.test kernel: device tap09e16ed7-f0 left promiscuous mode
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.115 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[cc09f4f9-a82f-4a6b-894a-7fa3cc8aa402]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.136 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[0da87f83-941e-44fb-853f-53e2d880b823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.137 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[92c5c752-5af8-4921-b047-62dde576c58d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.151 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[66ef1738-4ef1-4da5-84c6-a04a1ee9023f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1215316, 'reachable_time': 18374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315927, 'error': None, 'target': 'ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.153 183464 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09e16ed7-f945-41a6-8272-927947e04a34 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 14 10:02:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:24.153 183464 DEBUG oslo.privsep.daemon [-] privsep: reply[597ed3a7-87af-4eb8-b0ac-6e5e87abbf1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:24 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:24.434 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:01:53Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec791d90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec791fa0>], id=f0846fb5-52f8-482b-a868-67d618105942, ip_allocation=immediate, mac_address=fa:16:3e:26:62:d5, name=tempest-parent-650561809, network_id=09e16ed7-f945-41a6-8272-927947e04a34, port_security_enabled=True, project_id=8d76666f6fdc42e2bc8cc94f577392fc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=6, security_groups=['1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec'], standard_attr_id=568, status=DOWN, tags=[], tenant_id=8d76666f6fdc42e2bc8cc94f577392fc, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec791c70>], trunk_id=31feb8c2-5f97-44fd-b688-06cf05b67451, updated_at=2025-10-14T10:02:24Z on network 09e16ed7-f945-41a6-8272-927947e04a34
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.459 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ebf51a5-b8fd-4dba-983e-8aee79793860', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.455344', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1e88314-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': '294e35486fe285c3519f646a4f101ad283445ee2fd400ed24492ad48062233bb'}]}, 'timestamp': '2025-10-14 10:02:24.461095', '_unique_id': '2045aef944d0419e999975452805e4e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.463 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.464 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.486 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.487 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87287057-9b47-42d1-b3ce-1e084bbe67e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.464871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1ec7d7a-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.660053233, 'message_signature': 'a752fa81f36bedbedd49988c2b2827bf5cbac3485c46a1f706b47002b8889ba6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.464871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1ec94b8-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.660053233, 'message_signature': 'f774ddaade58994449f0a80d3ca226d59f520f9a6b74a614ddc2a2f2c4472e5b'}]}, 'timestamp': '2025-10-14 10:02:24.487575', '_unique_id': '573c28a0fee441a0aa5fff470019749b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.489 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.490 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.490 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.491 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c6d4a9f-c07e-4a8f-96fe-73f65025294a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.490783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1ed25fe-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.660053233, 'message_signature': '9174b98d9470f75b7eb93ca2b8de378c82e942f00addac846e2ec2bd71684fe5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.490783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1ed36b6-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.660053233, 'message_signature': 'de19927937163b0c3075e3fe09f6d3cfe52981f13834f6e49d571f19d0708838'}]}, 'timestamp': '2025-10-14 10:02:24.491693', '_unique_id': 'fe84703f38124859a2589b51bcf5a498'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.492 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.494 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac3dd757-bdff-4178-ad9e-e5cf969a39c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.494147', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1eda8b2-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': '9a4a8005f2d7563322ae9bc9a654835bb5353eac88a18d36f04872910c17e23b'}]}, 'timestamp': '2025-10-14 10:02:24.494652', '_unique_id': 'e4d1a5bd47534c58b7dff43d7b568108'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.496 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.497 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14ebe78d-a45b-4195-84a1-0aa9daf16cea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.497046', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1ee1978-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': 'dc7b3397290ea1ee0ea1b361cf5df8dc2116bd0167f4ca764f0ccfc88f969e8e'}]}, 'timestamp': '2025-10-14 10:02:24.497553', '_unique_id': '6782052158d64f1c969e08d32604a52f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.498 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.499 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b2622d0773d2a8926ec3d8807055c6680071512ba37bf830e0582b71aa44fbc0-merged.mount: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-82a742f56e895688b8396eb137b83b1e74251e345ec13c6f2984f564d8a6c467-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: run-netns-ovnmeta\x2d09e16ed7\x2df945\x2d41a6\x2d8272\x2d927947e04a34.mount: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ea7dff615121a6c63a84d227b207a5c97352101829ffff1ae6ff64163a4cbd04-merged.mount: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e0d65c83c89ea153af3611d3c5dad20513bd07094cf89e8c9fb46089e8c5214-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: run-netns-ovnmeta\x2d5b1c3cdf\x2dd8c0\x2d4d24\x2dba4c\x2d9280a2b86abf.mount: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2df98d8b09\x2d2b11\x2d47b6\x2dbc40\x2d2162696497fe.mount: Deactivated successfully.
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.539 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 67767064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.540 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 492064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57a108f9-de7c-4a97-a210-bcc022927b96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67767064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.499946', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1f494d8-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': '2696029c2c30cae9abc058e0fab4a9eb602baa5e444d6f7926ad7b5a1302c0a5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.499946', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1f4ac8e-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': 'ec1f924134ce76b4e3eb466b26e7e70a3dcaaa697d92881281f26ef5ccbf72cf'}]}, 'timestamp': '2025-10-14 10:02:24.540614', '_unique_id': '361f0e8ae0294414a97ede1c7bdbad37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.543 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.544 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c694eade-0394-4342-b885-8267b477d0de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.543948', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1f54446-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': '7dafeb2bd1a7d25ef43519c68fda3b947e4fdf70ae3dbb0fae6933f6a0c0672f'}]}, 'timestamp': '2025-10-14 10:02:24.544510', '_unique_id': '183419c0cd9c4dfda2288874bf63b89b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.545 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.546 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.547 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31326208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.547 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5956b204-7213-4520-bec7-7256d8fe84d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31326208, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.546984', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1f5b804-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.660053233, 'message_signature': 'b322188d279268c64d9269d623365687116088dba3a0fa6f0e89f1536f45a579'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.546984', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1f5c830-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.660053233, 'message_signature': 'c08ba0542031713bfa9c21d6e9883330fa123a87df2f3fd119e08bf097e0ac38'}]}, 'timestamp': '2025-10-14 10:02:24.547843', '_unique_id': 'f6f827b4c8084217970973851343527a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.550 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.550 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.550 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 739626512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.550 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 60612298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f94ddc1-8b9b-4226-846c-d98c17b62ca8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739626512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.550357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1f63f86-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': 'cdec6299b22bc02d0db99917825bf762a43bc026ec9fcdbde611452f7273d56a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60612298, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.550357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1f650d4-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': '6a656364f08aade64be6437db6937d5e9ab55a6b67e26d15e4b5dcb1f8221527'}]}, 'timestamp': '2025-10-14 10:02:24.551344', '_unique_id': 'b809516ce38f4c1fb9819f6d01289c59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.553 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.553 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '477b2960-6d92-4e8f-91f2-ff4001fc0a52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.553732', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1f6c0aa-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': 'b60a2e52c5781119dba297ee7510d47340ebb4b4635207710e1ae969da8da14a'}]}, 'timestamp': '2025-10-14 10:02:24.554240', '_unique_id': '902c7139143d4120bd566578b2189cd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.555 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.556 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.556 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a832dd6-5176-4861-8bb6-7fe048015b0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.556455', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1f72996-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': '5f9b921d0ea942c0d0d9c096d0d1181e299fd7ede31e0802afbea2b81a9e2350'}]}, 'timestamp': '2025-10-14 10:02:24.556864', '_unique_id': 'ce79927d2ab549ff96e2fe67cb857d83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.558 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.558 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '956a9d1b-62bb-4349-bf13-8683c6a01c46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.558505', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1f776bc-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': '8b956daa71502171d6683b21da77de3d009f4fcf6de4dc09996ad660f9a235f5'}]}, 'timestamp': '2025-10-14 10:02:24.558824', '_unique_id': 'a1e9778972c44075a5ab82f9ebe1134d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.560 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.560 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.560 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.560 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56f2a56a-57f9-490a-a812-3a9c351865e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.560522', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1f7c5a4-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': '38f7d8bffac8cbe795d0173fe24016f91204ae09f20a759cbbd15ae234df5a53'}]}, 'timestamp': '2025-10-14 10:02:24.560850', '_unique_id': 'b906a3370e5c4cada62f5333f91126a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.562 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.579 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 11910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f57b25d-c4c3-4871-86d7-b1b9550fbc15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11910000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:02:24.562396', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e1fabdea-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.774607108, 'message_signature': '58347f6b1ebe9ddb787ca2f0f07f15586961400a5620859e9f2220743cb609e2'}]}, 'timestamp': '2025-10-14 10:02:24.580436', '_unique_id': '4863f319f95146a99493be566ba0023d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.581 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.583 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.583 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.583 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04c8a9de-d289-4e5c-89c3-103d217ed9a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.583532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1fb4a1c-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': 'c1c36c993930052570a1630284498547efe64e150b617ea153dbda36c1810083'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.583532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1fb5980-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': '057eccb20c43eef1976f8af8c1103c89b52daab75e86291d31d5ab38c08a7eb7'}]}, 'timestamp': '2025-10-14 10:02:24.584287', '_unique_id': 'b15fe92b63ae48c9ab73afda33345435'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.585 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.586 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c33ff78d-7082-42ae-bf32-b85cb64f24d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.585872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1fba6e2-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': '3ed494bc7cb5786ce898becd91f0964b6f649255f76e672730b703b1c7682df1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.585872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1fbb542-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': 'b2ac0f3ce36a134bf1913819a7dd2d8565b0a4a4e54c5fce6bc86d25462fb15e'}]}, 'timestamp': '2025-10-14 10:02:24.586624', '_unique_id': 'ee3e7cbf4a6b4bfd99a1dd6f994b5547'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.587 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.588 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 8191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bff08139-83ae-4d2e-9f3f-f480803245bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8191, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.588181', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1fc0d58-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': 'c513bef7127d8aa503bd113f385b4527d01de3775d8a8d18253b886515311afe'}]}, 'timestamp': '2025-10-14 10:02:24.588895', '_unique_id': '417d818da5ac40c9be42bbd607239d3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.590 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.590 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.590 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32ef5fd2-f321-4846-8b6f-e9358a6b2b5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.590504', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1fc59de-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': 'e01c1d6b5196857ed4fb81b5f7183fccf85b43845def6a205d6ed230dc8bd77f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.590504', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1fc64ec-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': 'fd874ced995207d3cb95db152f1ce9d08b909872ff575c6a7f587c99ca0779a7'}]}, 'timestamp': '2025-10-14 10:02:24.591112', '_unique_id': 'ffa311bfb4874811858280cfe5d5ba53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.591 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.592 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.592 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 51.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2488aa7-4304-4fbe-8944-782565405e07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.8203125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:02:24.592443', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e1fca43e-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.774607108, 'message_signature': '9c82a41b96753e3c6e8a6ae4de38436a2205eff39e33d6a6301c67f02eb019dc'}]}, 'timestamp': '2025-10-14 10:02:24.592705', '_unique_id': '2365b73e60ea4237b86e7e70098969dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.593 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7c579df-24f3-4d01-9b7d-7281a3c72199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:02:24.593732', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1fcd4cc-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': 'af450943a992ccf5d8a4bd381e962c52268c71993be5ac6e2b8d8cc88ae5eea5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:02:24.593732', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1fcdcba-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.695150744, 'message_signature': '6103d5e6e613ae52fe3dc6bc783e26bc392875f019bf7c088954b77c4f1f15f5'}]}, 'timestamp': '2025-10-14 10:02:24.594133', '_unique_id': '2b2bcaff659043eba57b424f6e74e968'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.594 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3f63c1b-7c2c-4c3c-a236-f39d771f48bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:02:24.595149', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'e1fd0c62-a8e4-11f0-b515-fa163eba5220', 'monotonic_time': 12164.65044213, 'message_signature': 'b19ab5dc2b372cdf8c265410d693e48dae73aff50419e67b2b4e5aa33ee147bc'}]}, 'timestamp': '2025-10-14 10:02:24.595366', '_unique_id': '243fdbf14c8f44b2a3c823c31b7072ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.595 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:02:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:02:24.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:02:24 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:24.628 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:24 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 2 addresses
Oct 14 10:02:24 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:02:24 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:02:24 np0005486759.ooo.test podman[315943]: 2025-10-14 10:02:24.655904964 +0000 UTC m=+0.057601759 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009)
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.672 2 DEBUG nova.network.neutron [-] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.690 2 INFO nova.compute.manager [-] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Took 0.95 seconds to deallocate network for instance.
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.742 2 DEBUG nova.compute.manager [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received event network-vif-unplugged-f0846fb5-52f8-482b-a868-67d618105942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.743 2 DEBUG oslo_concurrency.lockutils [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.744 2 DEBUG oslo_concurrency.lockutils [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.744 2 DEBUG oslo_concurrency.lockutils [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.744 2 DEBUG nova.compute.manager [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] No waiting events found dispatching network-vif-unplugged-f0846fb5-52f8-482b-a868-67d618105942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.745 2 DEBUG nova.compute.manager [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received event network-vif-unplugged-f0846fb5-52f8-482b-a868-67d618105942 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.745 2 DEBUG nova.compute.manager [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received event network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.746 2 DEBUG oslo_concurrency.lockutils [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Acquiring lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.746 2 DEBUG oslo_concurrency.lockutils [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.746 2 DEBUG oslo_concurrency.lockutils [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.747 2 DEBUG nova.compute.manager [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] No waiting events found dispatching network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.747 2 WARNING nova.compute.manager [req-ba3fc1bb-c39e-470a-be18-85b1696476c3 req-d61694c2-458b-47b1-bfbe-35930366e4bb f89c0ae8e16248e4bdc7378649662e4d bbb70b4d4f0f4594ba31143518520ca7 - - default default] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Received unexpected event network-vif-plugged-f0846fb5-52f8-482b-a868-67d618105942 for instance with vm_state active and task_state deleting.
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.768 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.768 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:24 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 6 addresses
Oct 14 10:02:24 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:24 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:24 np0005486759.ooo.test podman[315975]: 2025-10-14 10:02:24.784448374 +0000 UTC m=+0.065995004 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.847 2 DEBUG nova.compute.provider_tree [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.860 2 DEBUG nova.scheduler.client.report [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.888 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:24 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:24Z|00116|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:24 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:24.931 287366 INFO neutron.agent.dhcp.agent [None req-cc519eae-d4ee-4b78-b37e-d47bf06b146b - - - - - -] DHCP configuration for ports {'f0846fb5-52f8-482b-a868-67d618105942'} is completed
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.933 2 INFO nova.scheduler.client.report [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Deleted allocations for instance e9b02497-1e5d-4944-95c9-d833637e1968
Oct 14 10:02:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:24.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:25.013 2 DEBUG oslo_concurrency.lockutils [None req-c474e98b-246b-4df9-996b-86cb4ac8a926 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Lock "e9b02497-1e5d-4944-95c9-d833637e1968" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36423 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=4176765796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA28FC10000000001030307) 
Oct 14 10:02:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:27.152 2 INFO neutron.agent.securitygroups_rpc [None req-1461a21e-e2f4-4d39-a1b1-6056f8499d1f fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Security group member updated ['1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec']
Oct 14 10:02:27 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:27Z|00117|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:27 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 5 addresses
Oct 14 10:02:27 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:27 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:27 np0005486759.ooo.test podman[316033]: 2025-10-14 10:02:27.342423159 +0000 UTC m=+0.066048046 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:02:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:27.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:27 np0005486759.ooo.test dnsmasq[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/addn_hosts - 0 addresses
Oct 14 10:02:27 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/host
Oct 14 10:02:27 np0005486759.ooo.test dnsmasq-dhcp[315426]: read /var/lib/neutron/dhcp/0db638ae-88a6-4fa2-853d-a956040be6ba/opts
Oct 14 10:02:27 np0005486759.ooo.test podman[316045]: 2025-10-14 10:02:27.397045855 +0000 UTC m=+0.062375894 container kill c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 10:02:27 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:27Z|00118|binding|INFO|Releasing lport 1d74e03b-2f19-4709-af07-2da4cb4c3708 from this chassis (sb_readonly=0)
Oct 14 10:02:27 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:27Z|00119|binding|INFO|Setting lport 1d74e03b-2f19-4709-af07-2da4cb4c3708 down in Southbound
Oct 14 10:02:27 np0005486759.ooo.test kernel: device tap1d74e03b-2f left promiscuous mode
Oct 14 10:02:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:27.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:27 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:27.662 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-0db638ae-88a6-4fa2-853d-a956040be6ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0db638ae-88a6-4fa2-853d-a956040be6ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05152bc18068402cb671831d9378c8e3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c9f951a-7803-40da-9634-699425e89ff8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=1d74e03b-2f19-4709-af07-2da4cb4c3708) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:27 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:27.664 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 1d74e03b-2f19-4709-af07-2da4cb4c3708 in datapath 0db638ae-88a6-4fa2-853d-a956040be6ba unbound from our chassis
Oct 14 10:02:27 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:27.669 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0db638ae-88a6-4fa2-853d-a956040be6ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:27 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:27.672 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[869f810a-dd20-4ae6-b15f-fd626517cd59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:27.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:28.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:28 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:02:28.824 2 INFO neutron.agent.securitygroups_rpc [None req-f23548b9-f2b4-40fd-a698-dedc200dcfb8 fd4e3504d2bf41249414f9a3763d5c72 8d76666f6fdc42e2bc8cc94f577392fc - - default default] Security group member updated ['1542c2ce-d7be-41a6-8ba4-dbe55f3e87ec']
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 1 addresses
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:02:29 np0005486759.ooo.test podman[316096]: 2025-10-14 10:02:29.038720776 +0000 UTC m=+0.072876604 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:02:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:29.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:02:29 np0005486759.ooo.test podman[316111]: 2025-10-14 10:02:29.14835433 +0000 UTC m=+0.084955052 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:29Z|00120|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:29 np0005486759.ooo.test podman[316111]: 2025-10-14 10:02:29.232132236 +0000 UTC m=+0.168733028 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 10:02:29 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:02:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:29.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:29 np0005486759.ooo.test podman[316151]: 2025-10-14 10:02:29.279916264 +0000 UTC m=+0.060970251 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 4 addresses
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq[315426]: exiting on receipt of SIGTERM
Oct 14 10:02:29 np0005486759.ooo.test podman[316185]: 2025-10-14 10:02:29.667819368 +0000 UTC m=+0.065507260 container kill c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:02:29 np0005486759.ooo.test systemd[1]: libpod-c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4.scope: Deactivated successfully.
Oct 14 10:02:29 np0005486759.ooo.test podman[316199]: 2025-10-14 10:02:29.742776503 +0000 UTC m=+0.058083372 container died c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:29 np0005486759.ooo.test podman[316199]: 2025-10-14 10:02:29.772948384 +0000 UTC m=+0.088255213 container cleanup c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:02:29 np0005486759.ooo.test systemd[1]: libpod-conmon-c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4.scope: Deactivated successfully.
Oct 14 10:02:29 np0005486759.ooo.test podman[316200]: 2025-10-14 10:02:29.814893384 +0000 UTC m=+0.125165030 container remove c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db638ae-88a6-4fa2-853d-a956040be6ba, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:02:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:29.846 287366 INFO neutron.agent.dhcp.agent [None req-3157baa9-9ccf-4d7b-a3f9-4ef20ce74f63 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:29.847 287366 INFO neutron.agent.dhcp.agent [None req-3157baa9-9ccf-4d7b-a3f9-4ef20ce74f63 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/addn_hosts - 0 addresses
Oct 14 10:02:29 np0005486759.ooo.test podman[316243]: 2025-10-14 10:02:29.960338091 +0000 UTC m=+0.055093872 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/host
Oct 14 10:02:29 np0005486759.ooo.test dnsmasq-dhcp[313665]: read /var/lib/neutron/dhcp/09e16ed7-f945-41a6-8272-927947e04a34/opts
Oct 14 10:02:30 np0005486759.ooo.test systemd[1]: tmp-crun.tfOvYi.mount: Deactivated successfully.
Oct 14 10:02:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d630102e5d6d1e2df7d0096f3bb820e6e664af971634dce70b0c61f474c7d83-merged.mount: Deactivated successfully.
Oct 14 10:02:30 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c863bbaca987fba79ee8a0a5101ff48288612d01e1e99476f10eb21875f3e9e4-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:30 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d0db638ae\x2d88a6\x2d4fa2\x2d853d\x2da956040be6ba.mount: Deactivated successfully.
Oct 14 10:02:30 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:30Z|00121|binding|INFO|Releasing lport 768c8abc-dd70-4c99-aa8b-a56fb54ba2e7 from this chassis (sb_readonly=0)
Oct 14 10:02:30 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:30Z|00122|binding|INFO|Setting lport 768c8abc-dd70-4c99-aa8b-a56fb54ba2e7 down in Southbound
Oct 14 10:02:30 np0005486759.ooo.test kernel: device tap768c8abc-dd left promiscuous mode
Oct 14 10:02:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:30.208 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-09e16ed7-f945-41a6-8272-927947e04a34', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e16ed7-f945-41a6-8272-927947e04a34', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d76666f6fdc42e2bc8cc94f577392fc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5707afc1-8cb1-498a-ae11-72c1640f3fbb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=768c8abc-dd70-4c99-aa8b-a56fb54ba2e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:30.210 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 768c8abc-dd70-4c99-aa8b-a56fb54ba2e7 in datapath 09e16ed7-f945-41a6-8272-927947e04a34 unbound from our chassis
Oct 14 10:02:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:30.214 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09e16ed7-f945-41a6-8272-927947e04a34, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:30.217 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[c35c76b3-340e-4a74-9958-493ab67179c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:30.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:02:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:32.356 287366 INFO neutron.agent.linux.ip_lib [None req-3557268e-8dce-436f-af66-a6ea200e7489 - - - - - -] Device tap40d0d6b3-7f cannot be used as it has no MAC address
Oct 14 10:02:32 np0005486759.ooo.test podman[316268]: 2025-10-14 10:02:32.377427547 +0000 UTC m=+0.071819202 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 10:02:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:32.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:32 np0005486759.ooo.test kernel: device tap40d0d6b3-7f entered promiscuous mode
Oct 14 10:02:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:32.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436152.3883] manager: (tap40d0d6b3-7f): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Oct 14 10:02:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:32Z|00123|binding|INFO|Claiming lport 40d0d6b3-7f24-4531-af2b-3710504d2d66 for this chassis.
Oct 14 10:02:32 np0005486759.ooo.test podman[316268]: 2025-10-14 10:02:32.38832163 +0000 UTC m=+0.082713295 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:02:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:32Z|00124|binding|INFO|40d0d6b3-7f24-4531-af2b-3710504d2d66: Claiming unknown
Oct 14 10:02:32 np0005486759.ooo.test systemd-udevd[316299]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:02:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:32.393 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-e904e8e1-141e-48c6-bb35-43a6d593c079', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e904e8e1-141e-48c6-bb35-43a6d593c079', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9be440187a9a42389ec92437f728daf4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0866ae5b-3191-4bc9-b9bf-6700611ec46f, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=40d0d6b3-7f24-4531-af2b-3710504d2d66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:32.396 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 40d0d6b3-7f24-4531-af2b-3710504d2d66 in datapath e904e8e1-141e-48c6-bb35-43a6d593c079 bound to our chassis
Oct 14 10:02:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:32.398 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e904e8e1-141e-48c6-bb35-43a6d593c079 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:02:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:32.400 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[aa336f3a-dbdb-4487-b86b-d35d62ec5ab2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:32 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:02:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:32Z|00125|binding|INFO|Setting lport 40d0d6b3-7f24-4531-af2b-3710504d2d66 ovn-installed in OVS
Oct 14 10:02:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:32Z|00126|binding|INFO|Setting lport 40d0d6b3-7f24-4531-af2b-3710504d2d66 up in Southbound
Oct 14 10:02:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:32.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:32.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:32.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:32.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:32Z|00127|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:32 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:02:32 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:32 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:32 np0005486759.ooo.test podman[316328]: 2025-10-14 10:02:32.571331612 +0000 UTC m=+0.053189113 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 14 10:02:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:32.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:33 np0005486759.ooo.test podman[316396]: 2025-10-14 10:02:33.219839166 +0000 UTC m=+0.054315028 container kill 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq[313665]: exiting on receipt of SIGTERM
Oct 14 10:02:33 np0005486759.ooo.test systemd[1]: libpod-318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4.scope: Deactivated successfully.
Oct 14 10:02:33 np0005486759.ooo.test podman[316417]: 
Oct 14 10:02:33 np0005486759.ooo.test podman[316430]: 2025-10-14 10:02:33.282614521 +0000 UTC m=+0.048821480 container died 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 10:02:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:33 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f1553636b6c66d7c9bcf4d828e312c70ff1c925d001df03c50b938a645a3188e-merged.mount: Deactivated successfully.
Oct 14 10:02:33 np0005486759.ooo.test podman[316430]: 2025-10-14 10:02:33.317073142 +0000 UTC m=+0.083280091 container remove 318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09e16ed7-f945-41a6-8272-927947e04a34, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:02:33 np0005486759.ooo.test systemd[1]: libpod-conmon-318e245a697708bb48ab21453265a2137063dbf254b190dbbb7907f3b36c95b4.scope: Deactivated successfully.
Oct 14 10:02:33 np0005486759.ooo.test podman[316417]: 2025-10-14 10:02:33.334096731 +0000 UTC m=+0.118229518 container create 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:02:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:33.338 287366 INFO neutron.agent.dhcp.agent [None req-37d7bbcc-fc57-4eee-8827-602edf1e1992 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:33 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d09e16ed7\x2df945\x2d41a6\x2d8272\x2d927947e04a34.mount: Deactivated successfully.
Oct 14 10:02:33 np0005486759.ooo.test podman[316417]: 2025-10-14 10:02:33.241673472 +0000 UTC m=+0.025806289 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:02:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:33.367 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:33 np0005486759.ooo.test systemd[1]: Started libpod-conmon-30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135.scope.
Oct 14 10:02:33 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:02:33 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90a10ec95902109bf95f43da7c8df5fad4241e7c6134017679d832ae7d30b839/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:02:33 np0005486759.ooo.test podman[316417]: 2025-10-14 10:02:33.393988499 +0000 UTC m=+0.178121296 container init 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009)
Oct 14 10:02:33 np0005486759.ooo.test podman[316417]: 2025-10-14 10:02:33.399249829 +0000 UTC m=+0.183382606 container start 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS)
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq[316466]: started, version 2.85 cachesize 150
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq[316466]: DNS service limited to local subnets
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq[316466]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq[316466]: warning: no upstream servers configured
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq-dhcp[316466]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/addn_hosts - 0 addresses
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/host
Oct 14 10:02:33 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/opts
Oct 14 10:02:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:33.530 287366 INFO neutron.agent.dhcp.agent [None req-a58114f5-ce54-449f-8dff-75e96dd6ba71 - - - - - -] DHCP configuration for ports {'84e26916-b6bd-4eb1-8d14-bbd81165f5fa'} is completed
Oct 14 10:02:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:33.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:34.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:34.405 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:34.761 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:34Z, description=, device_id=10f7f20f-57ef-47e1-ad30-3a74f19db5a8, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7aec70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7ae700>], id=423ac5cf-010e-4e25-a5d9-430b49955e08, ip_allocation=immediate, mac_address=fa:16:3e:71:6e:82, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=735, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:02:34Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:02:34 np0005486759.ooo.test podman[316482]: 2025-10-14 10:02:34.946968284 +0000 UTC m=+0.040776824 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:34 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 4 addresses
Oct 14 10:02:34 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:34 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:35.191 287366 INFO neutron.agent.dhcp.agent [None req-a5ddcc01-6d13-4196-a811-81999f12be67 - - - - - -] DHCP configuration for ports {'423ac5cf-010e-4e25-a5d9-430b49955e08'} is completed
Oct 14 10:02:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:36.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:36.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:36.822 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:36 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:36.824 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:02:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:38.078 287366 INFO neutron.agent.linux.ip_lib [None req-38f14d4b-7984-4c70-8291-f9648dfdc72e - - - - - -] Device tap2d4dc890-a9 cannot be used as it has no MAC address
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:38 np0005486759.ooo.test kernel: device tap2d4dc890-a9 entered promiscuous mode
Oct 14 10:02:38 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436158.1056] manager: (tap2d4dc890-a9): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:38Z|00128|binding|INFO|Claiming lport 2d4dc890-a9a1-4d04-820c-c7ec984fcb76 for this chassis.
Oct 14 10:02:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:38Z|00129|binding|INFO|2d4dc890-a9a1-4d04-820c-c7ec984fcb76: Claiming unknown
Oct 14 10:02:38 np0005486759.ooo.test systemd-udevd[316511]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:02:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:38.121 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-c116b547-83f0-4ca0-91ff-47c6adf5d25b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c116b547-83f0-4ca0-91ff-47c6adf5d25b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba32016074d74170a21724e616d43009', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d1a08a-bb43-4d43-958f-8b5363a9cecb, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=2d4dc890-a9a1-4d04-820c-c7ec984fcb76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:38.122 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 2d4dc890-a9a1-4d04-820c-c7ec984fcb76 in datapath c116b547-83f0-4ca0-91ff-47c6adf5d25b bound to our chassis
Oct 14 10:02:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:38.126 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port 91fa667e-66cf-4a2b-bd64-7b2a606426fc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:02:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:38.126 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c116b547-83f0-4ca0-91ff-47c6adf5d25b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:38.127 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[69339a87-5bc8-4445-a037-67a1f822928f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:38Z|00130|binding|INFO|Setting lport 2d4dc890-a9a1-4d04-820c-c7ec984fcb76 ovn-installed in OVS
Oct 14 10:02:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:38Z|00131|binding|INFO|Setting lport 2d4dc890-a9a1-4d04-820c-c7ec984fcb76 up in Southbound
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap2d4dc890-a9: No such device
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:38.251 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:37Z, description=, device_id=10f7f20f-57ef-47e1-ad30-3a74f19db5a8, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec9041c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec904dc0>], id=81758096-be7a-44f0-a74e-3d796e88fb4c, ip_allocation=immediate, mac_address=fa:16:3e:94:83:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:02:30Z, description=, dns_domain=, id=e904e8e1-141e-48c6-bb35-43a6d593c079, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-829310277-network, port_security_enabled=True, project_id=9be440187a9a42389ec92437f728daf4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=704, status=ACTIVE, subnets=['de3d28a0-09db-4a81-b37f-75240b618d12'], tags=[], tenant_id=9be440187a9a42389ec92437f728daf4, updated_at=2025-10-14T10:02:31Z, vlan_transparent=None, network_id=e904e8e1-141e-48c6-bb35-43a6d593c079, port_security_enabled=False, project_id=9be440187a9a42389ec92437f728daf4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=751, status=DOWN, tags=[], tenant_id=9be440187a9a42389ec92437f728daf4, updated_at=2025-10-14T10:02:37Z on network e904e8e1-141e-48c6-bb35-43a6d593c079
Oct 14 10:02:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:38.432 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:37Z, description=, device_id=26294aa8-6699-41cc-affa-08cbbfd948c0, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7de250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec763220>], id=e77ced70-4c0e-4078-b1f8-f412b49f2e80, ip_allocation=immediate, mac_address=fa:16:3e:c6:0f:f7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=757, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:02:38Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:02:38 np0005486759.ooo.test systemd[1]: tmp-crun.vIOgPD.mount: Deactivated successfully.
Oct 14 10:02:38 np0005486759.ooo.test dnsmasq[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/addn_hosts - 1 addresses
Oct 14 10:02:38 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/host
Oct 14 10:02:38 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/opts
Oct 14 10:02:38 np0005486759.ooo.test podman[316565]: 2025-10-14 10:02:38.451723811 +0000 UTC m=+0.057785243 container kill 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.613 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760436143.6128535, e9b02497-1e5d-4944-95c9-d833637e1968 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.614 2 INFO nova.compute.manager [-] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] VM Stopped (Lifecycle Event)
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.636 2 DEBUG nova.compute.manager [None req-5064996c-dda9-4bec-8797-6c5cfcc7bd04 - - - - - -] [instance: e9b02497-1e5d-4944-95c9-d833637e1968] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 14 10:02:38 np0005486759.ooo.test podman[316610]: 2025-10-14 10:02:38.64836946 +0000 UTC m=+0.058326969 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:02:38 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 5 addresses
Oct 14 10:02:38 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:38 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:38.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:38.757 287366 INFO neutron.agent.dhcp.agent [None req-60f14006-9de1-48f1-9440-e6bf0c2e1262 - - - - - -] DHCP configuration for ports {'81758096-be7a-44f0-a74e-3d796e88fb4c'} is completed
Oct 14 10:02:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:38.898 287366 INFO neutron.agent.dhcp.agent [None req-35ef09b0-9753-455d-83a0-f6a442120e7e - - - - - -] DHCP configuration for ports {'e77ced70-4c0e-4078-b1f8-f412b49f2e80'} is completed
Oct 14 10:02:38 np0005486759.ooo.test podman[316659]: 
Oct 14 10:02:39 np0005486759.ooo.test podman[316659]: 2025-10-14 10:02:39.000150002 +0000 UTC m=+0.069440249 container create 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:02:39 np0005486759.ooo.test systemd[1]: tmp-crun.bf6JoM.mount: Deactivated successfully.
Oct 14 10:02:39 np0005486759.ooo.test systemd[1]: Started libpod-conmon-96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e.scope.
Oct 14 10:02:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:39.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:39 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:02:39 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b443bfb0c5d816b6cf40f58e13da81889232c44ab0921ccb87e20ab228657f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:02:39 np0005486759.ooo.test podman[316659]: 2025-10-14 10:02:39.061890025 +0000 UTC m=+0.131180252 container init 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009)
Oct 14 10:02:39 np0005486759.ooo.test podman[316659]: 2025-10-14 10:02:38.964916747 +0000 UTC m=+0.034206994 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:02:39 np0005486759.ooo.test podman[316659]: 2025-10-14 10:02:39.070032594 +0000 UTC m=+0.139322811 container start 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq[316677]: started, version 2.85 cachesize 150
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq[316677]: DNS service limited to local subnets
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq[316677]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq[316677]: warning: no upstream servers configured
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq-dhcp[316677]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/addn_hosts - 0 addresses
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/host
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/opts
Oct 14 10:02:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:39.272 287366 INFO neutron.agent.dhcp.agent [None req-2711ea83-4261-43f1-adb6-2949a053cd9f - - - - - -] DHCP configuration for ports {'cc33b671-f6c2-49a2-b81d-e7c0ddf05965'} is completed
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 4 addresses
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:39 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:39 np0005486759.ooo.test podman[316695]: 2025-10-14 10:02:39.440024431 +0000 UTC m=+0.049429649 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:39Z|00132|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:39.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:40 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:40.507 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:37Z, description=, device_id=10f7f20f-57ef-47e1-ad30-3a74f19db5a8, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed0925e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec81a5e0>], id=81758096-be7a-44f0-a74e-3d796e88fb4c, ip_allocation=immediate, mac_address=fa:16:3e:94:83:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:02:30Z, description=, dns_domain=, id=e904e8e1-141e-48c6-bb35-43a6d593c079, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-829310277-network, port_security_enabled=True, project_id=9be440187a9a42389ec92437f728daf4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=704, status=ACTIVE, subnets=['de3d28a0-09db-4a81-b37f-75240b618d12'], tags=[], tenant_id=9be440187a9a42389ec92437f728daf4, updated_at=2025-10-14T10:02:31Z, vlan_transparent=None, network_id=e904e8e1-141e-48c6-bb35-43a6d593c079, port_security_enabled=False, project_id=9be440187a9a42389ec92437f728daf4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=751, status=DOWN, tags=[], tenant_id=9be440187a9a42389ec92437f728daf4, updated_at=2025-10-14T10:02:37Z on network e904e8e1-141e-48c6-bb35-43a6d593c079
Oct 14 10:02:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:40.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:40 np0005486759.ooo.test podman[316733]: 2025-10-14 10:02:40.697200073 +0000 UTC m=+0.052249925 container kill 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:40 np0005486759.ooo.test dnsmasq[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/addn_hosts - 1 addresses
Oct 14 10:02:40 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/host
Oct 14 10:02:40 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/opts
Oct 14 10:02:40 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:40.923 287366 INFO neutron.agent.dhcp.agent [None req-8cf16fe4-a347-43a6-9aa5-3d58d7a70d21 - - - - - -] DHCP configuration for ports {'81758096-be7a-44f0-a74e-3d796e88fb4c'} is completed
Oct 14 10:02:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:41.827 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:02:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:02:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:02:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:02:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 132498 "" "Go-http-client/1.1"
Oct 14 10:02:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:02:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17157 "" "Go-http-client/1.1"
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:02:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:42.387 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:41Z, description=, device_id=26294aa8-6699-41cc-affa-08cbbfd948c0, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec81a880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec81a4c0>], id=2fb4566d-3bd0-4bd7-b3a9-478aaf7c2171, ip_allocation=immediate, mac_address=fa:16:3e:ec:86:ce, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:02:34Z, description=, dns_domain=, id=c116b547-83f0-4ca0-91ff-47c6adf5d25b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-650258120-network, port_security_enabled=True, project_id=ba32016074d74170a21724e616d43009, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42583, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=739, status=ACTIVE, subnets=['bcf0342a-f7cc-4d3d-91a1-425e8472079b'], tags=[], tenant_id=ba32016074d74170a21724e616d43009, updated_at=2025-10-14T10:02:36Z, vlan_transparent=None, network_id=c116b547-83f0-4ca0-91ff-47c6adf5d25b, port_security_enabled=False, project_id=ba32016074d74170a21724e616d43009, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=782, status=DOWN, tags=[], tenant_id=ba32016074d74170a21724e616d43009, updated_at=2025-10-14T10:02:42Z on network c116b547-83f0-4ca0-91ff-47c6adf5d25b
Oct 14 10:02:42 np0005486759.ooo.test podman[316757]: 2025-10-14 10:02:42.433846852 +0000 UTC m=+0.063187459 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 10:02:42 np0005486759.ooo.test podman[316757]: 2025-10-14 10:02:42.44432027 +0000 UTC m=+0.073660857 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:02:42 np0005486759.ooo.test podman[316755]: 2025-10-14 10:02:42.477110691 +0000 UTC m=+0.110488131 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 10:02:42 np0005486759.ooo.test podman[316755]: 2025-10-14 10:02:42.486407144 +0000 UTC m=+0.119784584 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0)
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:02:42 np0005486759.ooo.test podman[316754]: 2025-10-14 10:02:42.530603893 +0000 UTC m=+0.164330974 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:02:42 np0005486759.ooo.test dnsmasq[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/addn_hosts - 1 addresses
Oct 14 10:02:42 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/host
Oct 14 10:02:42 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/opts
Oct 14 10:02:42 np0005486759.ooo.test podman[316840]: 2025-10-14 10:02:42.600947239 +0000 UTC m=+0.043573920 container kill 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:02:42 np0005486759.ooo.test podman[316754]: 2025-10-14 10:02:42.613848853 +0000 UTC m=+0.247575934 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:02:42 np0005486759.ooo.test podman[316756]: 2025-10-14 10:02:42.59080321 +0000 UTC m=+0.221726175 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, container_name=multipathd, io.buildah.version=1.41.3)
Oct 14 10:02:42 np0005486759.ooo.test podman[316756]: 2025-10-14 10:02:42.673381719 +0000 UTC m=+0.304304674 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 10:02:42 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:02:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:42.782 287366 INFO neutron.agent.dhcp.agent [None req-28e76079-f6fa-4849-a93a-a7337f345276 - - - - - -] DHCP configuration for ports {'2fb4566d-3bd0-4bd7-b3a9-478aaf7c2171'} is completed
Oct 14 10:02:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:43.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:02:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:02:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:02:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:02:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:02:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:02:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:02:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:02:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:44.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:44.086 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:41Z, description=, device_id=26294aa8-6699-41cc-affa-08cbbfd948c0, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec736dc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec736e80>], id=2fb4566d-3bd0-4bd7-b3a9-478aaf7c2171, ip_allocation=immediate, mac_address=fa:16:3e:ec:86:ce, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:02:34Z, description=, dns_domain=, id=c116b547-83f0-4ca0-91ff-47c6adf5d25b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-650258120-network, port_security_enabled=True, project_id=ba32016074d74170a21724e616d43009, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42583, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=739, status=ACTIVE, subnets=['bcf0342a-f7cc-4d3d-91a1-425e8472079b'], tags=[], tenant_id=ba32016074d74170a21724e616d43009, updated_at=2025-10-14T10:02:36Z, vlan_transparent=None, network_id=c116b547-83f0-4ca0-91ff-47c6adf5d25b, port_security_enabled=False, project_id=ba32016074d74170a21724e616d43009, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=782, status=DOWN, tags=[], tenant_id=ba32016074d74170a21724e616d43009, updated_at=2025-10-14T10:02:42Z on network c116b547-83f0-4ca0-91ff-47c6adf5d25b
Oct 14 10:02:44 np0005486759.ooo.test dnsmasq[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/addn_hosts - 1 addresses
Oct 14 10:02:44 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/host
Oct 14 10:02:44 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/opts
Oct 14 10:02:44 np0005486759.ooo.test podman[316884]: 2025-10-14 10:02:44.280015749 +0000 UTC m=+0.054584157 container kill 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:44.583 287366 INFO neutron.agent.dhcp.agent [None req-03ff2d52-5648-4be4-84cf-fdc0a579ba37 - - - - - -] DHCP configuration for ports {'2fb4566d-3bd0-4bd7-b3a9-478aaf7c2171'} is completed
Oct 14 10:02:44 np0005486759.ooo.test systemd[1]: tmp-crun.o8ZyKU.mount: Deactivated successfully.
Oct 14 10:02:44 np0005486759.ooo.test dnsmasq[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/addn_hosts - 0 addresses
Oct 14 10:02:44 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/host
Oct 14 10:02:44 np0005486759.ooo.test dnsmasq-dhcp[316466]: read /var/lib/neutron/dhcp/e904e8e1-141e-48c6-bb35-43a6d593c079/opts
Oct 14 10:02:44 np0005486759.ooo.test podman[316921]: 2025-10-14 10:02:44.786505801 +0000 UTC m=+0.063296011 container kill 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:02:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:44.905 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:02:44Z, description=, device_id=31dd9675-0953-4199-9d60-966771bc94e6, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77cbe0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec77cbb0>], id=c2855853-81bd-4ce7-8778-1b95759e0986, ip_allocation=immediate, mac_address=fa:16:3e:26:fe:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=790, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:02:44Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:02:44 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:44Z|00133|binding|INFO|Releasing lport 40d0d6b3-7f24-4531-af2b-3710504d2d66 from this chassis (sb_readonly=0)
Oct 14 10:02:44 np0005486759.ooo.test kernel: device tap40d0d6b3-7f left promiscuous mode
Oct 14 10:02:44 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:44Z|00134|binding|INFO|Setting lport 40d0d6b3-7f24-4531-af2b-3710504d2d66 down in Southbound
Oct 14 10:02:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:44.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:44.967 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-e904e8e1-141e-48c6-bb35-43a6d593c079', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e904e8e1-141e-48c6-bb35-43a6d593c079', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9be440187a9a42389ec92437f728daf4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0866ae5b-3191-4bc9-b9bf-6700611ec46f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=40d0d6b3-7f24-4531-af2b-3710504d2d66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:44.969 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 40d0d6b3-7f24-4531-af2b-3710504d2d66 in datapath e904e8e1-141e-48c6-bb35-43a6d593c079 unbound from our chassis
Oct 14 10:02:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:44.972 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e904e8e1-141e-48c6-bb35-43a6d593c079, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:44.974 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[889edac6-623f-4457-a4c4-5e10c28d07d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:44.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:44.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:45 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 5 addresses
Oct 14 10:02:45 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:45 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:45 np0005486759.ooo.test podman[316961]: 2025-10-14 10:02:45.101371257 +0000 UTC m=+0.045358734 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:02:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:45.325 287366 INFO neutron.agent.dhcp.agent [None req-9fd6f5ae-d79c-42c9-b1c5-555d4dc85419 - - - - - -] DHCP configuration for ports {'c2855853-81bd-4ce7-8778-1b95759e0986'} is completed
Oct 14 10:02:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:45.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:46 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:46Z|00135|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:46.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:46 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 4 addresses
Oct 14 10:02:46 np0005486759.ooo.test podman[316999]: 2025-10-14 10:02:46.294410873 +0000 UTC m=+0.053891295 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:02:46 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:46 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:46 np0005486759.ooo.test dnsmasq[316466]: exiting on receipt of SIGTERM
Oct 14 10:02:46 np0005486759.ooo.test podman[317038]: 2025-10-14 10:02:46.714207929 +0000 UTC m=+0.110133941 container kill 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:02:46 np0005486759.ooo.test systemd[1]: libpod-30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135.scope: Deactivated successfully.
Oct 14 10:02:46 np0005486759.ooo.test podman[317052]: 2025-10-14 10:02:46.781304806 +0000 UTC m=+0.047060557 container died 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:02:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:46 np0005486759.ooo.test podman[317052]: 2025-10-14 10:02:46.828541616 +0000 UTC m=+0.094297327 container remove 30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e904e8e1-141e-48c6-bb35-43a6d593c079, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 10:02:46 np0005486759.ooo.test systemd[1]: libpod-conmon-30837a873386f6ca4974380ab51874a7c9a0dccfc45126aa5688050e60062135.scope: Deactivated successfully.
Oct 14 10:02:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:46.860 287366 INFO neutron.agent.dhcp.agent [None req-72e31220-b173-47e7-8418-eba75995a024 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:46.860 287366 INFO neutron.agent.dhcp.agent [None req-72e31220-b173-47e7-8418-eba75995a024 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:47 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-90a10ec95902109bf95f43da7c8df5fad4241e7c6134017679d832ae7d30b839-merged.mount: Deactivated successfully.
Oct 14 10:02:47 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2de904e8e1\x2d141e\x2d48c6\x2dbb35\x2d43a6d593c079.mount: Deactivated successfully.
Oct 14 10:02:48 np0005486759.ooo.test dnsmasq[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/addn_hosts - 0 addresses
Oct 14 10:02:48 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/host
Oct 14 10:02:48 np0005486759.ooo.test podman[317093]: 2025-10-14 10:02:48.552186639 +0000 UTC m=+0.060809356 container kill 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:02:48 np0005486759.ooo.test dnsmasq-dhcp[316677]: read /var/lib/neutron/dhcp/c116b547-83f0-4ca0-91ff-47c6adf5d25b/opts
Oct 14 10:02:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:48.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:48Z|00136|binding|INFO|Releasing lport 2d4dc890-a9a1-4d04-820c-c7ec984fcb76 from this chassis (sb_readonly=0)
Oct 14 10:02:48 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:48Z|00137|binding|INFO|Setting lport 2d4dc890-a9a1-4d04-820c-c7ec984fcb76 down in Southbound
Oct 14 10:02:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:48.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:48 np0005486759.ooo.test kernel: device tap2d4dc890-a9 left promiscuous mode
Oct 14 10:02:48 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:48.729 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-c116b547-83f0-4ca0-91ff-47c6adf5d25b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c116b547-83f0-4ca0-91ff-47c6adf5d25b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba32016074d74170a21724e616d43009', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d1a08a-bb43-4d43-958f-8b5363a9cecb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=2d4dc890-a9a1-4d04-820c-c7ec984fcb76) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:02:48 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:48.731 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 2d4dc890-a9a1-4d04-820c-c7ec984fcb76 in datapath c116b547-83f0-4ca0-91ff-47c6adf5d25b unbound from our chassis
Oct 14 10:02:48 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:48.734 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c116b547-83f0-4ca0-91ff-47c6adf5d25b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:02:48 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:48.735 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd05926-68ed-4f05-98d1-3e1fa7a349e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:02:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:48.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:49 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:49.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63355 DF PROTO=TCP SPT=42162 DPT=9102 SEQ=4228536045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA2E9170000000001030307) 
Oct 14 10:02:50 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:02:50 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:50 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:50 np0005486759.ooo.test podman[317132]: 2025-10-14 10:02:50.092924581 +0000 UTC m=+0.059484437 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 10:02:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:02:50 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:02:50 np0005486759.ooo.test podman[317146]: 2025-10-14 10:02:50.209202448 +0000 UTC m=+0.084742436 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct 14 10:02:50 np0005486759.ooo.test podman[317147]: 2025-10-14 10:02:50.232137777 +0000 UTC m=+0.105722965 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public)
Oct 14 10:02:50 np0005486759.ooo.test podman[317147]: 2025-10-14 10:02:50.240827652 +0000 UTC m=+0.114412760 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64)
Oct 14 10:02:50 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:02:50 np0005486759.ooo.test podman[317146]: 2025-10-14 10:02:50.291419496 +0000 UTC m=+0.166959494 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Oct 14 10:02:50 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:02:50 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:50Z|00138|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:50.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63356 DF PROTO=TCP SPT=42162 DPT=9102 SEQ=4228536045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA2ED010000000001030307) 
Oct 14 10:02:50 np0005486759.ooo.test dnsmasq[316677]: exiting on receipt of SIGTERM
Oct 14 10:02:50 np0005486759.ooo.test podman[317212]: 2025-10-14 10:02:50.770852722 +0000 UTC m=+0.054908456 container kill 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:02:50 np0005486759.ooo.test systemd[1]: libpod-96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e.scope: Deactivated successfully.
Oct 14 10:02:50 np0005486759.ooo.test podman[317223]: 2025-10-14 10:02:50.81736276 +0000 UTC m=+0.039851676 container died 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:02:50 np0005486759.ooo.test podman[317223]: 2025-10-14 10:02:50.898242618 +0000 UTC m=+0.120731494 container cleanup 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:02:50 np0005486759.ooo.test systemd[1]: libpod-conmon-96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e.scope: Deactivated successfully.
Oct 14 10:02:50 np0005486759.ooo.test podman[317230]: 2025-10-14 10:02:50.919526877 +0000 UTC m=+0.134759991 container remove 96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c116b547-83f0-4ca0-91ff-47c6adf5d25b, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:02:50 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:50.939 287366 INFO neutron.agent.dhcp.agent [None req-b05ddebb-acc2-4eb6-9bbe-b22e4963256e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-7b443bfb0c5d816b6cf40f58e13da81889232c44ab0921ccb87e20ab228657f8-merged.mount: Deactivated successfully.
Oct 14 10:02:51 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96e9f66e35b43d50d1987973ff0c1be85ea9d5628bd37e159bd690dd254d979e-userdata-shm.mount: Deactivated successfully.
Oct 14 10:02:51 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2dc116b547\x2d83f0\x2d4ca0\x2d91ff\x2d47c6adf5d25b.mount: Deactivated successfully.
Oct 14 10:02:51 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:02:51.097 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:02:52 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:02:52 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:02:52 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:02:52 np0005486759.ooo.test podman[317268]: 2025-10-14 10:02:52.433826962 +0000 UTC m=+0.059501016 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009)
Oct 14 10:02:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63357 DF PROTO=TCP SPT=42162 DPT=9102 SEQ=4228536045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA2F5010000000001030307) 
Oct 14 10:02:52 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:02:52Z|00139|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:02:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:52.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:53.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:54.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:54.169 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:02:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:54.169 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:02:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:02:54.170 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:02:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63358 DF PROTO=TCP SPT=42162 DPT=9102 SEQ=4228536045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA304C10000000001030307) 
Oct 14 10:02:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:58.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:02:59.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:02:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:02:59 np0005486759.ooo.test podman[317290]: 2025-10-14 10:02:59.434260268 +0000 UTC m=+0.058713572 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 14 10:02:59 np0005486759.ooo.test podman[317290]: 2025-10-14 10:02:59.441577252 +0000 UTC m=+0.066030556 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251009)
Oct 14 10:02:59 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:03:03 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:03:03 np0005486759.ooo.test podman[317308]: 2025-10-14 10:03:03.43317715 +0000 UTC m=+0.056856766 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 10:03:03 np0005486759.ooo.test podman[317308]: 2025-10-14 10:03:03.439130772 +0000 UTC m=+0.062810428 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 10:03:03 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:03:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:03.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:04.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:08.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.793 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Acquiring lock "9b229ec3-a035-4949-8484-69924247065a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.793 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Lock "9b229ec3-a035-4949-8484-69924247065a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.825 2 DEBUG nova.compute.manager [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 14 10:03:09 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:03:09.876 2 INFO neutron.agent.securitygroups_rpc [req-7a3dbe45-a212-4782-8c55-ce1df6c3099f req-79d3f30b-263d-4425-bcb7-00c8b24e8d04 ba199bcacf074d76b36d60dc12d13cb5 70d4052729ff41f39eda195c1d7973bb - - default default] Security group member updated ['905940d5-3dbf-414d-b49e-ee26a374bb24']
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.893 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.894 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.899 2 DEBUG nova.virt.hardware [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 14 10:03:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:09.899 2 INFO nova.compute.claims [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Claim successful on node np0005486759.ooo.test
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.026 2 DEBUG nova.compute.provider_tree [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.040 2 DEBUG nova.scheduler.client.report [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.067 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.068 2 DEBUG nova.compute.manager [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.109 2 DEBUG nova.compute.claims [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Aborting claim: <nova.compute.claims.Claim object at 0x7f8a347a0eb0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.110 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.110 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.146 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.146 2 DEBUG nova.compute.utils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Instance 9b229ec3-a035-4949-8484-69924247065a could not be found. notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.147 2 DEBUG nova.compute.manager [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Instance disappeared during build. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2483
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.147 2 DEBUG nova.compute.manager [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.147 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Acquiring lock "refresh_cache-9b229ec3-a035-4949-8484-69924247065a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.147 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Acquired lock "refresh_cache-9b229ec3-a035-4949-8484-69924247065a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.148 2 DEBUG nova.network.neutron [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.158 2 DEBUG nova.compute.utils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.188 2 DEBUG nova.network.neutron [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.300 2 DEBUG nova.network.neutron [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.317 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Releasing lock "refresh_cache-9b229ec3-a035-4949-8484-69924247065a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.318 2 DEBUG nova.compute.manager [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.319 2 DEBUG nova.compute.manager [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] [instance: 9b229ec3-a035-4949-8484-69924247065a] Skipping network deallocation for instance since networking was not requested. _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2255
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.394 2 INFO nova.scheduler.client.report [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Deleted allocations for instance 9b229ec3-a035-4949-8484-69924247065a
Oct 14 10:03:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:10.395 2 DEBUG oslo_concurrency.lockutils [None req-1d0b3c7c-6fc8-45e1-98a4-7c536658e90f 497fb625157246a5b6d01560de54ffd0 f090f77551e54d85937df5aa9e0bf6b4 - - default default] Lock "9b229ec3-a035-4949-8484-69924247065a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:03:10 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:10.988 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:03:10Z, description=, device_id=a4302efd-c705-4c87-90bf-872c208c0906, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f2400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f27c0>], id=ea804914-9645-4e2d-9117-a7f5a5c8d357, ip_allocation=immediate, mac_address=fa:16:3e:75:af:48, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=854, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:03:10Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:03:11 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:03:11 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:03:11 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:03:11 np0005486759.ooo.test podman[317348]: 2025-10-14 10:03:11.203796722 +0000 UTC m=+0.072681819 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:03:11 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:11.228 287366 INFO neutron.agent.linux.ip_lib [None req-c561176a-c827-4f2b-854d-1bfb36cc6e28 - - - - - -] Device tap95b731f3-5e cannot be used as it has no MAC address
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test kernel: device tap95b731f3-5e entered promiscuous mode
Oct 14 10:03:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:11Z|00140|binding|INFO|Claiming lport 95b731f3-5e53-48a3-9d8f-5542b0c1d3c4 for this chassis.
Oct 14 10:03:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:11Z|00141|binding|INFO|95b731f3-5e53-48a3-9d8f-5542b0c1d3c4: Claiming unknown
Oct 14 10:03:11 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436191.2696] manager: (tap95b731f3-5e): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test systemd-udevd[317374]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.280 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-3a358ac5-85f1-49db-a829-f8fa243ab752', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a358ac5-85f1-49db-a829-f8fa243ab752', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d03b3e18e4541be845419a2c2164824', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f291ae6-cc0e-4f7e-b811-104735618de6, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=95b731f3-5e53-48a3-9d8f-5542b0c1d3c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:11Z|00142|binding|INFO|Setting lport 95b731f3-5e53-48a3-9d8f-5542b0c1d3c4 ovn-installed in OVS
Oct 14 10:03:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:11Z|00143|binding|INFO|Setting lport 95b731f3-5e53-48a3-9d8f-5542b0c1d3c4 up in Southbound
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.281 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 95b731f3-5e53-48a3-9d8f-5542b0c1d3c4 in datapath 3a358ac5-85f1-49db-a829-f8fa243ab752 bound to our chassis
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.283 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8a2624a6-7302-4e02-8fa5-5f71f092427e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.283 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a358ac5-85f1-49db-a829-f8fa243ab752, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.285 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[16acf3fc-bdf0-4d84-ae26-6471e8b00669]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:11.437 287366 INFO neutron.agent.dhcp.agent [None req-c5b52d1d-5e35-433a-8650-ff84cbd9db82 - - - - - -] DHCP configuration for ports {'ea804914-9645-4e2d-9117-a7f5a5c8d357'} is completed
Oct 14 10:03:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:11Z|00144|binding|INFO|Releasing lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a from this chassis (sb_readonly=0)
Oct 14 10:03:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:11Z|00145|binding|INFO|Setting lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a down in Southbound
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test kernel: device tap3ac5f212-9e left promiscuous mode
Oct 14 10:03:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:11.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.864 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-a6b02595-ce43-43c7-aca8-531937571464', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6b02595-ce43-43c7-aca8-531937571464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51305e49-a7d0-486f-a42f-28329bac1f69, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.867 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a in datapath a6b02595-ce43-43c7-aca8-531937571464 unbound from our chassis
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.870 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6b02595-ce43-43c7-aca8-531937571464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:03:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:11.871 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[55b1219d-e646-40d8-bae0-e98297261d0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:12 np0005486759.ooo.test podman[317435]: 
Oct 14 10:03:12 np0005486759.ooo.test podman[317435]: 2025-10-14 10:03:12.127644354 +0000 UTC m=+0.050151991 container create 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:03:12 np0005486759.ooo.test systemd[1]: Started libpod-conmon-64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150.scope.
Oct 14 10:03:12 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:03:12 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eafaee2a317dd1d16ef24bf2d1502a7c8a703f02a87a99735f92d8ed865b1897/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:03:12 np0005486759.ooo.test podman[317435]: 2025-10-14 10:03:12.194797803 +0000 UTC m=+0.117305470 container init 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:03:12 np0005486759.ooo.test podman[317435]: 2025-10-14 10:03:12.101887738 +0000 UTC m=+0.024395375 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:03:12 np0005486759.ooo.test podman[317435]: 2025-10-14 10:03:12.204246371 +0000 UTC m=+0.126754048 container start 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq[317454]: started, version 2.85 cachesize 150
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq[317454]: DNS service limited to local subnets
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq[317454]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq[317454]: warning: no upstream servers configured
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq-dhcp[317454]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/addn_hosts - 0 addresses
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/host
Oct 14 10:03:12 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/opts
Oct 14 10:03:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:03:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:03:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:03:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130674 "" "Go-http-client/1.1"
Oct 14 10:03:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:03:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16684 "" "Go-http-client/1.1"
Oct 14 10:03:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:12.558 287366 INFO neutron.agent.dhcp.agent [None req-71d786aa-46e0-4156-b3d0-3a62939095a6 - - - - - -] DHCP configuration for ports {'f207cdb0-a8ea-4316-a158-54d7264476d4'} is completed
Oct 14 10:03:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:12.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:03:13 np0005486759.ooo.test podman[317456]: 2025-10-14 10:03:13.453732718 +0000 UTC m=+0.081160037 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 10:03:13 np0005486759.ooo.test podman[317456]: 2025-10-14 10:03:13.465346792 +0000 UTC m=+0.092774161 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:03:13 np0005486759.ooo.test podman[317455]: 2025-10-14 10:03:13.434518332 +0000 UTC m=+0.064302683 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 10:03:13 np0005486759.ooo.test podman[317458]: 2025-10-14 10:03:13.549675525 +0000 UTC m=+0.171983927 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:03:13 np0005486759.ooo.test podman[317458]: 2025-10-14 10:03:13.560417603 +0000 UTC m=+0.182726015 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:03:13 np0005486759.ooo.test podman[317455]: 2025-10-14 10:03:13.570734287 +0000 UTC m=+0.200518648 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:03:13 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:13.581 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:03:13Z, description=, device_id=a4302efd-c705-4c87-90bf-872c208c0906, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7b4160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7b4d30>], id=40a84564-f0c0-4681-ac9e-6cf11b74e43e, ip_allocation=immediate, mac_address=fa:16:3e:40:e2:a9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:03:08Z, description=, dns_domain=, id=3a358ac5-85f1-49db-a829-f8fa243ab752, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1857125600-network, port_security_enabled=True, project_id=9d03b3e18e4541be845419a2c2164824, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63065, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=847, status=ACTIVE, subnets=['a6b026c3-f339-4271-8186-907729c9a32d'], tags=[], tenant_id=9d03b3e18e4541be845419a2c2164824, updated_at=2025-10-14T10:03:09Z, vlan_transparent=None, network_id=3a358ac5-85f1-49db-a829-f8fa243ab752, port_security_enabled=False, project_id=9d03b3e18e4541be845419a2c2164824, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=860, status=DOWN, tags=[], tenant_id=9d03b3e18e4541be845419a2c2164824, updated_at=2025-10-14T10:03:13Z on network 3a358ac5-85f1-49db-a829-f8fa243ab752
Oct 14 10:03:13 np0005486759.ooo.test podman[317457]: 2025-10-14 10:03:13.522224397 +0000 UTC m=+0.146924602 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:03:13 np0005486759.ooo.test podman[317457]: 2025-10-14 10:03:13.658422673 +0000 UTC m=+0.283122858 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:03:13 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:03:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:13.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:13 np0005486759.ooo.test dnsmasq[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/addn_hosts - 1 addresses
Oct 14 10:03:13 np0005486759.ooo.test podman[317550]: 2025-10-14 10:03:13.763113946 +0000 UTC m=+0.054604416 container kill 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 10:03:13 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/host
Oct 14 10:03:13 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/opts
Oct 14 10:03:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:03:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:03:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:03:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:03:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:03:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:03:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:03:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:14.017 287366 INFO neutron.agent.dhcp.agent [None req-89ed66bd-33cf-4ca0-8d61-f00386771c5e - - - - - -] DHCP configuration for ports {'40a84564-f0c0-4681-ac9e-6cf11b74e43e'} is completed
Oct 14 10:03:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:14.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:14.899 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:03:13Z, description=, device_id=a4302efd-c705-4c87-90bf-872c208c0906, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec768100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7683d0>], id=40a84564-f0c0-4681-ac9e-6cf11b74e43e, ip_allocation=immediate, mac_address=fa:16:3e:40:e2:a9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:03:08Z, description=, dns_domain=, id=3a358ac5-85f1-49db-a829-f8fa243ab752, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1857125600-network, port_security_enabled=True, project_id=9d03b3e18e4541be845419a2c2164824, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63065, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=847, status=ACTIVE, subnets=['a6b026c3-f339-4271-8186-907729c9a32d'], tags=[], tenant_id=9d03b3e18e4541be845419a2c2164824, updated_at=2025-10-14T10:03:09Z, vlan_transparent=None, network_id=3a358ac5-85f1-49db-a829-f8fa243ab752, port_security_enabled=False, project_id=9d03b3e18e4541be845419a2c2164824, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=860, status=DOWN, tags=[], tenant_id=9d03b3e18e4541be845419a2c2164824, updated_at=2025-10-14T10:03:13Z on network 3a358ac5-85f1-49db-a829-f8fa243ab752
Oct 14 10:03:15 np0005486759.ooo.test dnsmasq[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:03:15 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:03:15 np0005486759.ooo.test dnsmasq-dhcp[313311]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:03:15 np0005486759.ooo.test podman[317598]: 2025-10-14 10:03:15.117013078 +0000 UTC m=+0.035191044 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0)
Oct 14 10:03:15 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:15Z|00146|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent [None req-caaaed85-54bd-49bd-83de-028c1ed2ab7a - - - - - -] Unable to reload_allocations dhcp for a6b02595-ce43-43c7-aca8-531937571464.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap3ac5f212-9e not found in namespace qdhcp-a6b02595-ce43-43c7-aca8-531937571464.
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     return fut.result()
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     raise self._exception
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap3ac5f212-9e not found in namespace qdhcp-a6b02595-ce43-43c7-aca8-531937571464.
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.139 287366 ERROR neutron.agent.dhcp.agent 
Oct 14 10:03:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:15.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:15.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:15.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:15 np0005486759.ooo.test dnsmasq[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/addn_hosts - 1 addresses
Oct 14 10:03:15 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/host
Oct 14 10:03:15 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/opts
Oct 14 10:03:15 np0005486759.ooo.test podman[317613]: 2025-10-14 10:03:15.223882869 +0000 UTC m=+0.033267086 container kill 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.312 287366 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.467 287366 INFO neutron.agent.dhcp.agent [None req-534784a4-bab2-446b-8784-05517e8451ca - - - - - -] DHCP configuration for ports {'40a84564-f0c0-4681-ac9e-6cf11b74e43e'} is completed
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.656 287366 INFO neutron.agent.dhcp.agent [None req-db1f88a4-5ebb-4c4d-876a-ae6b5d3585a7 - - - - - -] All active networks have been fetched through RPC.
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.658 287366 INFO neutron.agent.dhcp.agent [-] Starting network 9197abc5-07db-4abf-9578-9360b49aea49 dhcp configuration
Oct 14 10:03:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:15.662 287366 INFO neutron.agent.dhcp.agent [-] Starting network a6b02595-ce43-43c7-aca8-531937571464 dhcp configuration
Oct 14 10:03:15 np0005486759.ooo.test dnsmasq[313311]: exiting on receipt of SIGTERM
Oct 14 10:03:15 np0005486759.ooo.test podman[317651]: 2025-10-14 10:03:15.833700751 +0000 UTC m=+0.061358753 container kill e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:03:15 np0005486759.ooo.test systemd[1]: libpod-e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617.scope: Deactivated successfully.
Oct 14 10:03:15 np0005486759.ooo.test podman[317665]: 2025-10-14 10:03:15.895882748 +0000 UTC m=+0.044463136 container died e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:03:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617-userdata-shm.mount: Deactivated successfully.
Oct 14 10:03:15 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-4e52fe15b6a921871dc2ec02d06dadbf52a1fbb50c7c6460b53149eb2ba6a139-merged.mount: Deactivated successfully.
Oct 14 10:03:15 np0005486759.ooo.test podman[317665]: 2025-10-14 10:03:15.996405185 +0000 UTC m=+0.144985523 container remove e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:03:16 np0005486759.ooo.test systemd[1]: libpod-conmon-e2c38a04b2a76780908cd5e1dc53cc7192f56d153a6cb0fe61f01b70a5818617.scope: Deactivated successfully.
Oct 14 10:03:16 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:16.049 287366 INFO neutron.agent.linux.ip_lib [-] Device tap3ac5f212-9e cannot be used as it has no MAC address
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test kernel: device tap3ac5f212-9e entered promiscuous mode
Oct 14 10:03:16 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436196.0846] manager: (tap3ac5f212-9e): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00147|binding|INFO|Claiming lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a for this chassis.
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00148|binding|INFO|3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a: Claiming unknown
Oct 14 10:03:16 np0005486759.ooo.test systemd-udevd[317697]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00149|binding|INFO|Setting lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a ovn-installed in OVS
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00150|binding|INFO|Setting lport 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a up in Southbound
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.134 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-a6b02595-ce43-43c7-aca8-531937571464', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6b02595-ce43-43c7-aca8-531937571464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51305e49-a7d0-486f-a42f-28329bac1f69, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.136 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a in datapath a6b02595-ce43-43c7-aca8-531937571464 bound to our chassis
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.139 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port aa18e873-000f-4f86-969f-374cd84a0bc2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.140 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6b02595-ce43-43c7-aca8-531937571464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.141 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[690e75f2-b1bd-44d1-93f6-4f30ac0798b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.188 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.251 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.253 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:03:16 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:16.618 287366 INFO neutron.agent.linux.ip_lib [None req-b8427a22-9819-4432-b13f-bdcde60787a7 - - - - - -] Device tapd0753771-fa cannot be used as it has no MAC address
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test kernel: device tapd0753771-fa entered promiscuous mode
Oct 14 10:03:16 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436196.6448] manager: (tapd0753771-fa): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Oct 14 10:03:16 np0005486759.ooo.test systemd-udevd[317699]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00151|binding|INFO|Claiming lport d0753771-fa10-4305-b602-7f0977570e75 for this chassis.
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00152|binding|INFO|d0753771-fa10-4305-b602-7f0977570e75: Claiming unknown
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00153|binding|INFO|Setting lport d0753771-fa10-4305-b602-7f0977570e75 ovn-installed in OVS
Oct 14 10:03:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:16Z|00154|binding|INFO|Setting lport d0753771-fa10-4305-b602-7f0977570e75 up in Southbound
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.653 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.3/24', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-9197abc5-07db-4abf-9578-9360b49aea49', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9197abc5-07db-4abf-9578-9360b49aea49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62cfbaba-fb96-4812-8b41-6ad8964122a3, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=d0753771-fa10-4305-b602-7f0977570e75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.657 183328 INFO neutron.agent.ovn.metadata.agent [-] Port d0753771-fa10-4305-b602-7f0977570e75 in datapath 9197abc5-07db-4abf-9578-9360b49aea49 bound to our chassis
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.662 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port c6842cfb-8437-4385-bf0a-3b5b76aaedd2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.666 183328 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.681 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[aa63c62b-470f-41b1-ad97-32591bcd3367]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.726 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[edc752f7-805d-490b-9ffe-b6e9861bc191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.732 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[20c5f63d-ef0f-4744-b9db-1fded8251cd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.757 183444 DEBUG oslo.privsep.daemon [-] privsep: reply[868faaa4-72f5-45fc-9581-4f6b33ca1711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.774 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[1a84c61a-f0f5-48c4-84df-25069aa7db10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9197abc5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d6:b0:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 69, 'rx_bytes': 8926, 'tx_bytes': 7001, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 69, 'rx_bytes': 8926, 'tx_bytes': 7001, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1181282, 'reachable_time': 30428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317748, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.797 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[9837bf1a-5050-4861-a35b-7b9a808a399e]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap9197abc5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1181290, 'tstamp': 1181290}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317751, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9197abc5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1181293, 'tstamp': 1181293}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317751, 'error': None, 'target': 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.799 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9197abc5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:16.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.807 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9197abc5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.808 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.808 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9197abc5-00, col_values=(('external_ids', {'iface-id': '25844137-067c-4137-b11d-9fc6e75f59fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:03:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:16.809 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 14 10:03:16 np0005486759.ooo.test podman[317778]: 
Oct 14 10:03:17 np0005486759.ooo.test podman[317778]: 2025-10-14 10:03:17.011187543 +0000 UTC m=+0.088444400 container create 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009)
Oct 14 10:03:17 np0005486759.ooo.test systemd[1]: Started libpod-conmon-931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507.scope.
Oct 14 10:03:17 np0005486759.ooo.test podman[317778]: 2025-10-14 10:03:16.965815768 +0000 UTC m=+0.043072615 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:03:17 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:03:17 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0a7ebfe94c739ef777fe7dd94ca5ec59d1c8828afac69caf785512caaf64898/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:03:17 np0005486759.ooo.test podman[317778]: 2025-10-14 10:03:17.087561282 +0000 UTC m=+0.164818129 container init 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:03:17 np0005486759.ooo.test podman[317778]: 2025-10-14 10:03:17.096503065 +0000 UTC m=+0.173759912 container start 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317807]: started, version 2.85 cachesize 150
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317807]: DNS service limited to local subnets
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317807]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317807]: warning: no upstream servers configured
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq-dhcp[317807]: DHCP, static leases only on 192.168.122.0, lease time 1d
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:03:17 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:17.154 287366 INFO neutron.agent.dhcp.agent [None req-32c31ddc-07b0-4d10-a692-170cb1b1d016 - - - - - -] Finished network a6b02595-ce43-43c7-aca8-531937571464 dhcp configuration
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.217 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.218 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.218 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.219 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.318 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.393 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.394 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.468 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.469 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.531 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.532 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:03:17 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:17.558 287366 INFO neutron.agent.dhcp.agent [None req-4c4b51e7-4c8d-4dac-a330-36e1cf21e431 - - - - - -] DHCP configuration for ports {'ea804914-9645-4e2d-9117-a7f5a5c8d357', '0e7d62dc-0cdc-4b6f-b4eb-7db2250fa9f3', 'b60156a8-258f-4e97-bbcb-db7630f22c8b', '20e0600d-00d1-463c-a458-e3c8e895ea5f', '3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a'} is completed
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.640 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:03:17 np0005486759.ooo.test podman[317849]: 
Oct 14 10:03:17 np0005486759.ooo.test podman[317849]: 2025-10-14 10:03:17.830696162 +0000 UTC m=+0.092607506 container create b8d2efd7029d5590b9c53683236c6ffef0faab87a94771186f494c943a4b4a64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9197abc5-07db-4abf-9578-9360b49aea49, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.860 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.862 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12264MB free_disk=386.6772003173828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.863 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.863 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:03:17 np0005486759.ooo.test systemd[1]: Started libpod-conmon-b8d2efd7029d5590b9c53683236c6ffef0faab87a94771186f494c943a4b4a64.scope.
Oct 14 10:03:17 np0005486759.ooo.test podman[317849]: 2025-10-14 10:03:17.792202648 +0000 UTC m=+0.054114012 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:03:17 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:03:17 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d59f990267506d73da6573a0b1a4066a04e0c0dab26495d20311dfb80a7fbf50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:03:17 np0005486759.ooo.test podman[317849]: 2025-10-14 10:03:17.906929118 +0000 UTC m=+0.168840422 container init b8d2efd7029d5590b9c53683236c6ffef0faab87a94771186f494c943a4b4a64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9197abc5-07db-4abf-9578-9360b49aea49, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:03:17 np0005486759.ooo.test podman[317849]: 2025-10-14 10:03:17.916035906 +0000 UTC m=+0.177947230 container start b8d2efd7029d5590b9c53683236c6ffef0faab87a94771186f494c943a4b4a64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9197abc5-07db-4abf-9578-9360b49aea49, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317867]: started, version 2.85 cachesize 150
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317867]: DNS service limited to local subnets
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317867]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317867]: warning: no upstream servers configured
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq-dhcp[317867]: DHCP, static leases only on 192.168.0.0, lease time 1d
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq[317867]: read /var/lib/neutron/dhcp/9197abc5-07db-4abf-9578-9360b49aea49/addn_hosts - 2 addresses
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq-dhcp[317867]: read /var/lib/neutron/dhcp/9197abc5-07db-4abf-9578-9360b49aea49/host
Oct 14 10:03:17 np0005486759.ooo.test dnsmasq-dhcp[317867]: read /var/lib/neutron/dhcp/9197abc5-07db-4abf-9578-9360b49aea49/opts
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.994 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.995 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:03:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:17.995 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:03:18 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:03:18.043 2 INFO neutron.agent.securitygroups_rpc [req-56f34318-9f73-4ff7-b979-227ff8bfe2a9 req-e0990655-3c75-4de9-82aa-4a8215b4fa7a e8179e09b1474228815f1e24281119fc 9d03b3e18e4541be845419a2c2164824 - - default default] Security group rule updated ['6e04f8f5-6217-4d21-8c1b-0d2c8b0fb155']
Oct 14 10:03:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:18.070 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:03:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:18.088 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:03:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:18.109 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:03:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:18.110 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:03:18 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[311470]: [NOTICE]   (311474) : haproxy version is 2.8.14-c23fe91
Oct 14 10:03:18 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[311470]: [NOTICE]   (311474) : path to executable is /usr/sbin/haproxy
Oct 14 10:03:18 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[311470]: [WARNING]  (311474) : Exiting Master process...
Oct 14 10:03:18 np0005486759.ooo.test podman[317883]: 2025-10-14 10:03:18.132161899 +0000 UTC m=+0.044721975 container kill a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 10:03:18 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[311470]: [ALERT]    (311474) : Current worker (311476) exited with code 143 (Terminated)
Oct 14 10:03:18 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[311470]: [WARNING]  (311474) : All workers exited. Exiting... (0)
Oct 14 10:03:18 np0005486759.ooo.test systemd[1]: libpod-a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc.scope: Deactivated successfully.
Oct 14 10:03:18 np0005486759.ooo.test podman[317896]: 2025-10-14 10:03:18.208142177 +0000 UTC m=+0.063326973 container died a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:03:18 np0005486759.ooo.test podman[317896]: 2025-10-14 10:03:18.23873283 +0000 UTC m=+0.093917556 container cleanup a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009)
Oct 14 10:03:18 np0005486759.ooo.test systemd[1]: libpod-conmon-a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc.scope: Deactivated successfully.
Oct 14 10:03:18 np0005486759.ooo.test podman[317898]: 2025-10-14 10:03:18.280654639 +0000 UTC m=+0.128183561 container remove a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:03:18 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:18.289 287366 INFO neutron.agent.dhcp.agent [None req-06c6c7dc-a96d-4c1c-9776-3e38113faca1 - - - - - -] Finished network 9197abc5-07db-4abf-9578-9360b49aea49 dhcp configuration
Oct 14 10:03:18 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:18.290 287366 INFO neutron.agent.dhcp.agent [None req-db1f88a4-5ebb-4c4d-876a-ae6b5d3585a7 - - - - - -] Synchronizing state complete
Oct 14 10:03:18 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:03:18.580 2 INFO neutron.agent.securitygroups_rpc [req-cee47dd9-79a5-4420-8ae7-a359cc58d9a5 req-9c04f191-8395-4472-850f-d9e6e32e62e3 e8179e09b1474228815f1e24281119fc 9d03b3e18e4541be845419a2c2164824 - - default default] Security group rule updated ['6e04f8f5-6217-4d21-8c1b-0d2c8b0fb155']
Oct 14 10:03:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:18.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:18 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:18.839 287366 INFO neutron.agent.dhcp.agent [None req-4e7ea8ff-25aa-42db-9519-1ca34761b062 - - - - - -] DHCP configuration for ports {'ea804914-9645-4e2d-9117-a7f5a5c8d357', '20e0600d-00d1-463c-a458-e3c8e895ea5f', '3ac5f212-9e06-46dc-9bb4-d856ccc0dd8a', 'eee08de8-f983-4ebe-a654-f67f48659e50', '0e7d62dc-0cdc-4b6f-b4eb-7db2250fa9f3', 'b60156a8-258f-4e97-bbcb-db7630f22c8b', 'cec08f7e-c5e8-414a-b9ef-3d673db32c62', '25844137-067c-4137-b11d-9fc6e75f59fd'} is completed
Oct 14 10:03:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-7df68e89a5a3091b7ccd00355d773dc81bad90001ad9c4fbaf3a8bf1ac010bfb-merged.mount: Deactivated successfully.
Oct 14 10:03:19 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a09b2f6f7e93be5136affca3be1a5249a4b514c163a158903479e08436d205fc-userdata-shm.mount: Deactivated successfully.
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.111 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.111 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.111 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.189 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.190 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.190 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.191 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28988 DF PROTO=TCP SPT=36080 DPT=9102 SEQ=43509050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA35E470000000001030307) 
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.684 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.704 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.705 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:03:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:19.706 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:03:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:03:20 np0005486759.ooo.test podman[317929]: 2025-10-14 10:03:20.44746326 +0000 UTC m=+0.070720398 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:03:20 np0005486759.ooo.test podman[317930]: 2025-10-14 10:03:20.500201479 +0000 UTC m=+0.122734635 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 14 10:03:20 np0005486759.ooo.test podman[317930]: 2025-10-14 10:03:20.506492451 +0000 UTC m=+0.129025567 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 14 10:03:20 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:03:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28989 DF PROTO=TCP SPT=36080 DPT=9102 SEQ=43509050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA362410000000001030307) 
Oct 14 10:03:20 np0005486759.ooo.test podman[317929]: 2025-10-14 10:03:20.526274964 +0000 UTC m=+0.149532072 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 14 10:03:20 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:03:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:20.779 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:03:22 np0005486759.ooo.test dnsmasq[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/addn_hosts - 0 addresses
Oct 14 10:03:22 np0005486759.ooo.test podman[317989]: 2025-10-14 10:03:22.41262414 +0000 UTC m=+0.058849786 container kill 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:03:22 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/host
Oct 14 10:03:22 np0005486759.ooo.test dnsmasq-dhcp[317454]: read /var/lib/neutron/dhcp/3a358ac5-85f1-49db-a829-f8fa243ab752/opts
Oct 14 10:03:22 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:22Z|00155|binding|INFO|Releasing lport 95b731f3-5e53-48a3-9d8f-5542b0c1d3c4 from this chassis (sb_readonly=0)
Oct 14 10:03:22 np0005486759.ooo.test kernel: device tap95b731f3-5e left promiscuous mode
Oct 14 10:03:22 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:22Z|00156|binding|INFO|Setting lport 95b731f3-5e53-48a3-9d8f-5542b0c1d3c4 down in Southbound
Oct 14 10:03:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:22.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:22.572 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-3a358ac5-85f1-49db-a829-f8fa243ab752', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a358ac5-85f1-49db-a829-f8fa243ab752', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d03b3e18e4541be845419a2c2164824', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f291ae6-cc0e-4f7e-b811-104735618de6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=95b731f3-5e53-48a3-9d8f-5542b0c1d3c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:03:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28990 DF PROTO=TCP SPT=36080 DPT=9102 SEQ=43509050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA36A410000000001030307) 
Oct 14 10:03:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:22.575 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 95b731f3-5e53-48a3-9d8f-5542b0c1d3c4 in datapath 3a358ac5-85f1-49db-a829-f8fa243ab752 unbound from our chassis
Oct 14 10:03:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:22.577 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a358ac5-85f1-49db-a829-f8fa243ab752, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:03:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:22.579 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[141fa575-f2b5-4ef9-9fb9-bc0988b03f2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:22.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 10:03:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 10:03:23 np0005486759.ooo.test snmpd[52493]: empty variable list in _query
Oct 14 10:03:23 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:23.255 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:03:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:23.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:23 np0005486759.ooo.test dnsmasq-dhcp[317807]: DHCPRELEASE(tap3ac5f212-9e) 192.168.122.202 fa:16:3e:75:af:48
Oct 14 10:03:24 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:24Z|00157|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:03:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:24.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:24.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:24 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:03:24 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:03:24 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:03:24 np0005486759.ooo.test systemd[1]: tmp-crun.AwKW7X.mount: Deactivated successfully.
Oct 14 10:03:24 np0005486759.ooo.test podman[318029]: 2025-10-14 10:03:24.495681706 +0000 UTC m=+0.071477551 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS)
Oct 14 10:03:25 np0005486759.ooo.test podman[318067]: 2025-10-14 10:03:25.04322381 +0000 UTC m=+0.056877717 container kill 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 10:03:25 np0005486759.ooo.test dnsmasq[317454]: exiting on receipt of SIGTERM
Oct 14 10:03:25 np0005486759.ooo.test systemd[1]: libpod-64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150.scope: Deactivated successfully.
Oct 14 10:03:25 np0005486759.ooo.test podman[318083]: 2025-10-14 10:03:25.093169474 +0000 UTC m=+0.033686949 container died 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 10:03:25 np0005486759.ooo.test podman[318083]: 2025-10-14 10:03:25.121666783 +0000 UTC m=+0.062184228 container remove 64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a358ac5-85f1-49db-a829-f8fa243ab752, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:03:25 np0005486759.ooo.test systemd[1]: libpod-conmon-64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150.scope: Deactivated successfully.
Oct 14 10:03:25 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:25.151 287366 INFO neutron.agent.dhcp.agent [None req-7595feff-01ad-40f1-8126-c710ef650abe - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:03:25 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:25.208 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:03:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-eafaee2a317dd1d16ef24bf2d1502a7c8a703f02a87a99735f92d8ed865b1897-merged.mount: Deactivated successfully.
Oct 14 10:03:25 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-64e0c493254de9c0bbf57ecc8dd6dab9c5fa5f276254c5d9db97d66a71da9150-userdata-shm.mount: Deactivated successfully.
Oct 14 10:03:25 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d3a358ac5\x2d85f1\x2d49db\x2da829\x2df8fa243ab752.mount: Deactivated successfully.
Oct 14 10:03:26 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:26.568 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:03:26Z, description=, device_id=7e9aad21-5526-401c-babf-740cc92bb3dc, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec787ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c41c0>], id=11956a8d-8e3c-43ee-9b24-87b251b11870, ip_allocation=immediate, mac_address=fa:16:3e:77:13:d8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=909, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:03:26Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:03:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28991 DF PROTO=TCP SPT=36080 DPT=9102 SEQ=43509050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA37A010000000001030307) 
Oct 14 10:03:26 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:03:26 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:03:26 np0005486759.ooo.test podman[318124]: 2025-10-14 10:03:26.765107677 +0000 UTC m=+0.053987238 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:03:26 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:03:26 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:03:26.984 287366 INFO neutron.agent.dhcp.agent [None req-ab560ddb-cd76-4c23-8051-8b6e26c2ce2e - - - - - -] DHCP configuration for ports {'11956a8d-8e3c-43ee-9b24-87b251b11870'} is completed
Oct 14 10:03:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:27.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:28.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:29.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:30.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:30 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:03:30 np0005486759.ooo.test podman[318145]: 2025-10-14 10:03:30.43622371 +0000 UTC m=+0.067914392 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:03:30 np0005486759.ooo.test podman[318145]: 2025-10-14 10:03:30.464772951 +0000 UTC m=+0.096463623 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:03:30 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:03:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:33.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:03:33 np0005486759.ooo.test podman[318163]: 2025-10-14 10:03:33.853756816 +0000 UTC m=+0.080138946 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:03:33 np0005486759.ooo.test podman[318163]: 2025-10-14 10:03:33.862478382 +0000 UTC m=+0.088860472 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:03:33 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:03:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:34.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:38.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:38 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:03:38 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:03:38 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:03:38 np0005486759.ooo.test podman[318205]: 2025-10-14 10:03:38.968051122 +0000 UTC m=+0.059102383 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:03:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:03:38Z|00158|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:03:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:39.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:39.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:03:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:03:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:03:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 129486 "" "Go-http-client/1.1"
Oct 14 10:03:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:03:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16212 "" "Go-http-client/1.1"
Oct 14 10:03:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:43.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:03:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:03:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:03:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:03:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:03:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:03:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:03:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:03:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:44.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: tmp-crun.Op6Gqp.mount: Deactivated successfully.
Oct 14 10:03:44 np0005486759.ooo.test podman[318235]: 2025-10-14 10:03:44.4845221 +0000 UTC m=+0.095220595 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:03:44 np0005486759.ooo.test podman[318228]: 2025-10-14 10:03:44.542180159 +0000 UTC m=+0.157983360 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:03:44 np0005486759.ooo.test podman[318228]: 2025-10-14 10:03:44.550177933 +0000 UTC m=+0.165981104 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:03:44 np0005486759.ooo.test podman[318235]: 2025-10-14 10:03:44.564297994 +0000 UTC m=+0.174996459 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:03:44 np0005486759.ooo.test podman[318229]: 2025-10-14 10:03:44.505466719 +0000 UTC m=+0.118802415 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:03:44 np0005486759.ooo.test podman[318229]: 2025-10-14 10:03:44.640263761 +0000 UTC m=+0.253599447 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 14 10:03:44 np0005486759.ooo.test podman[318227]: 2025-10-14 10:03:44.648666347 +0000 UTC m=+0.269020537 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:03:44 np0005486759.ooo.test podman[318227]: 2025-10-14 10:03:44.660892031 +0000 UTC m=+0.281246211 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:03:44 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:03:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:48.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:49 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:49.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62286 DF PROTO=TCP SPT=32842 DPT=9102 SEQ=3656991345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA3D3780000000001030307) 
Oct 14 10:03:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62287 DF PROTO=TCP SPT=32842 DPT=9102 SEQ=3656991345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA3D7810000000001030307) 
Oct 14 10:03:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:03:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:03:51 np0005486759.ooo.test podman[318303]: 2025-10-14 10:03:51.450523065 +0000 UTC m=+0.073960476 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct 14 10:03:51 np0005486759.ooo.test podman[318303]: 2025-10-14 10:03:51.487452692 +0000 UTC m=+0.110890083 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container)
Oct 14 10:03:51 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:03:51 np0005486759.ooo.test podman[318302]: 2025-10-14 10:03:51.49328258 +0000 UTC m=+0.120005602 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:03:51 np0005486759.ooo.test podman[318302]: 2025-10-14 10:03:51.572762555 +0000 UTC m=+0.199485577 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:03:51 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:03:52 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:03:52.265 2 INFO neutron.agent.securitygroups_rpc [None req-5239a237-3807-4667-8b22-10720276b8e6 3b3470f7a89e4032bcf7b08aba946c07 ea4715251f444293b094aae299000991 - - default default] Security group member updated ['6cf9fd9f-39d0-4d07-9ae8-df323b8de87e']
Oct 14 10:03:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62288 DF PROTO=TCP SPT=32842 DPT=9102 SEQ=3656991345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA3DF810000000001030307) 
Oct 14 10:03:52 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:03:52.846 2 INFO neutron.agent.securitygroups_rpc [None req-9085837c-9fe0-473a-a3f9-1d60a59d6f83 3b3470f7a89e4032bcf7b08aba946c07 ea4715251f444293b094aae299000991 - - default default] Security group member updated ['6cf9fd9f-39d0-4d07-9ae8-df323b8de87e']
Oct 14 10:03:53 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:03:53.497 2 INFO neutron.agent.securitygroups_rpc [None req-1d91a5c6-1041-4052-9cd0-9ad2553f57b1 3b3470f7a89e4032bcf7b08aba946c07 ea4715251f444293b094aae299000991 - - default default] Security group member updated ['6cf9fd9f-39d0-4d07-9ae8-df323b8de87e']
Oct 14 10:03:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:53.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.170 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.170 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.171 183328 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.171 183328 ERROR neutron.agent.linux.external_process [-] metadata-proxy for metadata with uuid 9197abc5-07db-4abf-9578-9360b49aea49 not found. The process should not have died
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.172 183328 WARNING neutron.agent.linux.external_process [-] Respawning metadata-proxy for uuid 9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.172 183328 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.174 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[e2daa284-5484-4e59-8f31-226f7686d9ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.175 183328 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: global
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     log         /dev/log local0 debug
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     log-tag     haproxy-metadata-proxy-9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     user        root
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     group       root
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     maxconn     1024
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     pidfile     /var/lib/neutron/external/pids/9197abc5-07db-4abf-9578-9360b49aea49.pid.haproxy
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     daemon
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: defaults
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     log global
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     mode http
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     option httplog
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     option dontlognull
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     option http-server-close
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     option forwardfor
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     retries                 3
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-request    30s
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout connect         30s
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout client          32s
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout server          32s
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     timeout http-keep-alive 30s
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: listen listener
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     bind 169.254.169.254:80
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     server metadata /var/lib/neutron/metadata_proxy
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:     http-request add-header X-OVN-Network-ID 9197abc5-07db-4abf-9578-9360b49aea49
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.176 183328 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49', 'env', 'PROCESS_TAG=haproxy-9197abc5-07db-4abf-9578-9360b49aea49', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9197abc5-07db-4abf-9578-9360b49aea49.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 14 10:03:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:54.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:54 np0005486759.ooo.test podman[318371]: 
Oct 14 10:03:54 np0005486759.ooo.test podman[318371]: 2025-10-14 10:03:54.609125242 +0000 UTC m=+0.069319005 container create df12ff663c7e9119a27dd752eb77e012bad77439b2d9b83fe999e444edfa0bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:03:54 np0005486759.ooo.test systemd[1]: Started libpod-conmon-df12ff663c7e9119a27dd752eb77e012bad77439b2d9b83fe999e444edfa0bdc.scope.
Oct 14 10:03:54 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:03:54 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c8a9aa973bb8c7f7ea225603e81a6eaab9e0abc4f6e8d70cf19d9798ff844cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:03:54 np0005486759.ooo.test podman[318371]: 2025-10-14 10:03:54.574167986 +0000 UTC m=+0.034361759 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 14 10:03:54 np0005486759.ooo.test podman[318371]: 2025-10-14 10:03:54.679493449 +0000 UTC m=+0.139687192 container init df12ff663c7e9119a27dd752eb77e012bad77439b2d9b83fe999e444edfa0bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:03:54 np0005486759.ooo.test podman[318371]: 2025-10-14 10:03:54.685551304 +0000 UTC m=+0.145745047 container start df12ff663c7e9119a27dd752eb77e012bad77439b2d9b83fe999e444edfa0bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:03:54 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[318385]: [NOTICE]   (318389) : New worker (318391) forked
Oct 14 10:03:54 np0005486759.ooo.test neutron-haproxy-ovnmeta-9197abc5-07db-4abf-9578-9360b49aea49[318385]: [NOTICE]   (318389) : Loading success.
Oct 14 10:03:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:03:54.731 2 INFO neutron.agent.securitygroups_rpc [None req-9fe77ece-dd46-47fb-8a19-b802b57e3bca 3b3470f7a89e4032bcf7b08aba946c07 ea4715251f444293b094aae299000991 - - default default] Security group member updated ['6cf9fd9f-39d0-4d07-9ae8-df323b8de87e']
Oct 14 10:03:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:03:54.749 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:03:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62289 DF PROTO=TCP SPT=32842 DPT=9102 SEQ=3656991345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA3EF410000000001030307) 
Oct 14 10:03:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:58.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:03:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:03:59.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:01 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:04:01 np0005486759.ooo.test podman[318400]: 2025-10-14 10:04:01.441289965 +0000 UTC m=+0.071517263 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 10:04:01 np0005486759.ooo.test podman[318400]: 2025-10-14 10:04:01.473408995 +0000 UTC m=+0.103636263 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent)
Oct 14 10:04:01 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:04:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:03.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:04:04 np0005486759.ooo.test podman[318418]: 2025-10-14 10:04:04.400932912 +0000 UTC m=+0.073463842 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:04:04 np0005486759.ooo.test podman[318418]: 2025-10-14 10:04:04.410310288 +0000 UTC m=+0.082841228 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:04:04 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:04:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:04.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:06 np0005486759.ooo.test sshd[318442]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:04:06 np0005486759.ooo.test sshd[318442]: error: kex_exchange_identification: Connection closed by remote host
Oct 14 10:04:06 np0005486759.ooo.test sshd[318442]: Connection closed by 194.0.234.20 port 65105
Oct 14 10:04:07 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:07.621 2 INFO neutron.agent.securitygroups_rpc [None req-5fb36799-3042-4c4c-a6c5-ec33abeb4666 a1a692c7a8ed4d42859a31d4704d86c0 9d67eadad13e4ffa94c5f5aeae392709 - - default default] Security group member updated ['fa626312-5d52-496f-8787-c7431d205555']
Oct 14 10:04:07 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:07.743 2 INFO neutron.agent.securitygroups_rpc [None req-5fb36799-3042-4c4c-a6c5-ec33abeb4666 a1a692c7a8ed4d42859a31d4704d86c0 9d67eadad13e4ffa94c5f5aeae392709 - - default default] Security group member updated ['fa626312-5d52-496f-8787-c7431d205555']
Oct 14 10:04:08 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:08.143 2 INFO neutron.agent.securitygroups_rpc [None req-64dd41b4-f90c-4352-be11-222e50b94c53 a1a692c7a8ed4d42859a31d4704d86c0 9d67eadad13e4ffa94c5f5aeae392709 - - default default] Security group member updated ['fa626312-5d52-496f-8787-c7431d205555']
Oct 14 10:04:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:08.179 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:08 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:08.438 2 INFO neutron.agent.securitygroups_rpc [None req-9f969d6a-b97c-4d32-a085-7b7f4033036e a1a692c7a8ed4d42859a31d4704d86c0 9d67eadad13e4ffa94c5f5aeae392709 - - default default] Security group member updated ['fa626312-5d52-496f-8787-c7431d205555']
Oct 14 10:04:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:08.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:09.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:11 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:11.613 287366 INFO neutron.agent.linux.ip_lib [None req-3fc3d5ae-b241-4cc9-bb6c-56f0d2898078 - - - - - -] Device tapfb98cdd2-ed cannot be used as it has no MAC address
Oct 14 10:04:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:11.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:11 np0005486759.ooo.test kernel: device tapfb98cdd2-ed entered promiscuous mode
Oct 14 10:04:11 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436251.6435] manager: (tapfb98cdd2-ed): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Oct 14 10:04:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:11.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:11Z|00159|binding|INFO|Claiming lport fb98cdd2-ed4e-467a-80a1-616076da157f for this chassis.
Oct 14 10:04:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:11Z|00160|binding|INFO|fb98cdd2-ed4e-467a-80a1-616076da157f: Claiming unknown
Oct 14 10:04:11 np0005486759.ooo.test systemd-udevd[318453]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:04:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:11.664 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-599a0d46-2351-4e3b-bc41-11d318205c41', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-599a0d46-2351-4e3b-bc41-11d318205c41', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d67eadad13e4ffa94c5f5aeae392709', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abd8882a-6c84-48c8-a79c-0dfe2ece50d9, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=fb98cdd2-ed4e-467a-80a1-616076da157f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:11.666 183328 INFO neutron.agent.ovn.metadata.agent [-] Port fb98cdd2-ed4e-467a-80a1-616076da157f in datapath 599a0d46-2351-4e3b-bc41-11d318205c41 bound to our chassis
Oct 14 10:04:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:11.669 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port a3b2bf2d-d38b-4de8-8a05-602c49d348b3 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:04:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:11.669 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 599a0d46-2351-4e3b-bc41-11d318205c41, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:04:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:11.671 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[779dcf27-dd7b-4ba4-92fb-444f75b065e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:11.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:11Z|00161|binding|INFO|Setting lport fb98cdd2-ed4e-467a-80a1-616076da157f ovn-installed in OVS
Oct 14 10:04:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:11Z|00162|binding|INFO|Setting lport fb98cdd2-ed4e-467a-80a1-616076da157f up in Southbound
Oct 14 10:04:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:11.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapfb98cdd2-ed: No such device
Oct 14 10:04:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:11.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:11.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:04:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:04:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:04:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:04:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:04:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16688 "" "Go-http-client/1.1"
Oct 14 10:04:12 np0005486759.ooo.test podman[318524]: 2025-10-14 10:04:12.600373035 +0000 UTC m=+0.086121628 container create 2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-599a0d46-2351-4e3b-bc41-11d318205c41, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:04:12 np0005486759.ooo.test systemd[1]: Started libpod-conmon-2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d.scope.
Oct 14 10:04:12 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:04:12 np0005486759.ooo.test podman[318524]: 2025-10-14 10:04:12.556238598 +0000 UTC m=+0.041987211 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:04:12 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c135b4f8de4cbea17c884ba8ede91c986381303c4fb38546eeb95685d0437ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:04:12 np0005486759.ooo.test podman[318524]: 2025-10-14 10:04:12.663767559 +0000 UTC m=+0.149516152 container init 2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-599a0d46-2351-4e3b-bc41-11d318205c41, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:04:12 np0005486759.ooo.test podman[318524]: 2025-10-14 10:04:12.675114155 +0000 UTC m=+0.160862738 container start 2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-599a0d46-2351-4e3b-bc41-11d318205c41, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq[318544]: started, version 2.85 cachesize 150
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq[318544]: DNS service limited to local subnets
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq[318544]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq[318544]: warning: no upstream servers configured
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq-dhcp[318544]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq[318544]: read /var/lib/neutron/dhcp/599a0d46-2351-4e3b-bc41-11d318205c41/addn_hosts - 0 addresses
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq-dhcp[318544]: read /var/lib/neutron/dhcp/599a0d46-2351-4e3b-bc41-11d318205c41/host
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq-dhcp[318544]: read /var/lib/neutron/dhcp/599a0d46-2351-4e3b-bc41-11d318205c41/opts
Oct 14 10:04:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:12.866 287366 INFO neutron.agent.dhcp.agent [None req-d20d9622-f8e7-4b31-9c38-f9d5b9831cf7 - - - - - -] DHCP configuration for ports {'ccd39842-de7f-4c55-b71b-ea1de312a182'} is completed
Oct 14 10:04:12 np0005486759.ooo.test dnsmasq[318544]: exiting on receipt of SIGTERM
Oct 14 10:04:12 np0005486759.ooo.test podman[318563]: 2025-10-14 10:04:12.96605575 +0000 UTC m=+0.057777533 container kill 2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-599a0d46-2351-4e3b-bc41-11d318205c41, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:04:12 np0005486759.ooo.test systemd[1]: libpod-2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d.scope: Deactivated successfully.
Oct 14 10:04:13 np0005486759.ooo.test podman[318576]: 2025-10-14 10:04:13.043342618 +0000 UTC m=+0.059563928 container died 2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-599a0d46-2351-4e3b-bc41-11d318205c41, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct 14 10:04:13 np0005486759.ooo.test podman[318576]: 2025-10-14 10:04:13.08013156 +0000 UTC m=+0.096352810 container cleanup 2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-599a0d46-2351-4e3b-bc41-11d318205c41, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 10:04:13 np0005486759.ooo.test systemd[1]: libpod-conmon-2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d.scope: Deactivated successfully.
Oct 14 10:04:13 np0005486759.ooo.test podman[318577]: 2025-10-14 10:04:13.108343611 +0000 UTC m=+0.121218309 container remove 2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-599a0d46-2351-4e3b-bc41-11d318205c41, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2)
Oct 14 10:04:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:13.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:13 np0005486759.ooo.test kernel: device tapfb98cdd2-ed left promiscuous mode
Oct 14 10:04:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:13Z|00163|binding|INFO|Releasing lport fb98cdd2-ed4e-467a-80a1-616076da157f from this chassis (sb_readonly=0)
Oct 14 10:04:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:13Z|00164|binding|INFO|Setting lport fb98cdd2-ed4e-467a-80a1-616076da157f down in Southbound
Oct 14 10:04:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:13.130 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-599a0d46-2351-4e3b-bc41-11d318205c41', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-599a0d46-2351-4e3b-bc41-11d318205c41', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d67eadad13e4ffa94c5f5aeae392709', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abd8882a-6c84-48c8-a79c-0dfe2ece50d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=fb98cdd2-ed4e-467a-80a1-616076da157f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:13.133 183328 INFO neutron.agent.ovn.metadata.agent [-] Port fb98cdd2-ed4e-467a-80a1-616076da157f in datapath 599a0d46-2351-4e3b-bc41-11d318205c41 unbound from our chassis
Oct 14 10:04:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:13.138 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 599a0d46-2351-4e3b-bc41-11d318205c41, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:04:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:13.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:13.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:13 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:13.139 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[cb287291-f78a-4b2c-9aa5-ecb91b5928b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:04:13 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:13.343 287366 INFO neutron.agent.dhcp.agent [None req-2981c76b-9a49-4d18-aca0-f22b5e71c7e1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:13 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:13.344 287366 INFO neutron.agent.dhcp.agent [None req-2981c76b-9a49-4d18-aca0-f22b5e71c7e1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:13 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:13.486 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-2c135b4f8de4cbea17c884ba8ede91c986381303c4fb38546eeb95685d0437ab-merged.mount: Deactivated successfully.
Oct 14 10:04:13 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a41ed1e2f59419ccbc2712ed0dccd6ff052eddff50beaa84c014a01be497d0d-userdata-shm.mount: Deactivated successfully.
Oct 14 10:04:13 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d599a0d46\x2d2351\x2d4e3b\x2dbc41\x2d11d318205c41.mount: Deactivated successfully.
Oct 14 10:04:13 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:13Z|00165|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:04:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:13.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:13.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:04:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:04:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:04:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:04:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:04:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:14.248 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:14.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: tmp-crun.Eqxtoj.mount: Deactivated successfully.
Oct 14 10:04:15 np0005486759.ooo.test podman[318605]: 2025-10-14 10:04:15.453498743 +0000 UTC m=+0.061774586 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:04:15 np0005486759.ooo.test podman[318605]: 2025-10-14 10:04:15.46093968 +0000 UTC m=+0.069215513 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: tmp-crun.2QsGP7.mount: Deactivated successfully.
Oct 14 10:04:15 np0005486759.ooo.test podman[318604]: 2025-10-14 10:04:15.473613977 +0000 UTC m=+0.082899190 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:04:15 np0005486759.ooo.test podman[318604]: 2025-10-14 10:04:15.480240809 +0000 UTC m=+0.089526022 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid)
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:04:15 np0005486759.ooo.test podman[318603]: 2025-10-14 10:04:15.53042885 +0000 UTC m=+0.140895549 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:04:15 np0005486759.ooo.test podman[318603]: 2025-10-14 10:04:15.566546442 +0000 UTC m=+0.177013161 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:04:15 np0005486759.ooo.test podman[318609]: 2025-10-14 10:04:15.584703965 +0000 UTC m=+0.184307263 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 10:04:15 np0005486759.ooo.test podman[318609]: 2025-10-14 10:04:15.599531128 +0000 UTC m=+0.199134496 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:04:15 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:04:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:16.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:16.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:16.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:16.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.206 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.207 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.207 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.207 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.318 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.394 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.396 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.469 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.470 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.545 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.546 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.588 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.760 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.762 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12266MB free_disk=386.6773490905762GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.762 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.763 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.849 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.850 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.851 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.901 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.916 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.919 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:04:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:17.919 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:04:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:18.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:18.921 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:18.922 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:04:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24141 DF PROTO=TCP SPT=36208 DPT=9102 SEQ=1171013918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA448A70000000001030307) 
Oct 14 10:04:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:19.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.192 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.193 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.193 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.452 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.452 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.453 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.453 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:04:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24142 DF PROTO=TCP SPT=36208 DPT=9102 SEQ=1171013918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA44CC10000000001030307) 
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.923 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.940 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:04:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:20.941 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:04:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:21.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:04:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:04:22 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:04:22 np0005486759.ooo.test podman[318697]: 2025-10-14 10:04:22.463164002 +0000 UTC m=+0.085616993 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Oct 14 10:04:22 np0005486759.ooo.test podman[318696]: 2025-10-14 10:04:22.512179297 +0000 UTC m=+0.138536427 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:04:22 np0005486759.ooo.test podman[318697]: 2025-10-14 10:04:22.527284738 +0000 UTC m=+0.149737709 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, version=9.6)
Oct 14 10:04:22 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:04:22 np0005486759.ooo.test podman[318696]: 2025-10-14 10:04:22.579388687 +0000 UTC m=+0.205745757 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 10:04:22 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:04:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24143 DF PROTO=TCP SPT=36208 DPT=9102 SEQ=1171013918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA454C20000000001030307) 
Oct 14 10:04:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:23.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.486 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 739626512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.486 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 60612298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86a954fa-a144-4168-9815-9338c625b058', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739626512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.454151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '29730510-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': 'de923055296967e558c93fee8e116caae076b25b6da4621ae2fd911dfcd4731e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60612298, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.454151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '29731af0-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': 'ca206d0db6199cf8e3664dfbd6ae4ce1d795880ff9484a6cece7d2f645cdd823'}]}, 'timestamp': '2025-10-14 10:04:24.487413', '_unique_id': '32c041f1104e4d9398fbd38d2f398608'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.488 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.490 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.494 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 8721 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26c19ba1-5151-4e43-b150-7c5d82d71d38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8721, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.490334', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '29743912-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': '880aa74fa9513e132a7eab234bd3771494402104eda14e573c334cb8a03c78c4'}]}, 'timestamp': '2025-10-14 10:04:24.494789', '_unique_id': '11b1d3c2e5c84cc59fa63a0dac4e76f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.496 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.497 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d0f56d5-713d-4453-8029-c2193dc534da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.497013', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '2974a38e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': '46b1ba72c43d51fab0a4b9acc6b03a59d59f538f272ce4f1eb1e38bd257ffaf1'}]}, 'timestamp': '2025-10-14 10:04:24.497475', '_unique_id': '2c93f59da41e4e1cbb59a04694eae563'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.498 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.499 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.499 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e207c320-f647-4af5-96a6-617e52d6fcc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.499832', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '297512ce-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': '5c21defd9874cc7ae24f8260ce28d7b584c3fb66f378ee3122179404a08bc6e6'}]}, 'timestamp': '2025-10-14 10:04:24.500326', '_unique_id': '118a715020d440369bc339c8db34f319'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.501 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.502 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.518 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31326208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.518 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1abc805b-40a7-43ce-a7b0-5e2ae82f9aec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31326208, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.502445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2977e1f2-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.697531848, 'message_signature': '7f9bfbb2dfa8d28dd09bface8333ccbe08d1a0e3eb0b4ff4bb4e7a08e003296b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 
'timestamp': '2025-10-14T10:04:24.502445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2977f5f2-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.697531848, 'message_signature': '7c08dd10d40b73ad1bf19a9e59352ae36759bdde3fa5ed29d8d5ab06ced9b0b8'}]}, 'timestamp': '2025-10-14 10:04:24.519230', '_unique_id': '85a5fc5b239747578624d39791d9ef3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.520 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.521 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.522 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '301183f0-7641-4d4c-bee0-4a89cd1270a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.521700', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '297867b2-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.697531848, 'message_signature': '0c93548eaa7abe6e3b48feba23b4d939db1070eeffb1690299779b7e480d82a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.521700', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '297878ec-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.697531848, 'message_signature': 'b9a75ab342edb149dfebbb074022c8e9d8ee1363ce1bc79fb1de05428cdd6806'}]}, 'timestamp': '2025-10-14 10:04:24.522569', '_unique_id': '62ad0c8a38ee4edf85a6b8e9934677d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.524 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b23c346-5e92-483a-adff-2ee3f5c7bc4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.524822', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '2978e34a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': 'f295b44cade7bfc714092674287a700ba4408a3d4e006c23a8f9688d3481df5f'}]}, 'timestamp': '2025-10-14 10:04:24.525321', '_unique_id': '2ed42751273541e79dd8f7d08173a602'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16c418cf-8173-41f6-b935-6024a52995e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.527452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '29794830-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': 'd9adc8331a0d3656f3480c9301f00be79a2bfdf2fa4138a376b2351d641b47df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.527452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '297959d8-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '046b5f38cc316dc44a878959a537a2d6347e0e94bcd5878b3940886263d6a793'}]}, 'timestamp': '2025-10-14 10:04:24.528331', '_unique_id': 'a44c78794687414eb0b4b73353e7267e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.530 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.547 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 51.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f0f7be-ca49-4344-a97b-dd4c03692166', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.8203125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:04:24.530461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '297c6470-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.742760588, 'message_signature': '247a1d365d4e44338c18ebe8c96d75fc985fe01d6611f176c21034db63210f9e'}]}, 'timestamp': '2025-10-14 10:04:24.548277', '_unique_id': '38cbe23a346f4fe6a6e6b9f90aa46ee5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.550 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 530 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '867789c1-e6d6-4434-8655-21ccfa2be7cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 530, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.550389', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '297cc848-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': 'b8c2e4f6423d135adb5a22d1006fedfbc888517dcfe2274505f54781e9302f7c'}]}, 'timestamp': '2025-10-14 10:04:24.550840', '_unique_id': '9127d120eac744c7ad1656c216689753'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.551 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.552 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.553 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b9e5be4-c709-4200-9d32-3846819493a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.552946', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '297d2e5a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': '654b15e287a0342e21e0819306aba36044f90c152022db778934fac3f9f34ed6'}]}, 'timestamp': '2025-10-14 10:04:24.553483', '_unique_id': 'b40a44dfc12e4f7fba3287a5ad90bdb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.554 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.555 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.555 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.556 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd24f8cd0-5c0a-4f45-a910-9e7bcaf5f9cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.555598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '297d9372-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.697531848, 'message_signature': '83acfd7f837502cafd48fb5250bba2f7acd6aaded9e25bfa505438c760e4d8d7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.555598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '297da4ca-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.697531848, 'message_signature': 'aa281b43c4574ec14c252c3b82a66c1feaad24051538a9533ba6af88285cedc9'}]}, 'timestamp': '2025-10-14 10:04:24.556456', '_unique_id': '5524b27d35ca444f9080bc74b77845d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.557 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.558 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.558 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5df54cb-a7f0-4e33-9d83-4e39b75dacf3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.558567', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '297e07a8-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': '43a3ab638b8a51cd23d910bb06562c395f535b70d5ec271d6df94548a8f10756'}]}, 'timestamp': '2025-10-14 10:04:24.559118', '_unique_id': 'ac842cad594b4ab7a46303897d39a179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.560 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.561 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.561 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.561 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ab97817-9811-475c-ad07-8ccc2ba5cec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.561241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '297e6ff4-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '1e150ae89e7404751a09464c5493a56283daaf97dc2c7a174d51f814de97253d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.561241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '297e7f9e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '53a3cdbd52143ee6c0757a5505c456a8df69985ae04952ed743d772dfa7e9ac4'}]}, 'timestamp': '2025-10-14 10:04:24.562094', '_unique_id': 'f37a3b341c86430f82664a06a593eac0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.564 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.564 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.565 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20fc1902-d31a-4ae4-bdf3-4937daaed217', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.565249', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '297f0d9c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': 'ba13509958dbe94e788ebcfa1ee959117f67a1a1066b6bba1890d8513e0e366e'}]}, 'timestamp': '2025-10-14 10:04:24.565734', '_unique_id': 'e7f8d6af7caf43fd8e707bff626741f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.567 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.568 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e28e0683-d2a2-479c-8546-d1165549d07d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.568047', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '297f7a20-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': '99e9fab150fcff751157c9e03e986c5c0a75a06a914f66173d2953c26748195b'}]}, 'timestamp': '2025-10-14 10:04:24.568512', '_unique_id': '8cdb17878a314276a8bfdf02a8678657'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.569 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.570 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd56b17ee-ae8f-48bb-902e-55b25d0f0a9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:04:24.570667', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '297fe082-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.685431888, 'message_signature': 'df37b93c8b0375c1f0310ae86f6dea2bcba5efd35e77c77b44385659bf09e64f'}]}, 'timestamp': '2025-10-14 10:04:24.571162', '_unique_id': '5161a331960e441fb170ea6ce986d3b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.572 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f09ec598-ee68-4181-9199-a74c5d969c14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.572827', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2980323a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '6559244b4f85bf2ae3ca86429401a59a8f22c490d9635336ee5af914e5637dff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.572827', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '29803c6c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '4ab501011e1224a119b809137b16fb760e38ef34dab8bc66acc832bf28232be1'}]}, 'timestamp': '2025-10-14 10:04:24.573373', '_unique_id': '16fd7a1429064c5f845ad61578cd95b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.573 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.574 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.574 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 12510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c4a8a10-fb23-4ecf-b006-0cc962c7c054', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12510000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:04:24.574695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '29807a42-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.742760588, 'message_signature': 'c763475417f9eeae9d52cf38437f0b78e12cba0fed3b7ae5c69d229e4309f8d5'}]}, 'timestamp': '2025-10-14 10:04:24.574987', '_unique_id': '4a4aa235805f4fdd8ab48ac4dc3b92e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.575 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.576 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.576 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.576 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '012f3113-f8ab-47ea-a42f-b5cb7fc42c5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.576321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2980b9f8-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '10257072d27bb17a9eb1c88978990cfae5ef1ebd0d79c436d6390937c978bb80'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.576321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2980c484-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': 'bd8a94c21ae5b713ad0c74a6428a51e478b12f5c037aae8f4e128953605a9087'}]}, 'timestamp': '2025-10-14 10:04:24.576864', '_unique_id': 'ad5a297f3568478786df7c1ea87671d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.577 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.578 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 67767064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.578 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 492064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acb87838-e540-46d1-a39e-84f503025d7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67767064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:04:24.578193', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '298102e6-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '72e7e8dd5bad16e397241804c1b8c7d4140633c8ee28092e5ed8cf36126be2e1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:04:24.578193', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '29810c46-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12284.649232724, 'message_signature': '607ca1a5f08189b12140576d0aca6f422684d2c413da19dabdcce891a7778158'}]}, 'timestamp': '2025-10-14 10:04:24.578691', '_unique_id': '7d81ba3628a34601a99d222edd710dac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:04:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:04:24.579 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:04:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:24.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:24.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:24.823 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:24 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:24.824 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:04:26 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:26.231 2 INFO neutron.agent.securitygroups_rpc [None req-0da36af9-1a94-4f4d-ad02-4b1f7dcf1efe 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24144 DF PROTO=TCP SPT=36208 DPT=9102 SEQ=1171013918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA464820000000001030307) 
Oct 14 10:04:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:27.742 2 INFO neutron.agent.securitygroups_rpc [None req-98814c74-3d3a-49a1-9918-31d102d49189 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:27.836 2 INFO neutron.agent.securitygroups_rpc [None req-98814c74-3d3a-49a1-9918-31d102d49189 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:28 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:28.283 2 INFO neutron.agent.securitygroups_rpc [None req-ead9a8a6-1d78-4598-8371-3b464ab06229 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:28.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:28 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:28.827 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:04:28 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:28.956 2 INFO neutron.agent.securitygroups_rpc [None req-65c3ad6a-31e2-41c8-9649-501567b35565 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:29.119 287366 INFO neutron.agent.linux.ip_lib [None req-b2a1b94c-3c2f-4866-82da-421a5163df33 - - - - - -] Device tap324b1115-ec cannot be used as it has no MAC address
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test kernel: device tap324b1115-ec entered promiscuous mode
Oct 14 10:04:29 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436269.1473] manager: (tap324b1115-ec): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:29Z|00166|binding|INFO|Claiming lport 324b1115-ec41-4f60-92a0-8b7a615d1261 for this chassis.
Oct 14 10:04:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:29Z|00167|binding|INFO|324b1115-ec41-4f60-92a0-8b7a615d1261: Claiming unknown
Oct 14 10:04:29 np0005486759.ooo.test systemd-udevd[318752]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.159 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3203d45687c04170a54fdfabd71a8e39', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7eb57b38-8f71-4944-a9ba-11c4d85c61f3, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=324b1115-ec41-4f60-92a0-8b7a615d1261) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.161 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 324b1115-ec41-4f60-92a0-8b7a615d1261 in datapath fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6 bound to our chassis
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.162 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.163 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec47c05-2701-435f-8712-f25a30032c30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:29Z|00168|binding|INFO|Setting lport 324b1115-ec41-4f60-92a0-8b7a615d1261 ovn-installed in OVS
Oct 14 10:04:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:29Z|00169|binding|INFO|Setting lport 324b1115-ec41-4f60-92a0-8b7a615d1261 up in Southbound
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap324b1115-ec: No such device
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.325 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port a9e3ae2d-17ed-4c8c-bde9-6aef8fe5546c with type ""
Oct 14 10:04:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:29Z|00170|binding|INFO|Removing iface tap324b1115-ec ovn-installed in OVS
Oct 14 10:04:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:29Z|00171|binding|INFO|Removing lport 324b1115-ec41-4f60-92a0-8b7a615d1261 ovn-installed in OVS
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.326 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3203d45687c04170a54fdfabd71a8e39', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7eb57b38-8f71-4944-a9ba-11c4d85c61f3, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=324b1115-ec41-4f60-92a0-8b7a615d1261) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.327 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 324b1115-ec41-4f60-92a0-8b7a615d1261 in datapath fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6 unbound from our chassis
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.329 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:04:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:29.330 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[e04163f9-e37c-42d2-bf15-ae43d6938ad5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:29.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:29Z|00172|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:04:29 np0005486759.ooo.test podman[318822]: 
Oct 14 10:04:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:30.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:30 np0005486759.ooo.test podman[318822]: 2025-10-14 10:04:30.005603703 +0000 UTC m=+0.093910316 container create 86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:04:30 np0005486759.ooo.test systemd[1]: Started libpod-conmon-86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e.scope.
Oct 14 10:04:30 np0005486759.ooo.test podman[318822]: 2025-10-14 10:04:29.959346692 +0000 UTC m=+0.047653355 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:04:30 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:04:30 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7649fd7a2f2ed4e900f28b502cf087512e7b14ebd121e131cf1ab8fa7fc738d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:04:30 np0005486759.ooo.test podman[318822]: 2025-10-14 10:04:30.085856712 +0000 UTC m=+0.174163365 container init 86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 14 10:04:30 np0005486759.ooo.test podman[318822]: 2025-10-14 10:04:30.095213487 +0000 UTC m=+0.183520110 container start 86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq[318840]: started, version 2.85 cachesize 150
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq[318840]: DNS service limited to local subnets
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq[318840]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq[318840]: warning: no upstream servers configured
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq-dhcp[318840]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq[318840]: read /var/lib/neutron/dhcp/fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6/addn_hosts - 0 addresses
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq-dhcp[318840]: read /var/lib/neutron/dhcp/fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6/host
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq-dhcp[318840]: read /var/lib/neutron/dhcp/fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6/opts
Oct 14 10:04:30 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:30.194 287366 INFO neutron.agent.dhcp.agent [None req-6a600c4f-ceef-4272-a69e-d77f7083c826 - - - - - -] DHCP configuration for ports {'85463577-ff1a-440b-9816-c0b6d164140e'} is completed
Oct 14 10:04:30 np0005486759.ooo.test dnsmasq[318840]: exiting on receipt of SIGTERM
Oct 14 10:04:30 np0005486759.ooo.test podman[318857]: 2025-10-14 10:04:30.332480975 +0000 UTC m=+0.058138115 container kill 86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2)
Oct 14 10:04:30 np0005486759.ooo.test systemd[1]: libpod-86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e.scope: Deactivated successfully.
Oct 14 10:04:30 np0005486759.ooo.test podman[318871]: 2025-10-14 10:04:30.418379325 +0000 UTC m=+0.067170990 container died 86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Oct 14 10:04:30 np0005486759.ooo.test podman[318871]: 2025-10-14 10:04:30.447432602 +0000 UTC m=+0.096224197 container cleanup 86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:04:30 np0005486759.ooo.test systemd[1]: libpod-conmon-86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e.scope: Deactivated successfully.
Oct 14 10:04:30 np0005486759.ooo.test podman[318870]: 2025-10-14 10:04:30.498774558 +0000 UTC m=+0.143900501 container remove 86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fe44bc5a-7c97-4d06-a30d-f2e3bdc477a6, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:04:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:30.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:30 np0005486759.ooo.test kernel: device tap324b1115-ec left promiscuous mode
Oct 14 10:04:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:30.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:30 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:30.547 287366 INFO neutron.agent.dhcp.agent [None req-a96f2a30-9301-4391-b19c-bc78f6821281 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:30 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:30.548 287366 INFO neutron.agent.dhcp.agent [None req-a96f2a30-9301-4391-b19c-bc78f6821281 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-7649fd7a2f2ed4e900f28b502cf087512e7b14ebd121e131cf1ab8fa7fc738d7-merged.mount: Deactivated successfully.
Oct 14 10:04:31 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86672b66c6459ac7b6ff335d83044b219538c6a3578574a1c987d1a9a6c6838e-userdata-shm.mount: Deactivated successfully.
Oct 14 10:04:31 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2dfe44bc5a\x2d7c97\x2d4d06\x2da30d\x2df2e3bdc477a6.mount: Deactivated successfully.
Oct 14 10:04:31 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:31.618 2 INFO neutron.agent.securitygroups_rpc [None req-0dbfe634-13d0-42d7-a5a4-cd7a3206b8d9 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:04:32 np0005486759.ooo.test podman[318900]: 2025-10-14 10:04:32.18769751 +0000 UTC m=+0.067667275 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:04:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:32.188 287366 INFO neutron.agent.linux.ip_lib [None req-565901e0-7be1-49f3-b954-bfe02e56f518 - - - - - -] Device tapdaf94f7c-f7 cannot be used as it has no MAC address
Oct 14 10:04:32 np0005486759.ooo.test podman[318900]: 2025-10-14 10:04:32.195465028 +0000 UTC m=+0.075434733 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:04:32 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:04:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:32.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:32 np0005486759.ooo.test kernel: device tapdaf94f7c-f7 entered promiscuous mode
Oct 14 10:04:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436272.2223] manager: (tapdaf94f7c-f7): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Oct 14 10:04:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:32.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:32Z|00173|binding|INFO|Claiming lport daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd for this chassis.
Oct 14 10:04:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:32Z|00174|binding|INFO|daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd: Claiming unknown
Oct 14 10:04:32 np0005486759.ooo.test systemd-udevd[318926]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:04:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:32.244 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-8a9b3b34-0d2b-42f6-b88b-53af8b93193e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a9b3b34-0d2b-42f6-b88b-53af8b93193e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=81bedf40-fa69-4533-beb4-3b97fc8d693d, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:32.245 183328 INFO neutron.agent.ovn.metadata.agent [-] Port daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd in datapath 8a9b3b34-0d2b-42f6-b88b-53af8b93193e bound to our chassis
Oct 14 10:04:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:32.247 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3ebd6f3c-d42c-4004-a4cc-d6f528f33814 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:04:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:32.247 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8a9b3b34-0d2b-42f6-b88b-53af8b93193e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:04:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:32.248 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[e9822d7f-35f7-4443-b7ca-59e17127ea3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:04:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:32.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:32Z|00175|binding|INFO|Setting lport daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd ovn-installed in OVS
Oct 14 10:04:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:32Z|00176|binding|INFO|Setting lport daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd up in Southbound
Oct 14 10:04:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:32.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:32.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:32.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:32.549 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:32 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:32.580 2 INFO neutron.agent.securitygroups_rpc [None req-0acc6730-e2c7-4eb5-bf4b-93d4dcc9092a 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:32 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:32.751 2 INFO neutron.agent.securitygroups_rpc [None req-7f92b218-bebd-4077-9804-9ee2eae5e9fc a886231d667845929009e5bc8dbf5bb5 185a1baaea5d41c28b11ccd2d8e01a42 - - default default] Security group member updated ['d90586f5-59e6-468e-901a-5d8950f6bc5c']
Oct 14 10:04:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:32.790 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:04:32Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6971c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6975b0>], id=b2d06deb-eab1-40e9-b38a-fcbe830b8699, ip_allocation=immediate, mac_address=fa:16:3e:6b:f5:b2, name=tempest-RoutersAdminNegativeIpV6Test-627198645, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=True, project_id=185a1baaea5d41c28b11ccd2d8e01a42, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d90586f5-59e6-468e-901a-5d8950f6bc5c'], standard_attr_id=1265, status=DOWN, tags=[], tenant_id=185a1baaea5d41c28b11ccd2d8e01a42, updated_at=2025-10-14T10:04:32Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:04:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:32.971 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:04:33 np0005486759.ooo.test podman[318980]: 2025-10-14 10:04:33.023353413 +0000 UTC m=+0.068330785 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 10:04:33 np0005486759.ooo.test podman[319007]: 2025-10-14 10:04:33.080508797 +0000 UTC m=+0.074236966 container create 7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9b3b34-0d2b-42f6-b88b-53af8b93193e, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:04:33 np0005486759.ooo.test systemd[1]: Started libpod-conmon-7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9.scope.
Oct 14 10:04:33 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:04:33 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/400502e575271542ad75211912be0de106b12457057d678120db36088773dbd7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:04:33 np0005486759.ooo.test podman[319007]: 2025-10-14 10:04:33.134522394 +0000 UTC m=+0.128250573 container init 7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9b3b34-0d2b-42f6-b88b-53af8b93193e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:04:33 np0005486759.ooo.test podman[319007]: 2025-10-14 10:04:33.141477296 +0000 UTC m=+0.135205475 container start 7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9b3b34-0d2b-42f6-b88b-53af8b93193e, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:04:33 np0005486759.ooo.test podman[319007]: 2025-10-14 10:04:33.047073357 +0000 UTC m=+0.040801606 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[319037]: started, version 2.85 cachesize 150
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[319037]: DNS service limited to local subnets
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[319037]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[319037]: warning: no upstream servers configured
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq-dhcp[319037]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[319037]: read /var/lib/neutron/dhcp/8a9b3b34-0d2b-42f6-b88b-53af8b93193e/addn_hosts - 0 addresses
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq-dhcp[319037]: read /var/lib/neutron/dhcp/8a9b3b34-0d2b-42f6-b88b-53af8b93193e/host
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq-dhcp[319037]: read /var/lib/neutron/dhcp/8a9b3b34-0d2b-42f6-b88b-53af8b93193e/opts
Oct 14 10:04:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:33.193 287366 INFO neutron.agent.dhcp.agent [None req-3e0714c1-9e53-4ca7-8254-78a158c8ea90 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:04:31Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec66eb20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec815130>], id=b68a0ce3-c157-4149-b408-31659449aa09, ip_allocation=immediate, mac_address=fa:16:3e:bd:bc:aa, name=tempest-PortsTestJSON-1673259646, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:04:29Z, description=, dns_domain=, id=8a9b3b34-0d2b-42f6-b88b-53af8b93193e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-2138161854, port_security_enabled=True, project_id=141e57a7b7e34017814ba8c1818c2cfc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32316, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1234, status=ACTIVE, subnets=['a583da1c-0c81-4dbf-b946-64282c061f91'], tags=[], tenant_id=141e57a7b7e34017814ba8c1818c2cfc, updated_at=2025-10-14T10:04:30Z, vlan_transparent=None, network_id=8a9b3b34-0d2b-42f6-b88b-53af8b93193e, port_security_enabled=True, project_id=141e57a7b7e34017814ba8c1818c2cfc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0'], standard_attr_id=1251, status=DOWN, tags=[], tenant_id=141e57a7b7e34017814ba8c1818c2cfc, updated_at=2025-10-14T10:04:31Z on network 8a9b3b34-0d2b-42f6-b88b-53af8b93193e
Oct 14 10:04:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:33.328 287366 INFO neutron.agent.dhcp.agent [None req-4805fe75-ba83-4672-be31-793948a83676 - - - - - -] DHCP configuration for ports {'3ef12cdc-026a-430c-a781-3427ea20a2a2', 'b2d06deb-eab1-40e9-b38a-fcbe830b8699'} is completed
Oct 14 10:04:33 np0005486759.ooo.test podman[319056]: 2025-10-14 10:04:33.413563257 +0000 UTC m=+0.058842076 container kill 7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9b3b34-0d2b-42f6-b88b-53af8b93193e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[319037]: read /var/lib/neutron/dhcp/8a9b3b34-0d2b-42f6-b88b-53af8b93193e/addn_hosts - 1 addresses
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq-dhcp[319037]: read /var/lib/neutron/dhcp/8a9b3b34-0d2b-42f6-b88b-53af8b93193e/host
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq-dhcp[319037]: read /var/lib/neutron/dhcp/8a9b3b34-0d2b-42f6-b88b-53af8b93193e/opts
Oct 14 10:04:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:33.645 287366 INFO neutron.agent.dhcp.agent [None req-3b641b9b-8b8a-4886-836c-4c59e177531e - - - - - -] DHCP configuration for ports {'b68a0ce3-c157-4149-b408-31659449aa09'} is completed
Oct 14 10:04:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:33.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:33 np0005486759.ooo.test dnsmasq[319037]: exiting on receipt of SIGTERM
Oct 14 10:04:33 np0005486759.ooo.test podman[319095]: 2025-10-14 10:04:33.819756549 +0000 UTC m=+0.062271312 container kill 7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9b3b34-0d2b-42f6-b88b-53af8b93193e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:04:33 np0005486759.ooo.test systemd[1]: libpod-7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9.scope: Deactivated successfully.
Oct 14 10:04:33 np0005486759.ooo.test podman[319111]: 2025-10-14 10:04:33.891119245 +0000 UTC m=+0.053638167 container died 7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9b3b34-0d2b-42f6-b88b-53af8b93193e, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 14 10:04:33 np0005486759.ooo.test podman[319111]: 2025-10-14 10:04:33.931831978 +0000 UTC m=+0.094350870 container remove 7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9b3b34-0d2b-42f6-b88b-53af8b93193e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:04:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:33.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:33 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:33Z|00177|binding|INFO|Releasing lport daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd from this chassis (sb_readonly=0)
Oct 14 10:04:33 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:33Z|00178|binding|INFO|Setting lport daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd down in Southbound
Oct 14 10:04:33 np0005486759.ooo.test kernel: device tapdaf94f7c-f7 left promiscuous mode
Oct 14 10:04:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:33.962 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-8a9b3b34-0d2b-42f6-b88b-53af8b93193e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a9b3b34-0d2b-42f6-b88b-53af8b93193e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=81bedf40-fa69-4533-beb4-3b97fc8d693d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:33.963 183328 INFO neutron.agent.ovn.metadata.agent [-] Port daf94f7c-f7a4-43d5-8f9f-2ab83590b8dd in datapath 8a9b3b34-0d2b-42f6-b88b-53af8b93193e unbound from our chassis
Oct 14 10:04:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:33.967 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8a9b3b34-0d2b-42f6-b88b-53af8b93193e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:04:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:33.968 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[525cd13c-4ffa-4b94-beb5-a37139dc5727]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:04:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:33.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:33 np0005486759.ooo.test systemd[1]: libpod-conmon-7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9.scope: Deactivated successfully.
Oct 14 10:04:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:34.002 287366 INFO neutron.agent.dhcp.agent [None req-b82911c0-cb4d-4804-a094-b44fdc1fce65 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:34.027 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-400502e575271542ad75211912be0de106b12457057d678120db36088773dbd7-merged.mount: Deactivated successfully.
Oct 14 10:04:34 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7da953889f3b6ccac5938341163346f4879b8bf7a634635740779dac887b84e9-userdata-shm.mount: Deactivated successfully.
Oct 14 10:04:34 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d8a9b3b34\x2d0d2b\x2d42f6\x2db88b\x2d53af8b93193e.mount: Deactivated successfully.
Oct 14 10:04:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:34.574 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:34 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:34.741 2 INFO neutron.agent.securitygroups_rpc [None req-34587eb5-3a4b-4f78-ac0f-f20b8cf1c6a2 a886231d667845929009e5bc8dbf5bb5 185a1baaea5d41c28b11ccd2d8e01a42 - - default default] Security group member updated ['d90586f5-59e6-468e-901a-5d8950f6bc5c']
Oct 14 10:04:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:34.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:34.920 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:34 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:04:34Z|00179|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:04:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:34.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:35 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:04:35 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:04:35 np0005486759.ooo.test podman[319149]: 2025-10-14 10:04:35.025158741 +0000 UTC m=+0.074642259 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:04:35 np0005486759.ooo.test systemd[1]: tmp-crun.otLOVt.mount: Deactivated successfully.
Oct 14 10:04:35 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:04:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:04:35 np0005486759.ooo.test systemd[1]: tmp-crun.1ETJ27.mount: Deactivated successfully.
Oct 14 10:04:35 np0005486759.ooo.test podman[319162]: 2025-10-14 10:04:35.165977206 +0000 UTC m=+0.093821373 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:04:35 np0005486759.ooo.test podman[319162]: 2025-10-14 10:04:35.179446507 +0000 UTC m=+0.107290694 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:04:35 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:04:36 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:36.921 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:38.648 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:38.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:39.109 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:39.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:40 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:40.735 2 INFO neutron.agent.securitygroups_rpc [None req-61a01490-148f-4191-b13c-989df9ea0628 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:41 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:41.030 2 INFO neutron.agent.securitygroups_rpc [None req-2bd246b9-fd01-446f-bcb3-0434cd693311 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:41 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:41.557 2 INFO neutron.agent.securitygroups_rpc [None req-286bc61f-8d0c-4694-8b41-50699603d38d 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:42 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:42.012 2 INFO neutron.agent.securitygroups_rpc [None req-e70409f8-5bb7-4beb-a043-bf6b8911f599 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:04:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:04:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:04:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:04:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:04:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16691 "" "Go-http-client/1.1"
Oct 14 10:04:42 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:42.438 2 INFO neutron.agent.securitygroups_rpc [None req-075270c1-c9fe-4ea9-a506-ff3122662a15 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:43 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:43.264 2 INFO neutron.agent.securitygroups_rpc [None req-24fc2099-d686-46a8-83cb-f4a88dcfa2ca 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:43.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:04:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:04:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:04:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:04:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:04:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:04:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:04:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:04:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:44.751 287366 INFO neutron.agent.linux.ip_lib [None req-368b5f33-c1e7-412f-8cc1-c4d72e0e6ad5 - - - - - -] Device tapdd89fd78-30 cannot be used as it has no MAC address
Oct 14 10:04:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:44.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:44.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:44 np0005486759.ooo.test kernel: device tapdd89fd78-30 entered promiscuous mode
Oct 14 10:04:44 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436284.8946] manager: (tapdd89fd78-30): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Oct 14 10:04:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:44.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:44 np0005486759.ooo.test systemd-udevd[319203]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:04:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:44.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapdd89fd78-30: No such device
Oct 14 10:04:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:44.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:44.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:45 np0005486759.ooo.test podman[319272]: 
Oct 14 10:04:45 np0005486759.ooo.test podman[319272]: 2025-10-14 10:04:45.731695958 +0000 UTC m=+0.090394269 container create de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deda3578-ad2e-4429-9bd7-b32e7604b8fa, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 10:04:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:04:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:04:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:04:45 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:04:45 np0005486759.ooo.test systemd[1]: Started libpod-conmon-de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c.scope.
Oct 14 10:04:45 np0005486759.ooo.test podman[319272]: 2025-10-14 10:04:45.689892543 +0000 UTC m=+0.048590854 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:04:45 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:04:45 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f602b56395827d8346d22a8c8217eb078eaee7cbc4eee02b6f02cf157956beca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:04:45 np0005486759.ooo.test podman[319272]: 2025-10-14 10:04:45.834404381 +0000 UTC m=+0.193102702 container init de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deda3578-ad2e-4429-9bd7-b32e7604b8fa, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq[319331]: started, version 2.85 cachesize 150
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq[319331]: DNS service limited to local subnets
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq[319331]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq[319331]: warning: no upstream servers configured
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq-dhcp[319331]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq[319331]: read /var/lib/neutron/dhcp/deda3578-ad2e-4429-9bd7-b32e7604b8fa/addn_hosts - 0 addresses
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq-dhcp[319331]: read /var/lib/neutron/dhcp/deda3578-ad2e-4429-9bd7-b32e7604b8fa/host
Oct 14 10:04:45 np0005486759.ooo.test dnsmasq-dhcp[319331]: read /var/lib/neutron/dhcp/deda3578-ad2e-4429-9bd7-b32e7604b8fa/opts
Oct 14 10:04:45 np0005486759.ooo.test podman[319286]: 2025-10-14 10:04:45.871630477 +0000 UTC m=+0.096606509 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:04:45 np0005486759.ooo.test podman[319272]: 2025-10-14 10:04:45.896568478 +0000 UTC m=+0.255266789 container start de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deda3578-ad2e-4429-9bd7-b32e7604b8fa, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 10:04:45 np0005486759.ooo.test podman[319286]: 2025-10-14 10:04:45.903168249 +0000 UTC m=+0.128144271 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:04:45 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:04:45 np0005486759.ooo.test podman[319287]: 2025-10-14 10:04:45.974810524 +0000 UTC m=+0.194990729 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:04:46 np0005486759.ooo.test podman[319295]: 2025-10-14 10:04:46.045181271 +0000 UTC m=+0.256775234 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 14 10:04:46 np0005486759.ooo.test podman[319295]: 2025-10-14 10:04:46.056009201 +0000 UTC m=+0.267603164 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:46.070 287366 INFO neutron.agent.dhcp.agent [None req-1a1e9aa7-bf16-426c-9dda-73fa53b33721 - - - - - -] DHCP configuration for ports {'fd36ebb5-6926-4a05-9ec8-3126faca48d0'} is completed
Oct 14 10:04:46 np0005486759.ooo.test podman[319288]: 2025-10-14 10:04:46.092244326 +0000 UTC m=+0.308380158 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:04:46 np0005486759.ooo.test podman[319287]: 2025-10-14 10:04:46.11203613 +0000 UTC m=+0.332216335 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test podman[319288]: 2025-10-14 10:04:46.129166714 +0000 UTC m=+0.345302516 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test dnsmasq[319331]: exiting on receipt of SIGTERM
Oct 14 10:04:46 np0005486759.ooo.test podman[319384]: 2025-10-14 10:04:46.189943427 +0000 UTC m=+0.057860275 container kill de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deda3578-ad2e-4429-9bd7-b32e7604b8fa, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: libpod-de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c.scope: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test podman[319396]: 2025-10-14 10:04:46.256991242 +0000 UTC m=+0.056422932 container died de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deda3578-ad2e-4429-9bd7-b32e7604b8fa, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:04:46 np0005486759.ooo.test podman[319396]: 2025-10-14 10:04:46.286653968 +0000 UTC m=+0.086085618 container cleanup de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deda3578-ad2e-4429-9bd7-b32e7604b8fa, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: libpod-conmon-de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c.scope: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test podman[319403]: 2025-10-14 10:04:46.347234485 +0000 UTC m=+0.131784061 container remove de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deda3578-ad2e-4429-9bd7-b32e7604b8fa, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:04:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:46.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:46 np0005486759.ooo.test kernel: device tapdd89fd78-30 left promiscuous mode
Oct 14 10:04:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:46.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:46.392 287366 INFO neutron.agent.dhcp.agent [None req-8bd18ac4-decf-49a3-a45d-f21d70418c5e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:46.393 287366 INFO neutron.agent.dhcp.agent [None req-8bd18ac4-decf-49a3-a45d-f21d70418c5e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f602b56395827d8346d22a8c8217eb078eaee7cbc4eee02b6f02cf157956beca-merged.mount: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de61d4d76f014a51ecfa8c332871adf228851b63bee3d5d3df522b40ee6e171c-userdata-shm.mount: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2ddeda3578\x2dad2e\x2d4429\x2d9bd7\x2db32e7604b8fa.mount: Deactivated successfully.
Oct 14 10:04:46 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:46.893 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:67:4c 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-fb3fea1f-ea45-42e2-beeb-0d9bad11d862', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb3fea1f-ea45-42e2-beeb-0d9bad11d862', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e53eec1-dc20-46c5-a288-cd531930d32d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=466963da-fc59-493e-89b5-c48daf9704d7) old=Port_Binding(mac=['fa:16:3e:ab:67:4c 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-fb3fea1f-ea45-42e2-beeb-0d9bad11d862', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb3fea1f-ea45-42e2-beeb-0d9bad11d862', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:04:46 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:46.895 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 466963da-fc59-493e-89b5-c48daf9704d7 in datapath fb3fea1f-ea45-42e2-beeb-0d9bad11d862 updated
Oct 14 10:04:46 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:46.898 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb3fea1f-ea45-42e2-beeb-0d9bad11d862, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:04:46 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:46.899 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d5943de7-fec8-4931-94d6-e43d3bd88a0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:04:47 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:47.828 2 INFO neutron.agent.securitygroups_rpc [None req-aade48f4-115b-49b1-8e6f-715fc5d9a701 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:48 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:48.488 2 INFO neutron.agent.securitygroups_rpc [None req-2ae9a137-479c-46cc-906b-364c53b1d093 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:48.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:49 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:49.225 2 INFO neutron.agent.securitygroups_rpc [None req-682039a8-7be9-4679-a900-76e3b5c58838 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8395 DF PROTO=TCP SPT=36024 DPT=9102 SEQ=1020373158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA4BDD80000000001030307) 
Oct 14 10:04:49 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:49.833 2 INFO neutron.agent.securitygroups_rpc [None req-14b9f620-66f8-469b-8952-913121a5092a 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:49 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:49.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8396 DF PROTO=TCP SPT=36024 DPT=9102 SEQ=1020373158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA4C1C20000000001030307) 
Oct 14 10:04:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8397 DF PROTO=TCP SPT=36024 DPT=9102 SEQ=1020373158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA4C9C20000000001030307) 
Oct 14 10:04:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:04:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:04:53 np0005486759.ooo.test systemd[1]: tmp-crun.cdfliq.mount: Deactivated successfully.
Oct 14 10:04:53 np0005486759.ooo.test podman[319424]: 2025-10-14 10:04:53.463163026 +0000 UTC m=+0.091561175 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 14 10:04:53 np0005486759.ooo.test podman[319425]: 2025-10-14 10:04:53.537671878 +0000 UTC m=+0.162132707 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Oct 14 10:04:53 np0005486759.ooo.test podman[319425]: 2025-10-14 10:04:53.552453619 +0000 UTC m=+0.176914498 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, release=1755695350, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal)
Oct 14 10:04:53 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:04:53 np0005486759.ooo.test podman[319424]: 2025-10-14 10:04:53.574431469 +0000 UTC m=+0.202829588 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:04:53 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:04:53 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:53.649 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:04:53Z, description=, device_id=b3cf1c6f-8e09-43c2-b037-a25c5e4009c8, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec799e20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f2310>], id=03e9c01a-98c4-4e17-809c-6b26cca3ab64, ip_allocation=immediate, mac_address=fa:16:3e:42:33:6c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1404, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:04:53Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:04:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:53.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:53 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:04:53 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:04:53 np0005486759.ooo.test podman[319485]: 2025-10-14 10:04:53.833767331 +0000 UTC m=+0.043245931 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:04:53 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:04:54 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:04:54.164 287366 INFO neutron.agent.dhcp.agent [None req-004e4e4e-5299-4ce6-a15b-8292d99152a3 - - - - - -] DHCP configuration for ports {'03e9c01a-98c4-4e17-809c-6b26cca3ab64'} is completed
Oct 14 10:04:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:54.171 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:04:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:54.172 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:04:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:04:54.173 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:04:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:54.498 2 INFO neutron.agent.securitygroups_rpc [None req-60f16072-ed8d-49de-acb7-b5abb1792c86 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:54.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:55 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:55.286 2 INFO neutron.agent.securitygroups_rpc [None req-c8b1b587-2f84-4440-883c-b94049ab2964 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:56 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:56.357 2 INFO neutron.agent.securitygroups_rpc [None req-a470f81b-ba5b-4c39-aa88-a205a989f851 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8398 DF PROTO=TCP SPT=36024 DPT=9102 SEQ=1020373158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA4D9810000000001030307) 
Oct 14 10:04:57 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:57.478 2 INFO neutron.agent.securitygroups_rpc [None req-9e1d0ef0-e61f-4501-8767-60f144361c57 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:04:58 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:04:58.007 2 INFO neutron.agent.securitygroups_rpc [None req-ae609f6d-407b-4135-a505-0ae2cab5ce79 74733912d6bf4434826a3650f9edf486 0588d65ccb8d4501875c5932df64f1b4 - - default default] Security group member updated ['3e97318c-b0bc-4691-8990-c6a08e03a17c']
Oct 14 10:04:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:58.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:04:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:04:59.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:01.199 287366 INFO neutron.agent.linux.ip_lib [None req-efe78b44-ae4c-4a8d-9783-32c128cf72ad - - - - - -] Device tap618b87a3-dc cannot be used as it has no MAC address
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test kernel: device tap618b87a3-dc entered promiscuous mode
Oct 14 10:05:01 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436301.2351] manager: (tap618b87a3-dc): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00180|binding|INFO|Claiming lport 618b87a3-dc30-464b-9e4a-60fd70a7b78b for this chassis.
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00181|binding|INFO|618b87a3-dc30-464b-9e4a-60fd70a7b78b: Claiming unknown
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test systemd-udevd[319518]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.249 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-7f12e3b8-8114-4393-8ed9-067020f7af99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f12e3b8-8114-4393-8ed9-067020f7af99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3203d45687c04170a54fdfabd71a8e39', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a979bb7-9a56-4417-ae69-43358c723e9d, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=618b87a3-dc30-464b-9e4a-60fd70a7b78b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.250 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 618b87a3-dc30-464b-9e4a-60fd70a7b78b in datapath 7f12e3b8-8114-4393-8ed9-067020f7af99 bound to our chassis
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.255 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port cc20ae5e-2e57-4cb9-bc17-897fcbd938cc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.255 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f12e3b8-8114-4393-8ed9-067020f7af99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.256 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[0f79823d-5cfe-4c2a-a069-ecf746f1d3a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00182|binding|INFO|Setting lport 618b87a3-dc30-464b-9e4a-60fd70a7b78b ovn-installed in OVS
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00183|binding|INFO|Setting lport 618b87a3-dc30-464b-9e4a-60fd70a7b78b up in Southbound
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap618b87a3-dc: No such device
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:01.550 287366 INFO neutron.agent.linux.ip_lib [None req-45d91047-5a86-48e8-9650-1619b07ac3b8 - - - - - -] Device tap3cb74b95-14 cannot be used as it has no MAC address
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test kernel: device tap3cb74b95-14 entered promiscuous mode
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test systemd-udevd[319520]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:05:01 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436301.5933] manager: (tap3cb74b95-14): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00184|binding|INFO|Claiming lport 3cb74b95-14ff-4b93-88f8-d9624644f87d for this chassis.
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00185|binding|INFO|3cb74b95-14ff-4b93-88f8-d9624644f87d: Claiming unknown
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00186|binding|INFO|Setting lport 3cb74b95-14ff-4b93-88f8-d9624644f87d ovn-installed in OVS
Oct 14 10:05:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:01Z|00187|binding|INFO|Setting lport 3cb74b95-14ff-4b93-88f8-d9624644f87d up in Southbound
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.617 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-7ea77a3d-989f-4274-94fe-ef7044ee08e6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ea77a3d-989f-4274-94fe-ef7044ee08e6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=faaef901-0fe0-41db-bcbc-ae02f3c74529, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=3cb74b95-14ff-4b93-88f8-d9624644f87d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.622 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 3cb74b95-14ff-4b93-88f8-d9624644f87d in datapath 7ea77a3d-989f-4274-94fe-ef7044ee08e6 bound to our chassis
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.625 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7ea77a3d-989f-4274-94fe-ef7044ee08e6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:05:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:01.626 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a61462d7-c399-4124-9746-121bcb734ca3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:01.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:01 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:01.760 2 INFO neutron.agent.securitygroups_rpc [None req-61d9519d-6695-49a0-897f-d5d505feacd5 74733912d6bf4434826a3650f9edf486 0588d65ccb8d4501875c5932df64f1b4 - - default default] Security group member updated ['3e97318c-b0bc-4691-8990-c6a08e03a17c']
Oct 14 10:05:02 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:02.099 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port cc20ae5e-2e57-4cb9-bc17-897fcbd938cc with type ""
Oct 14 10:05:02 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:02Z|00188|binding|INFO|Removing iface tap618b87a3-dc ovn-installed in OVS
Oct 14 10:05:02 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:02Z|00189|binding|INFO|Removing lport 618b87a3-dc30-464b-9e4a-60fd70a7b78b ovn-installed in OVS
Oct 14 10:05:02 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:02.100 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-7f12e3b8-8114-4393-8ed9-067020f7af99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f12e3b8-8114-4393-8ed9-067020f7af99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3203d45687c04170a54fdfabd71a8e39', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a979bb7-9a56-4417-ae69-43358c723e9d, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=618b87a3-dc30-464b-9e4a-60fd70a7b78b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:02.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:02 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:02.103 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 618b87a3-dc30-464b-9e4a-60fd70a7b78b in datapath 7f12e3b8-8114-4393-8ed9-067020f7af99 unbound from our chassis
Oct 14 10:05:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:02.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:02 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:02.108 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f12e3b8-8114-4393-8ed9-067020f7af99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:02 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:02.108 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a51413fa-0fed-4ec3-b9ff-374183245988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:02 np0005486759.ooo.test podman[319625]: 
Oct 14 10:05:02 np0005486759.ooo.test podman[319625]: 2025-10-14 10:05:02.221978494 +0000 UTC m=+0.091389758 container create 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 10:05:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:05:02 np0005486759.ooo.test systemd[1]: Started libpod-conmon-9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883.scope.
Oct 14 10:05:02 np0005486759.ooo.test podman[319625]: 2025-10-14 10:05:02.176709634 +0000 UTC m=+0.046120928 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:05:02 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:05:02 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fe601c47b62bc937168b3db6cf6ef7785f685ba15f5e713ca90033cf8fe24d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:05:02 np0005486759.ooo.test podman[319625]: 2025-10-14 10:05:02.297695884 +0000 UTC m=+0.167107148 container init 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319655]: started, version 2.85 cachesize 150
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319655]: DNS service limited to local subnets
Oct 14 10:05:02 np0005486759.ooo.test podman[319625]: 2025-10-14 10:05:02.311731772 +0000 UTC m=+0.181143036 container start 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319655]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319655]: warning: no upstream servers configured
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319655]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/addn_hosts - 0 addresses
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/host
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/opts
Oct 14 10:05:02 np0005486759.ooo.test podman[319643]: 2025-10-14 10:05:02.364147342 +0000 UTC m=+0.094872886 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 14 10:05:02 np0005486759.ooo.test podman[319643]: 2025-10-14 10:05:02.396473437 +0000 UTC m=+0.127198961 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:02 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:05:02 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:02.509 287366 INFO neutron.agent.dhcp.agent [None req-b70917a2-d869-432c-b1ac-e62a79aed7cd - - - - - -] DHCP configuration for ports {'2df442ce-01f7-466e-a789-b3f6009654b0'} is completed
Oct 14 10:05:02 np0005486759.ooo.test podman[319701]: 
Oct 14 10:05:02 np0005486759.ooo.test podman[319701]: 2025-10-14 10:05:02.673617752 +0000 UTC m=+0.101098545 container create a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:02 np0005486759.ooo.test systemd[1]: Started libpod-conmon-a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d.scope.
Oct 14 10:05:02 np0005486759.ooo.test podman[319701]: 2025-10-14 10:05:02.628988571 +0000 UTC m=+0.056469354 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:05:02 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/addn_hosts - 0 addresses
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/host
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/opts
Oct 14 10:05:02 np0005486759.ooo.test podman[319717]: 2025-10-14 10:05:02.73025081 +0000 UTC m=+0.094555115 container kill 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:05:02 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7805bd5146c087044b93107328e2925b841b276a34c0d0134cad1b3de25649c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:05:02 np0005486759.ooo.test podman[319701]: 2025-10-14 10:05:02.743864445 +0000 UTC m=+0.171345228 container init a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:02 np0005486759.ooo.test podman[319701]: 2025-10-14 10:05:02.753052786 +0000 UTC m=+0.180533579 container start a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319739]: started, version 2.85 cachesize 150
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319739]: DNS service limited to local subnets
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319739]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319739]: warning: no upstream servers configured
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319739]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/addn_hosts - 0 addresses
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/host
Oct 14 10:05:02 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/opts
Oct 14 10:05:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:02.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:02 np0005486759.ooo.test kernel: device tap618b87a3-dc left promiscuous mode
Oct 14 10:05:02 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:02.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:02 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:02.990 287366 INFO neutron.agent.dhcp.agent [None req-51b16b8e-8569-43ca-b32f-9724a77a3a4e - - - - - -] DHCP configuration for ports {'2df442ce-01f7-466e-a789-b3f6009654b0', 'cf0885b7-3c91-4460-ae2c-4ff9384bd90c'} is completed
Oct 14 10:05:03 np0005486759.ooo.test dnsmasq[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/addn_hosts - 0 addresses
Oct 14 10:05:03 np0005486759.ooo.test dnsmasq-dhcp[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/host
Oct 14 10:05:03 np0005486759.ooo.test podman[319764]: 2025-10-14 10:05:03.065625971 +0000 UTC m=+0.061835438 container kill 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:03 np0005486759.ooo.test dnsmasq-dhcp[319655]: read /var/lib/neutron/dhcp/7f12e3b8-8114-4393-8ed9-067020f7af99/opts
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent [None req-dd81ff68-595f-4c58-a2df-ab2965b6fa03 - - - - - -] Unable to reload_allocations dhcp for 7f12e3b8-8114-4393-8ed9-067020f7af99.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap618b87a3-dc not found in namespace qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99.
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     return fut.result()
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     raise self._exception
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap618b87a3-dc not found in namespace qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99.
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.087 287366 ERROR neutron.agent.dhcp.agent 
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.091 287366 INFO neutron.agent.dhcp.agent [None req-db1f88a4-5ebb-4c4d-876a-ae6b5d3585a7 - - - - - -] Synchronizing state
Oct 14 10:05:03 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:03.158 2 INFO neutron.agent.securitygroups_rpc [None req-00c0b970-e8e9-4534-8371-98c28157dfa4 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.250 287366 INFO neutron.agent.dhcp.agent [None req-34f92cee-79ec-4469-9eff-93703db8357d - - - - - -] All active networks have been fetched through RPC.
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.251 287366 INFO neutron.agent.dhcp.agent [-] Starting network 7f12e3b8-8114-4393-8ed9-067020f7af99 dhcp configuration
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.251 287366 INFO neutron.agent.dhcp.agent [-] Finished network 7f12e3b8-8114-4393-8ed9-067020f7af99 dhcp configuration
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.252 287366 INFO neutron.agent.dhcp.agent [None req-34f92cee-79ec-4469-9eff-93703db8357d - - - - - -] Synchronizing state complete
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.253 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:02Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec787ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7879a0>], id=ac32b372-f7d5-4ade-bb8b-6002e5b3affb, ip_allocation=immediate, mac_address=fa:16:3e:cd:a5:24, name=tempest-PortsTestJSON-1582150543, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:04:59Z, description=, dns_domain=, id=7ea77a3d-989f-4274-94fe-ef7044ee08e6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1613288758, port_security_enabled=True, project_id=141e57a7b7e34017814ba8c1818c2cfc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25948, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1460, status=ACTIVE, subnets=['2c5951c2-9e76-4719-a08e-cdd2102869c1'], tags=[], tenant_id=141e57a7b7e34017814ba8c1818c2cfc, updated_at=2025-10-14T10:05:00Z, vlan_transparent=None, network_id=7ea77a3d-989f-4274-94fe-ef7044ee08e6, port_security_enabled=True, project_id=141e57a7b7e34017814ba8c1818c2cfc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0'], standard_attr_id=1473, status=DOWN, tags=[], tenant_id=141e57a7b7e34017814ba8c1818c2cfc, updated_at=2025-10-14T10:05:03Z on network 7ea77a3d-989f-4274-94fe-ef7044ee08e6
Oct 14 10:05:03 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:03Z|00190|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:05:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:03.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:03 np0005486759.ooo.test dnsmasq[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/addn_hosts - 1 addresses
Oct 14 10:05:03 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/host
Oct 14 10:05:03 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/opts
Oct 14 10:05:03 np0005486759.ooo.test podman[319808]: 2025-10-14 10:05:03.495172325 +0000 UTC m=+0.066781328 container kill a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:05:03 np0005486759.ooo.test dnsmasq[319655]: exiting on receipt of SIGTERM
Oct 14 10:05:03 np0005486759.ooo.test podman[319822]: 2025-10-14 10:05:03.539068834 +0000 UTC m=+0.054030099 container kill 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:05:03 np0005486759.ooo.test systemd[1]: tmp-crun.CdeWKq.mount: Deactivated successfully.
Oct 14 10:05:03 np0005486759.ooo.test systemd[1]: libpod-9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883.scope: Deactivated successfully.
Oct 14 10:05:03 np0005486759.ooo.test podman[319841]: 2025-10-14 10:05:03.591265836 +0000 UTC m=+0.043866169 container died 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:05:03 np0005486759.ooo.test podman[319841]: 2025-10-14 10:05:03.621145228 +0000 UTC m=+0.073745481 container cleanup 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:05:03 np0005486759.ooo.test systemd[1]: libpod-conmon-9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883.scope: Deactivated successfully.
Oct 14 10:05:03 np0005486759.ooo.test podman[319848]: 2025-10-14 10:05:03.674838476 +0000 UTC m=+0.115433282 container remove 9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f12e3b8-8114-4393-8ed9-067020f7af99, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:03 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:03.771 287366 INFO neutron.agent.dhcp.agent [None req-de9646e4-c58e-4c68-9549-c67a13b98f3d - - - - - -] DHCP configuration for ports {'ac32b372-f7d5-4ade-bb8b-6002e5b3affb'} is completed
Oct 14 10:05:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:03.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-b7fe601c47b62bc937168b3db6cf6ef7785f685ba15f5e713ca90033cf8fe24d-merged.mount: Deactivated successfully.
Oct 14 10:05:04 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9065ea2570bb77d7094cfe7e76a856085d0cab999849e5e9a02e11b7b49d2883-userdata-shm.mount: Deactivated successfully.
Oct 14 10:05:04 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d7f12e3b8\x2d8114\x2d4393\x2d8ed9\x2d067020f7af99.mount: Deactivated successfully.
Oct 14 10:05:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:04.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:05 np0005486759.ooo.test podman[319889]: 2025-10-14 10:05:05.358285552 +0000 UTC m=+0.046571072 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:05 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:05:05 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:05 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:05:05 np0005486759.ooo.test systemd[1]: tmp-crun.EtmCTn.mount: Deactivated successfully.
Oct 14 10:05:05 np0005486759.ooo.test podman[319902]: 2025-10-14 10:05:05.457794277 +0000 UTC m=+0.084279292 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:05:05 np0005486759.ooo.test podman[319902]: 2025-10-14 10:05:05.493448795 +0000 UTC m=+0.119933770 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:05:05 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:05:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:06.648 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:02Z, description=, device_id=4120f13a-182e-46f0-b73f-1f9a32cc4733, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec624490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d5550>], id=ac32b372-f7d5-4ade-bb8b-6002e5b3affb, ip_allocation=immediate, mac_address=fa:16:3e:cd:a5:24, name=tempest-PortsTestJSON-1582150543, network_id=7ea77a3d-989f-4274-94fe-ef7044ee08e6, port_security_enabled=True, project_id=141e57a7b7e34017814ba8c1818c2cfc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0'], standard_attr_id=1473, status=DOWN, tags=[], tenant_id=141e57a7b7e34017814ba8c1818c2cfc, updated_at=2025-10-14T10:05:03Z on network 7ea77a3d-989f-4274-94fe-ef7044ee08e6
Oct 14 10:05:06 np0005486759.ooo.test dnsmasq[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/addn_hosts - 1 addresses
Oct 14 10:05:06 np0005486759.ooo.test podman[319950]: 2025-10-14 10:05:06.857162457 +0000 UTC m=+0.059579959 container kill a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3)
Oct 14 10:05:06 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/host
Oct 14 10:05:06 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/opts
Oct 14 10:05:07 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:07.130 287366 INFO neutron.agent.dhcp.agent [None req-15b772ce-cef4-464a-a5e1-703419a892ab - - - - - -] DHCP configuration for ports {'ac32b372-f7d5-4ade-bb8b-6002e5b3affb'} is completed
Oct 14 10:05:08 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:08.136 2 INFO neutron.agent.securitygroups_rpc [None req-ebf95002-b4f4-46bf-8950-33de87f5c1ee 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:05:08 np0005486759.ooo.test dnsmasq[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/addn_hosts - 0 addresses
Oct 14 10:05:08 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/host
Oct 14 10:05:08 np0005486759.ooo.test dnsmasq-dhcp[319739]: read /var/lib/neutron/dhcp/7ea77a3d-989f-4274-94fe-ef7044ee08e6/opts
Oct 14 10:05:08 np0005486759.ooo.test podman[319990]: 2025-10-14 10:05:08.336715543 +0000 UTC m=+0.039654331 container kill a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:05:08 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:08Z|00191|binding|INFO|Releasing lport 3cb74b95-14ff-4b93-88f8-d9624644f87d from this chassis (sb_readonly=0)
Oct 14 10:05:08 np0005486759.ooo.test kernel: device tap3cb74b95-14 left promiscuous mode
Oct 14 10:05:08 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:08Z|00192|binding|INFO|Setting lport 3cb74b95-14ff-4b93-88f8-d9624644f87d down in Southbound
Oct 14 10:05:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:08.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:08.509 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-7ea77a3d-989f-4274-94fe-ef7044ee08e6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ea77a3d-989f-4274-94fe-ef7044ee08e6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=faaef901-0fe0-41db-bcbc-ae02f3c74529, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=3cb74b95-14ff-4b93-88f8-d9624644f87d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:08.511 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 3cb74b95-14ff-4b93-88f8-d9624644f87d in datapath 7ea77a3d-989f-4274-94fe-ef7044ee08e6 unbound from our chassis
Oct 14 10:05:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:08.513 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7ea77a3d-989f-4274-94fe-ef7044ee08e6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:08.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:08.516 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[c0907ab8-0bcf-475d-ac1f-73aa0d048b6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:08.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:09 np0005486759.ooo.test dnsmasq[319739]: exiting on receipt of SIGTERM
Oct 14 10:05:09 np0005486759.ooo.test podman[320028]: 2025-10-14 10:05:09.655201445 +0000 UTC m=+0.063961742 container kill a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:09 np0005486759.ooo.test systemd[1]: libpod-a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d.scope: Deactivated successfully.
Oct 14 10:05:09 np0005486759.ooo.test podman[320040]: 2025-10-14 10:05:09.726045006 +0000 UTC m=+0.055753282 container died a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:05:09 np0005486759.ooo.test podman[320040]: 2025-10-14 10:05:09.751361619 +0000 UTC m=+0.081069885 container cleanup a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:05:09 np0005486759.ooo.test systemd[1]: libpod-conmon-a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d.scope: Deactivated successfully.
Oct 14 10:05:09 np0005486759.ooo.test podman[320042]: 2025-10-14 10:05:09.797538577 +0000 UTC m=+0.119275979 container remove a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7ea77a3d-989f-4274-94fe-ef7044ee08e6, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:05:09 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:09.835 287366 INFO neutron.agent.dhcp.agent [None req-787ca965-ba33-4803-918d-84398d15fd55 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:05:09 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:09.836 287366 INFO neutron.agent.dhcp.agent [None req-787ca965-ba33-4803-918d-84398d15fd55 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:05:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:09.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:10 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:10Z|00193|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:05:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:10.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-d7805bd5146c087044b93107328e2925b841b276a34c0d0134cad1b3de25649c-merged.mount: Deactivated successfully.
Oct 14 10:05:10 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0b0e42cd4e4a980a8493570dab447920774de508f45fe6dcc441ae026363d3d-userdata-shm.mount: Deactivated successfully.
Oct 14 10:05:10 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d7ea77a3d\x2d989f\x2d4274\x2d94fe\x2def7044ee08e6.mount: Deactivated successfully.
Oct 14 10:05:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:05:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:05:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:05:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:05:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:05:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16693 "" "Go-http-client/1.1"
Oct 14 10:05:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:13.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:13.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 14 10:05:13 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:13.671 2 INFO neutron.agent.securitygroups_rpc [None req-b3e376ee-d99d-44db-996a-a116534320b5 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['29c160c7-b39b-46f8-9f10-78478571a053']
Oct 14 10:05:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:13.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:05:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:05:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:05:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:05:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:05:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:05:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:05:14 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:14.931 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:cb:e3 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9a11b08-56ff-40fe-ac24-3c9837428cfb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=13c47791-ba8c-468a-b4a7-890a7ca81541) old=Port_Binding(mac=['fa:16:3e:e8:cb:e3 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:14 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:14.933 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 13c47791-ba8c-468a-b4a7-890a7ca81541 in datapath bc1d2da3-bea1-48d9-8064-d4ed4892597b updated
Oct 14 10:05:14 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:14.936 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc1d2da3-bea1-48d9-8064-d4ed4892597b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:14 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:14.937 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[63956356-22d2-4aa4-8cee-966ac880f2b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:14.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:15 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:15.231 2 INFO neutron.agent.securitygroups_rpc [None req-a5d8fd32-5fcd-4554-8b15-56573dd3621d c147b1d5045e45ee881d16392fd312d9 d9899fedfbba4525a6e5cd57c52e4aa4 - - default default] Security group member updated ['7faf12df-5d2c-4073-a518-5e5689319cb3']
Oct 14 10:05:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:15.271 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:15Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec64c0a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec64c070>], id=6c182e54-1a33-431c-8316-a6346143300e, ip_allocation=immediate, mac_address=fa:16:3e:a6:9f:4e, name=tempest-RoutersAdminNegativeTest-28109962, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=True, project_id=d9899fedfbba4525a6e5cd57c52e4aa4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7faf12df-5d2c-4073-a518-5e5689319cb3'], standard_attr_id=1529, status=DOWN, tags=[], tenant_id=d9899fedfbba4525a6e5cd57c52e4aa4, updated_at=2025-10-14T10:05:15Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:05:15 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:15.438 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:15 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:15.439 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:05:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:15.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:15 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:15.441 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:05:15 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:15.484 2 INFO neutron.agent.securitygroups_rpc [None req-b54f8d61-4ff9-4f12-b7c4-4437880a4ca3 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['29c160c7-b39b-46f8-9f10-78478571a053', '9b6af020-41c4-4426-9a8f-3954958e92be']
Oct 14 10:05:15 np0005486759.ooo.test podman[320084]: 2025-10-14 10:05:15.490478428 +0000 UTC m=+0.065239112 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:15 np0005486759.ooo.test systemd[1]: tmp-crun.80LDfx.mount: Deactivated successfully.
Oct 14 10:05:15 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:05:15 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:15 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:15.765 287366 INFO neutron.agent.dhcp.agent [None req-48da62e2-7432-4d07-bd72-d7810bf26ca3 - - - - - -] DHCP configuration for ports {'6c182e54-1a33-431c-8316-a6346143300e'} is completed
Oct 14 10:05:15 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:15.931 2 INFO neutron.agent.securitygroups_rpc [None req-1512a630-986e-443d-9e0c-68acbf216fd1 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['9b6af020-41c4-4426-9a8f-3954958e92be']
Oct 14 10:05:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:16.208 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:05:16 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:16.380 2 INFO neutron.agent.securitygroups_rpc [None req-1141bb25-c8cd-450d-90d4-91e6b5988a8a c147b1d5045e45ee881d16392fd312d9 d9899fedfbba4525a6e5cd57c52e4aa4 - - default default] Security group member updated ['7faf12df-5d2c-4073-a518-5e5689319cb3']
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: tmp-crun.vVfiaQ.mount: Deactivated successfully.
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: tmp-crun.pAAKCP.mount: Deactivated successfully.
Oct 14 10:05:16 np0005486759.ooo.test podman[320106]: 2025-10-14 10:05:16.450350369 +0000 UTC m=+0.076220776 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 10:05:16 np0005486759.ooo.test podman[320105]: 2025-10-14 10:05:16.512624678 +0000 UTC m=+0.133683498 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:05:16 np0005486759.ooo.test podman[320114]: 2025-10-14 10:05:16.485494711 +0000 UTC m=+0.095671109 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:05:16 np0005486759.ooo.test podman[320106]: 2025-10-14 10:05:16.541668395 +0000 UTC m=+0.167538792 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 14 10:05:16 np0005486759.ooo.test podman[320105]: 2025-10-14 10:05:16.551380731 +0000 UTC m=+0.172439531 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:05:16 np0005486759.ooo.test podman[320114]: 2025-10-14 10:05:16.5693641 +0000 UTC m=+0.179540518 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:05:16 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:05:16 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:16 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:16 np0005486759.ooo.test podman[320190]: 2025-10-14 10:05:16.599758497 +0000 UTC m=+0.040189017 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:16 np0005486759.ooo.test podman[320107]: 2025-10-14 10:05:16.675870429 +0000 UTC m=+0.291871076 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:05:16 np0005486759.ooo.test podman[320107]: 2025-10-14 10:05:16.712795915 +0000 UTC m=+0.328796632 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 14 10:05:16 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.219 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.219 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.220 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.220 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.276 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.351 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.352 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.426 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.427 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.477 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.479 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.551 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.805 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.807 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12272MB free_disk=386.67736053466797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.808 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:05:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:17.808 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:05:18 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:18.179 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:cb:e3 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9a11b08-56ff-40fe-ac24-3c9837428cfb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=13c47791-ba8c-468a-b4a7-890a7ca81541) old=Port_Binding(mac=['fa:16:3e:e8:cb:e3 10.100.0.18 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:18 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:18.181 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 13c47791-ba8c-468a-b4a7-890a7ca81541 in datapath bc1d2da3-bea1-48d9-8064-d4ed4892597b updated
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.182 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.183 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.183 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:05:18 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:18.184 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc1d2da3-bea1-48d9-8064-d4ed4892597b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:18 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:18.185 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[ea00770a-e0df-4bcb-b51e-8f1485d360fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:18 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:18.375 2 INFO neutron.agent.securitygroups_rpc [None req-5aa7f8d2-f613-4c23-ab04-90a309ecd3dd 9644ac996cd34aeebfbe6d09534f0e18 ffaa688d672848928232f179dd31360d - - default default] Security group member updated ['250dd44f-51cb-4ea2-9291-f5030ca6d0b8']
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.391 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing inventories for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.644 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Updating ProviderTree inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.645 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.664 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing aggregate associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.683 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing trait associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.744 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.760 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.763 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.763 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:05:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:18.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:18 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:18.812 2 INFO neutron.agent.securitygroups_rpc [None req-12797da7-da16-4b32-9cbd-b7c8f7dffc58 9644ac996cd34aeebfbe6d09534f0e18 ffaa688d672848928232f179dd31360d - - default default] Security group member updated ['250dd44f-51cb-4ea2-9291-f5030ca6d0b8']
Oct 14 10:05:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20169 DF PROTO=TCP SPT=44560 DPT=9102 SEQ=4081864187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA533070000000001030307) 
Oct 14 10:05:19 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:19.538 2 INFO neutron.agent.securitygroups_rpc [None req-25a7e258-1477-4b9f-bed2-875fd0c3375e 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['9b040326-fb44-4bdf-afdc-2495d9fef570']
Oct 14 10:05:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:19.763 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:19.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:20.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:20.189 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:05:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20170 DF PROTO=TCP SPT=44560 DPT=9102 SEQ=4081864187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA537010000000001030307) 
Oct 14 10:05:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:21.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:22.132 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:cb:e3 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9a11b08-56ff-40fe-ac24-3c9837428cfb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=13c47791-ba8c-468a-b4a7-890a7ca81541) old=Port_Binding(mac=['fa:16:3e:e8:cb:e3 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc1d2da3-bea1-48d9-8064-d4ed4892597b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '141e57a7b7e34017814ba8c1818c2cfc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:22.134 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 13c47791-ba8c-468a-b4a7-890a7ca81541 in datapath bc1d2da3-bea1-48d9-8064-d4ed4892597b updated
Oct 14 10:05:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:22.137 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc1d2da3-bea1-48d9-8064-d4ed4892597b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:22 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:22.138 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[3dcf885b-9a22-47dc-ac0c-725202b533a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:22.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:22.192 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:05:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:22.192 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:05:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:22.321 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:05:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:22.321 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:05:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:22.321 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:05:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:22.322 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:05:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20171 DF PROTO=TCP SPT=44560 DPT=9102 SEQ=4081864187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA53F010000000001030307) 
Oct 14 10:05:23 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:23.186 2 INFO neutron.agent.securitygroups_rpc [None req-729dfcd4-af44-4895-ba3f-6f657d737d4b 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['9b040326-fb44-4bdf-afdc-2495d9fef570', '4a5205e2-c445-4c6c-b5d0-9149b1f5fabf', '95ba973b-6a6e-43fd-9065-cbba78993a98']
Oct 14 10:05:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:23.647 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:05:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:23.670 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:05:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:23.670 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:05:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:23.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:23 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:23.972 2 INFO neutron.agent.securitygroups_rpc [None req-c94ac6a6-40a6-4929-aab9-fc32a95d472f fc12ba0e15344d4780bd333987f1c029 bd9c5544795e4b4d8101d70737aa8519 - - default default] Security group member updated ['cbafc132-c0e9-4ebb-9fc1-838d9918f643']
Oct 14 10:05:24 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:24.066 2 INFO neutron.agent.securitygroups_rpc [None req-8935a238-a76c-4f81-9a04-93bef284cdcb 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['4a5205e2-c445-4c6c-b5d0-9149b1f5fabf', '95ba973b-6a6e-43fd-9065-cbba78993a98']
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.210 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.210 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.230 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] There are 1 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.230 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 9b229ec3-a035-4949-8484-69924247065a] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11158
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.231 2 INFO nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 9b229ec3-a035-4949-8484-69924247065a] Deletion of /var/lib/nova/instances/9b229ec3-a035-4949-8484-69924247065a_del complete
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.309 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:05:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:05:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:05:24 np0005486759.ooo.test podman[320235]: 2025-10-14 10:05:24.44614644 +0000 UTC m=+0.073459382 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter)
Oct 14 10:05:24 np0005486759.ooo.test podman[320235]: 2025-10-14 10:05:24.484438338 +0000 UTC m=+0.111751271 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 14 10:05:24 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:05:24 np0005486759.ooo.test podman[320234]: 2025-10-14 10:05:24.50186554 +0000 UTC m=+0.132278576 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:05:24 np0005486759.ooo.test podman[320234]: 2025-10-14 10:05:24.565537473 +0000 UTC m=+0.195950509 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 14 10:05:24 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:05:24 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:24.626 2 INFO neutron.agent.securitygroups_rpc [None req-8d5d6891-696b-4719-9d8d-0565818319cf 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:24.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:25 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:24.999 2 INFO neutron.agent.securitygroups_rpc [None req-5a1e4b74-0757-4f6f-bffd-c0e05f184743 fc12ba0e15344d4780bd333987f1c029 bd9c5544795e4b4d8101d70737aa8519 - - default default] Security group member updated ['cbafc132-c0e9-4ebb-9fc1-838d9918f643']
Oct 14 10:05:25 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:25.581 2 INFO neutron.agent.securitygroups_rpc [None req-77f5e837-8bf1-49cd-82e4-e3c10da0a508 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:25 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:25.861 2 INFO neutron.agent.securitygroups_rpc [None req-f0255fbf-92b5-4916-8612-7f4be1edd16f fc12ba0e15344d4780bd333987f1c029 bd9c5544795e4b4d8101d70737aa8519 - - default default] Security group member updated ['cbafc132-c0e9-4ebb-9fc1-838d9918f643']
Oct 14 10:05:26 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:26.558 2 INFO neutron.agent.securitygroups_rpc [None req-0b960880-455c-4da3-acfd-8569d110b0a3 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:26 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:26.586 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:26Z, description=, device_id=ee92532d-b477-48de-a8a7-db5948001a7b, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c42e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c4550>], id=9dbfe94c-2389-4147-bacf-8426cefa599f, ip_allocation=immediate, mac_address=fa:16:3e:f0:dc:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1625, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:05:26Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:05:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20172 DF PROTO=TCP SPT=44560 DPT=9102 SEQ=4081864187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA54EC10000000001030307) 
Oct 14 10:05:26 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:05:26 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:26 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:26 np0005486759.ooo.test podman[320296]: 2025-10-14 10:05:26.770042664 +0000 UTC m=+0.046410707 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:26 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:26.920 2 INFO neutron.agent.securitygroups_rpc [None req-0df3d330-69c6-4de0-9545-1c8bef935c92 897e5c5aeb714e8789dd0a4d4bf95e74 141e57a7b7e34017814ba8c1818c2cfc - - default default] Security group member updated ['63ce3340-2f7e-4974-a36f-6a7e8b3a4fe0']
Oct 14 10:05:27 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:27.023 287366 INFO neutron.agent.dhcp.agent [None req-80fa673a-7414-4cfe-9b38-ecf8fc147115 - - - - - -] DHCP configuration for ports {'9dbfe94c-2389-4147-bacf-8426cefa599f'} is completed
Oct 14 10:05:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:27.165 2 INFO neutron.agent.securitygroups_rpc [None req-6a0d9963-db00-44f1-b96f-b0809338e156 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:27.823 2 INFO neutron.agent.securitygroups_rpc [None req-396095f0-dbbb-4e36-b8cd-8b4a0ae4f0a0 fc12ba0e15344d4780bd333987f1c029 bd9c5544795e4b4d8101d70737aa8519 - - default default] Security group member updated ['cbafc132-c0e9-4ebb-9fc1-838d9918f643']
Oct 14 10:05:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:27.878 2 INFO neutron.agent.securitygroups_rpc [None req-6e67595d-cac8-4c4c-a6ee-9e36831e860d 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:28 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:28.630 2 INFO neutron.agent.securitygroups_rpc [None req-8970cd3b-6457-4470-a5a6-8e5bef58fa3a 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:28.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:29 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:29.962 2 INFO neutron.agent.securitygroups_rpc [None req-5e82705b-073b-4418-816a-45580036124f 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:30.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:30 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:30.373 2 INFO neutron.agent.securitygroups_rpc [None req-ff0ac900-d867-4234-8ea6-cc1f71ae6b19 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:30 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:05:30 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:30 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:30 np0005486759.ooo.test podman[320332]: 2025-10-14 10:05:30.426091636 +0000 UTC m=+0.058842156 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:05:30 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:30.856 2 INFO neutron.agent.securitygroups_rpc [None req-74eb139d-feed-46bc-b50c-bfbedcb54111 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:31.762 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:31Z, description=, device_id=5e2ce29d-6e39-4974-9d0f-ab8e332e5715, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec624c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6241c0>], id=ca6185ac-b0fe-4fee-ad71-6e1d9cad2fa1, ip_allocation=immediate, mac_address=fa:16:3e:59:e5:4f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1636, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:05:31Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:05:31 np0005486759.ooo.test systemd[1]: tmp-crun.fLtftP.mount: Deactivated successfully.
Oct 14 10:05:31 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:05:31 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:31 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:31 np0005486759.ooo.test podman[320367]: 2025-10-14 10:05:31.965636302 +0000 UTC m=+0.059957601 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 10:05:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:32.233 287366 INFO neutron.agent.dhcp.agent [None req-aabb0e45-a14e-40c0-a8fe-68d0b2510971 - - - - - -] DHCP configuration for ports {'ca6185ac-b0fe-4fee-ad71-6e1d9cad2fa1'} is completed
Oct 14 10:05:32 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:32.315 2 INFO neutron.agent.securitygroups_rpc [None req-67cde94a-4b38-439f-ba6b-4fadfa654e24 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:33 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:33.147 2 INFO neutron.agent.securitygroups_rpc [None req-fc49f39b-1c31-4b21-96ab-5bcabe84ffd5 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:33 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:05:33 np0005486759.ooo.test podman[320388]: 2025-10-14 10:05:33.457567165 +0000 UTC m=+0.090820291 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:05:33 np0005486759.ooo.test podman[320388]: 2025-10-14 10:05:33.468597241 +0000 UTC m=+0.101850357 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 14 10:05:33 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:05:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:33.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:33 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:33.787 2 INFO neutron.agent.securitygroups_rpc [None req-a1c27f67-9eb2-49c8-9827-9c776bc4c76a 21ced48626a141bda70dc7a6f239742b 13cc8ffeb195400a80342171f18b6334 - - default default] Security group member updated ['08a22905-c87f-460c-8f6d-5eb09ab0671d']
Oct 14 10:05:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:33.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:34.784 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:34Z, description=, device_id=f0baab86-ecb0-4f2c-a81c-94a8ff98e96c, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec787c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6707c0>], id=e1b4c3ab-d10f-4f2b-9502-8582c57ee167, ip_allocation=immediate, mac_address=fa:16:3e:92:ac:c1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1655, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:05:34Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:05:34 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:34.911 2 INFO neutron.agent.securitygroups_rpc [None req-5e9cbebe-901d-4cd4-b4d4-2f73eafad59d 4aceb5785eb547179587ca23aff33523 14ba2e741d664259968e9fb6dc6a2daf - - default default] Security group member updated ['196dc4e8-af02-4ef0-9b66-22038e2068f3']
Oct 14 10:05:35 np0005486759.ooo.test systemd[1]: tmp-crun.ybuRPn.mount: Deactivated successfully.
Oct 14 10:05:35 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:05:35 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:35 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:35 np0005486759.ooo.test podman[320424]: 2025-10-14 10:05:35.006775475 +0000 UTC m=+0.048031676 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:35.241 287366 INFO neutron.agent.dhcp.agent [None req-5dfe92d4-12fb-404e-8258-cdcc32ea3aa3 - - - - - -] DHCP configuration for ports {'e1b4c3ab-d10f-4f2b-9502-8582c57ee167'} is completed
Oct 14 10:05:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:35.469 287366 INFO neutron.agent.linux.ip_lib [None req-727fcbe9-3d79-43a8-940b-13b239d41f11 - - - - - -] Device tap861a2076-86 cannot be used as it has no MAC address
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test kernel: device tap861a2076-86 entered promiscuous mode
Oct 14 10:05:35 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436335.4933] manager: (tap861a2076-86): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Oct 14 10:05:35 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:35Z|00194|binding|INFO|Claiming lport 861a2076-8663-4c08-b722-5fccfccef2b4 for this chassis.
Oct 14 10:05:35 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:35Z|00195|binding|INFO|861a2076-8663-4c08-b722-5fccfccef2b4: Claiming unknown
Oct 14 10:05:35 np0005486759.ooo.test systemd-udevd[320456]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:35Z|00196|binding|INFO|Setting lport 861a2076-8663-4c08-b722-5fccfccef2b4 ovn-installed in OVS
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:35Z|00197|binding|INFO|Setting lport 861a2076-8663-4c08-b722-5fccfccef2b4 up in Southbound
Oct 14 10:05:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:35.517 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-088bccb5-24b9-4ee0-b0b7-043a59f5de32', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-088bccb5-24b9-4ee0-b0b7-043a59f5de32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79966a36fd094eb0adde863a9489901f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4d10b8ef-6175-448f-9156-671086675b82, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=861a2076-8663-4c08-b722-5fccfccef2b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:35.518 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 861a2076-8663-4c08-b722-5fccfccef2b4 in datapath 088bccb5-24b9-4ee0-b0b7-043a59f5de32 bound to our chassis
Oct 14 10:05:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:35.519 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 088bccb5-24b9-4ee0-b0b7-043a59f5de32 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:05:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:35.521 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[12076d15-6f93-4dc9-b6f3-71c823dab479]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:35.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:35 np0005486759.ooo.test podman[320462]: 2025-10-14 10:05:35.610165103 +0000 UTC m=+0.064780738 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:05:35 np0005486759.ooo.test podman[320462]: 2025-10-14 10:05:35.616896708 +0000 UTC m=+0.071512343 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:05:35 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:05:36 np0005486759.ooo.test podman[320532]: 
Oct 14 10:05:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:36.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:36 np0005486759.ooo.test podman[320532]: 2025-10-14 10:05:36.327843877 +0000 UTC m=+0.088378668 container create b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:05:36 np0005486759.ooo.test systemd[1]: Started libpod-conmon-b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990.scope.
Oct 14 10:05:36 np0005486759.ooo.test podman[320532]: 2025-10-14 10:05:36.281572065 +0000 UTC m=+0.042106886 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:05:36 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:05:36 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2d49ee205966d34f85f889204e0ca2e1924b88f7b584bcf89f2e9a30765a329/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:05:36 np0005486759.ooo.test podman[320532]: 2025-10-14 10:05:36.406179716 +0000 UTC m=+0.166714477 container init b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:36 np0005486759.ooo.test podman[320532]: 2025-10-14 10:05:36.416271424 +0000 UTC m=+0.176806175 container start b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0)
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq[320550]: started, version 2.85 cachesize 150
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq[320550]: DNS service limited to local subnets
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq[320550]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq[320550]: warning: no upstream servers configured
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq-dhcp[320550]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/addn_hosts - 0 addresses
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/host
Oct 14 10:05:36 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/opts
Oct 14 10:05:36 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:36.608 287366 INFO neutron.agent.dhcp.agent [None req-fe426db2-8ab5-4be7-81c6-37133a18ac12 - - - - - -] DHCP configuration for ports {'8ecab855-8a3f-4091-831a-a82d17ec79c8'} is completed
Oct 14 10:05:37 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:05:37 np0005486759.ooo.test podman[320568]: 2025-10-14 10:05:37.16549663 +0000 UTC m=+0.052044409 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 14 10:05:37 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:37 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:37 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:37Z|00198|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:37.544 287366 INFO neutron.agent.linux.ip_lib [None req-d264cf71-2df2-4393-8c21-f986f93728ad - - - - - -] Device tapb8f01e0c-01 cannot be used as it has no MAC address
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test kernel: device tapb8f01e0c-01 entered promiscuous mode
Oct 14 10:05:37 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436337.6209] manager: (tapb8f01e0c-01): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Oct 14 10:05:37 np0005486759.ooo.test systemd-udevd[320458]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:05:37 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:37Z|00199|binding|INFO|Claiming lport b8f01e0c-0193-4664-9f12-ca464132580c for this chassis.
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:37Z|00200|binding|INFO|b8f01e0c-0193-4664-9f12-ca464132580c: Claiming unknown
Oct 14 10:05:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:37.634 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-499c3c38-1e44-4c6c-ad9b-74f94e7028d0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-499c3c38-1e44-4c6c-ad9b-74f94e7028d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79966a36fd094eb0adde863a9489901f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02793762-e150-46e3-97b8-d65402e3f35b, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=b8f01e0c-0193-4664-9f12-ca464132580c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:37.637 183328 INFO neutron.agent.ovn.metadata.agent [-] Port b8f01e0c-0193-4664-9f12-ca464132580c in datapath 499c3c38-1e44-4c6c-ad9b-74f94e7028d0 bound to our chassis
Oct 14 10:05:37 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:37Z|00201|binding|INFO|Setting lport b8f01e0c-0193-4664-9f12-ca464132580c ovn-installed in OVS
Oct 14 10:05:37 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:37Z|00202|binding|INFO|Setting lport b8f01e0c-0193-4664-9f12-ca464132580c up in Southbound
Oct 14 10:05:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:37.640 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 499c3c38-1e44-4c6c-ad9b-74f94e7028d0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:37.642 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[05d0d72b-59b0-4d9b-a99a-e9f21c0a4670]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:37.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:37 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:37Z|00203|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:05:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:38.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:38 np0005486759.ooo.test podman[320637]: 2025-10-14 10:05:38.056670086 +0000 UTC m=+0.069846361 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:38 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:38.102 2 INFO neutron.agent.securitygroups_rpc [None req-fc7509ef-1d18-4446-a4a9-218e6f7408d4 4aceb5785eb547179587ca23aff33523 14ba2e741d664259968e9fb6dc6a2daf - - default default] Security group member updated ['196dc4e8-af02-4ef0-9b66-22038e2068f3']
Oct 14 10:05:38 np0005486759.ooo.test podman[320688]: 
Oct 14 10:05:38 np0005486759.ooo.test podman[320688]: 2025-10-14 10:05:38.567276633 +0000 UTC m=+0.082088735 container create daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:05:38 np0005486759.ooo.test systemd[1]: Started libpod-conmon-daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6.scope.
Oct 14 10:05:38 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:05:38 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6650d53946a035da6a32a6f142e82c6cf1e8f40af95638d2349f1f5ce44162ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:05:38 np0005486759.ooo.test podman[320688]: 2025-10-14 10:05:38.626282433 +0000 UTC m=+0.141094545 container init daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:38 np0005486759.ooo.test podman[320688]: 2025-10-14 10:05:38.529444969 +0000 UTC m=+0.044257181 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:05:38 np0005486759.ooo.test podman[320688]: 2025-10-14 10:05:38.640184247 +0000 UTC m=+0.154996359 container start daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq[320706]: started, version 2.85 cachesize 150
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq[320706]: DNS service limited to local subnets
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq[320706]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq[320706]: warning: no upstream servers configured
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq-dhcp[320706]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/addn_hosts - 0 addresses
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/host
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/opts
Oct 14 10:05:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:38.710 287366 INFO neutron.agent.dhcp.agent [None req-d264cf71-2df2-4393-8c21-f986f93728ad - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:37Z, description=, device_id=3b267fe5-d0e5-42a8-932d-6d0a436d6b05, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7a2b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7de0a0>], id=666bd5eb-44ad-4940-8995-17522eb10121, ip_allocation=immediate, mac_address=fa:16:3e:95:f2:5d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:35Z, description=, dns_domain=, id=499c3c38-1e44-4c6c-ad9b-74f94e7028d0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--203616564, port_security_enabled=True, project_id=79966a36fd094eb0adde863a9489901f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10702, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1664, status=ACTIVE, subnets=['0178838d-db5d-427a-9b28-36b8d553257a'], tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:36Z, vlan_transparent=None, network_id=499c3c38-1e44-4c6c-ad9b-74f94e7028d0, port_security_enabled=False, project_id=79966a36fd094eb0adde863a9489901f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1685, status=DOWN, tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:37Z on network 499c3c38-1e44-4c6c-ad9b-74f94e7028d0
Oct 14 10:05:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:38.787 287366 INFO neutron.agent.dhcp.agent [None req-20577bea-ef58-4cae-8add-f0c2104245b4 - - - - - -] DHCP configuration for ports {'2f3eb6b2-26c0-44b4-90f6-7d6f8637cf79'} is completed
Oct 14 10:05:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:38.832 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:38Z, description=, device_id=d4e56b9d-9603-40bb-af67-6041cd4689cf, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec60c640>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5fe400>], id=ae3d6870-d13a-4f2d-a1d2-7a77c22ada28, ip_allocation=immediate, mac_address=fa:16:3e:ba:8b:aa, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1691, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:05:38Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:05:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:38.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:38 np0005486759.ooo.test podman[320726]: 2025-10-14 10:05:38.862191959 +0000 UTC m=+0.046060845 container kill daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/addn_hosts - 1 addresses
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/host
Oct 14 10:05:38 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/opts
Oct 14 10:05:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:38.921 287366 INFO neutron.agent.linux.ip_lib [None req-ff0f18d6-072e-4463-ad22-0cefe689b2bf - - - - - -] Device tap6bfcee1c-22 cannot be used as it has no MAC address
Oct 14 10:05:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:38.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:38 np0005486759.ooo.test kernel: device tap6bfcee1c-22 entered promiscuous mode
Oct 14 10:05:38 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436338.9521] manager: (tap6bfcee1c-22): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Oct 14 10:05:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:38.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:38Z|00204|binding|INFO|Claiming lport 6bfcee1c-22b8-411a-9e77-f54280874243 for this chassis.
Oct 14 10:05:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:38Z|00205|binding|INFO|6bfcee1c-22b8-411a-9e77-f54280874243: Claiming unknown
Oct 14 10:05:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:38.965 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-dd3b5582-1d48-49b6-b016-99de2f2b1a36', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd3b5582-1d48-49b6-b016-99de2f2b1a36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26de2128d7d44d46983310bb259305ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c591649e-af66-4427-bcca-cd29f880129a, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=6bfcee1c-22b8-411a-9e77-f54280874243) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:38.967 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 6bfcee1c-22b8-411a-9e77-f54280874243 in datapath dd3b5582-1d48-49b6-b016-99de2f2b1a36 bound to our chassis
Oct 14 10:05:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:38.968 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network dd3b5582-1d48-49b6-b016-99de2f2b1a36 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:05:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:38.969 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[83831474-bbf6-4939-9f63-55668499b869]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:38.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:39Z|00206|binding|INFO|Setting lport 6bfcee1c-22b8-411a-9e77-f54280874243 ovn-installed in OVS
Oct 14 10:05:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:39Z|00207|binding|INFO|Setting lport 6bfcee1c-22b8-411a-9e77-f54280874243 up in Southbound
Oct 14 10:05:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:39.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:39.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:39.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:39.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:39 np0005486759.ooo.test podman[320773]: 2025-10-14 10:05:39.06906731 +0000 UTC m=+0.045899950 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:39.261 287366 INFO neutron.agent.dhcp.agent [None req-300610d3-2ae5-4ac2-ae41-d7b3cd035f65 - - - - - -] DHCP configuration for ports {'666bd5eb-44ad-4940-8995-17522eb10121'} is completed
Oct 14 10:05:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:39.419 287366 INFO neutron.agent.dhcp.agent [None req-81201541-9a9f-4591-b4e0-86525f6a6a90 - - - - - -] DHCP configuration for ports {'ae3d6870-d13a-4f2d-a1d2-7a77c22ada28'} is completed
Oct 14 10:05:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:39.566 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:37Z, description=, device_id=3b267fe5-d0e5-42a8-932d-6d0a436d6b05, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec748fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec60cd00>], id=666bd5eb-44ad-4940-8995-17522eb10121, ip_allocation=immediate, mac_address=fa:16:3e:95:f2:5d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:35Z, description=, dns_domain=, id=499c3c38-1e44-4c6c-ad9b-74f94e7028d0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--203616564, port_security_enabled=True, project_id=79966a36fd094eb0adde863a9489901f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10702, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1664, status=ACTIVE, subnets=['0178838d-db5d-427a-9b28-36b8d553257a'], tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:36Z, vlan_transparent=None, network_id=499c3c38-1e44-4c6c-ad9b-74f94e7028d0, port_security_enabled=False, project_id=79966a36fd094eb0adde863a9489901f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1685, status=DOWN, tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:37Z on network 499c3c38-1e44-4c6c-ad9b-74f94e7028d0
Oct 14 10:05:39 np0005486759.ooo.test systemd[1]: tmp-crun.r9e0lB.mount: Deactivated successfully.
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/addn_hosts - 1 addresses
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/host
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/opts
Oct 14 10:05:39 np0005486759.ooo.test podman[320854]: 2025-10-14 10:05:39.738452521 +0000 UTC m=+0.058278859 container kill daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:39 np0005486759.ooo.test podman[320871]: 
Oct 14 10:05:39 np0005486759.ooo.test podman[320871]: 2025-10-14 10:05:39.828432656 +0000 UTC m=+0.085863540 container create 4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dd3b5582-1d48-49b6-b016-99de2f2b1a36, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 10:05:39 np0005486759.ooo.test systemd[1]: Started libpod-conmon-4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec.scope.
Oct 14 10:05:39 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:05:39 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9afa860c30fd5093a283c43ce2d04a186eef830d50acdbcea2c9cae6daac72b8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:05:39 np0005486759.ooo.test podman[320871]: 2025-10-14 10:05:39.876873714 +0000 UTC m=+0.134304608 container init 4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dd3b5582-1d48-49b6-b016-99de2f2b1a36, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:39 np0005486759.ooo.test podman[320871]: 2025-10-14 10:05:39.882844666 +0000 UTC m=+0.140275570 container start 4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dd3b5582-1d48-49b6-b016-99de2f2b1a36, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:39 np0005486759.ooo.test podman[320871]: 2025-10-14 10:05:39.784318141 +0000 UTC m=+0.041749145 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq[320897]: started, version 2.85 cachesize 150
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq[320897]: DNS service limited to local subnets
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq[320897]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq[320897]: warning: no upstream servers configured
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq-dhcp[320897]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq[320897]: read /var/lib/neutron/dhcp/dd3b5582-1d48-49b6-b016-99de2f2b1a36/addn_hosts - 0 addresses
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq-dhcp[320897]: read /var/lib/neutron/dhcp/dd3b5582-1d48-49b6-b016-99de2f2b1a36/host
Oct 14 10:05:39 np0005486759.ooo.test dnsmasq-dhcp[320897]: read /var/lib/neutron/dhcp/dd3b5582-1d48-49b6-b016-99de2f2b1a36/opts
Oct 14 10:05:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:39.988 287366 INFO neutron.agent.dhcp.agent [None req-9ad99324-77e6-4128-9523-cedcca52440e - - - - - -] DHCP configuration for ports {'666bd5eb-44ad-4940-8995-17522eb10121'} is completed
Oct 14 10:05:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:40.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:40 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:40.184 287366 INFO neutron.agent.dhcp.agent [None req-811b22c9-30b0-4e08-b246-e171c12e49dd - - - - - -] DHCP configuration for ports {'f93b0f39-531a-4f42-8d38-0e3cfcb4fb7a'} is completed
Oct 14 10:05:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:41.885 287366 INFO neutron.agent.linux.ip_lib [None req-2e6a0228-17af-44d4-87e0-9b1b2b5a321f - - - - - -] Device tape3a2fcfb-ee cannot be used as it has no MAC address
Oct 14 10:05:41 np0005486759.ooo.test podman[320920]: 2025-10-14 10:05:41.895906347 +0000 UTC m=+0.050701598 container kill daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:41 np0005486759.ooo.test dnsmasq[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/addn_hosts - 0 addresses
Oct 14 10:05:41 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/host
Oct 14 10:05:41 np0005486759.ooo.test dnsmasq-dhcp[320706]: read /var/lib/neutron/dhcp/499c3c38-1e44-4c6c-ad9b-74f94e7028d0/opts
Oct 14 10:05:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:41.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:41 np0005486759.ooo.test kernel: device tape3a2fcfb-ee entered promiscuous mode
Oct 14 10:05:41 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436341.9044] manager: (tape3a2fcfb-ee): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Oct 14 10:05:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:41.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:41 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:41Z|00208|binding|INFO|Claiming lport e3a2fcfb-eef5-47e8-8364-7bbeff183052 for this chassis.
Oct 14 10:05:41 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:41Z|00209|binding|INFO|e3a2fcfb-eef5-47e8-8364-7bbeff183052: Claiming unknown
Oct 14 10:05:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:41.929 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec4:f67e/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-ff396185-31d8-403a-a897-35f3e2f7e01d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff396185-31d8-403a-a897-35f3e2f7e01d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26de2128d7d44d46983310bb259305ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=990129ee-b905-4335-8009-1d92c4b4554d, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=e3a2fcfb-eef5-47e8-8364-7bbeff183052) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:41.930 183328 INFO neutron.agent.ovn.metadata.agent [-] Port e3a2fcfb-eef5-47e8-8364-7bbeff183052 in datapath ff396185-31d8-403a-a897-35f3e2f7e01d bound to our chassis
Oct 14 10:05:41 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:41Z|00210|binding|INFO|Setting lport e3a2fcfb-eef5-47e8-8364-7bbeff183052 ovn-installed in OVS
Oct 14 10:05:41 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:41Z|00211|binding|INFO|Setting lport e3a2fcfb-eef5-47e8-8364-7bbeff183052 up in Southbound
Oct 14 10:05:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:41.932 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8e04ca7e-788a-4a25-b0be-fea8f3cbe435 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:05:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:41.932 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff396185-31d8-403a-a897-35f3e2f7e01d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:41.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:41.934 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[76691269-7d36-4658-b739-376768642431]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:41.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:41.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:42 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:42.026 2 INFO neutron.agent.securitygroups_rpc [None req-bda0dd4b-5174-47fa-a47f-d53082b6c9c7 44fda180099d4024ad942a0c70fafe77 26de2128d7d44d46983310bb259305ba - - default default] Security group member updated ['1d17bda2-b0cf-4089-bdf8-8b6724a420c4']
Oct 14 10:05:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:42Z|00212|binding|INFO|Releasing lport b8f01e0c-0193-4664-9f12-ca464132580c from this chassis (sb_readonly=0)
Oct 14 10:05:42 np0005486759.ooo.test kernel: device tapb8f01e0c-01 left promiscuous mode
Oct 14 10:05:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:42.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:42Z|00213|binding|INFO|Setting lport b8f01e0c-0193-4664-9f12-ca464132580c down in Southbound
Oct 14 10:05:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:42.086 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-499c3c38-1e44-4c6c-ad9b-74f94e7028d0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-499c3c38-1e44-4c6c-ad9b-74f94e7028d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79966a36fd094eb0adde863a9489901f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02793762-e150-46e3-97b8-d65402e3f35b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=b8f01e0c-0193-4664-9f12-ca464132580c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:42.087 183328 INFO neutron.agent.ovn.metadata.agent [-] Port b8f01e0c-0193-4664-9f12-ca464132580c in datapath 499c3c38-1e44-4c6c-ad9b-74f94e7028d0 unbound from our chassis
Oct 14 10:05:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:42.089 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 499c3c38-1e44-4c6c-ad9b-74f94e7028d0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:05:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:42.089 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[35491810-70e9-45b0-a3f5-9fb0bf26d683]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:42.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:05:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:05:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:05:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 136129 "" "Go-http-client/1.1"
Oct 14 10:05:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:05:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18111 "" "Go-http-client/1.1"
Oct 14 10:05:42 np0005486759.ooo.test podman[321000]: 
Oct 14 10:05:42 np0005486759.ooo.test podman[321000]: 2025-10-14 10:05:42.802505364 +0000 UTC m=+0.094830524 container create f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff396185-31d8-403a-a897-35f3e2f7e01d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:42 np0005486759.ooo.test systemd[1]: Started libpod-conmon-f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771.scope.
Oct 14 10:05:42 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:05:42 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbcbb13a71b2ea9617b84162f25154ddcddb16473052eaaf9f607f6227cdf100/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:05:42 np0005486759.ooo.test podman[321000]: 2025-10-14 10:05:42.863943468 +0000 UTC m=+0.156268628 container init f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff396185-31d8-403a-a897-35f3e2f7e01d, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 14 10:05:42 np0005486759.ooo.test podman[321000]: 2025-10-14 10:05:42.869277161 +0000 UTC m=+0.161602321 container start f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff396185-31d8-403a-a897-35f3e2f7e01d, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:05:42 np0005486759.ooo.test podman[321000]: 2025-10-14 10:05:42.773717496 +0000 UTC m=+0.066042676 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:05:42 np0005486759.ooo.test dnsmasq[321018]: started, version 2.85 cachesize 150
Oct 14 10:05:42 np0005486759.ooo.test dnsmasq[321018]: DNS service limited to local subnets
Oct 14 10:05:42 np0005486759.ooo.test dnsmasq[321018]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:05:42 np0005486759.ooo.test dnsmasq[321018]: warning: no upstream servers configured
Oct 14 10:05:42 np0005486759.ooo.test dnsmasq[321018]: read /var/lib/neutron/dhcp/ff396185-31d8-403a-a897-35f3e2f7e01d/addn_hosts - 0 addresses
Oct 14 10:05:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:42.914 287366 INFO neutron.agent.dhcp.agent [None req-2e6a0228-17af-44d4-87e0-9b1b2b5a321f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec683b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6836a0>], id=77e0c9c0-0e2b-4761-a63c-e683efbdcc7a, ip_allocation=immediate, mac_address=fa:16:3e:d4:d3:0d, name=tempest-NetworksIpV6TestAttrs-223479033, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:38Z, description=, dns_domain=, id=ff396185-31d8-403a-a897-35f3e2f7e01d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksIpV6TestAttrs-test-network-547744161, port_security_enabled=True, project_id=26de2128d7d44d46983310bb259305ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1689, status=ACTIVE, subnets=['f97ffe49-b351-4a88-8c9a-d79e06c17eb8'], tags=[], tenant_id=26de2128d7d44d46983310bb259305ba, updated_at=2025-10-14T10:05:39Z, vlan_transparent=None, network_id=ff396185-31d8-403a-a897-35f3e2f7e01d, port_security_enabled=True, project_id=26de2128d7d44d46983310bb259305ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['1d17bda2-b0cf-4089-bdf8-8b6724a420c4'], standard_attr_id=1710, status=DOWN, tags=[], tenant_id=26de2128d7d44d46983310bb259305ba, updated_at=2025-10-14T10:05:41Z on network ff396185-31d8-403a-a897-35f3e2f7e01d
Oct 14 10:05:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:43.055 287366 INFO neutron.agent.dhcp.agent [None req-9115c89d-208d-417b-a008-c5e0d9285f06 - - - - - -] DHCP configuration for ports {'8d2e7447-019e-470f-80cb-69726a969bec'} is completed
Oct 14 10:05:43 np0005486759.ooo.test dnsmasq[321018]: read /var/lib/neutron/dhcp/ff396185-31d8-403a-a897-35f3e2f7e01d/addn_hosts - 1 addresses
Oct 14 10:05:43 np0005486759.ooo.test podman[321037]: 2025-10-14 10:05:43.067490548 +0000 UTC m=+0.043838139 container kill f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff396185-31d8-403a-a897-35f3e2f7e01d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 10:05:43 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:43.166 2 INFO neutron.agent.securitygroups_rpc [None req-335f9723-dfc4-4870-8733-433ab95d5514 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:43.330 287366 INFO neutron.agent.dhcp.agent [None req-12caedd6-d82e-4298-b729-daa4a80fe246 - - - - - -] DHCP configuration for ports {'77e0c9c0-0e2b-4761-a63c-e683efbdcc7a'} is completed
Oct 14 10:05:43 np0005486759.ooo.test podman[321075]: 2025-10-14 10:05:43.654911007 +0000 UTC m=+0.043188648 container kill f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff396185-31d8-403a-a897-35f3e2f7e01d, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 10:05:43 np0005486759.ooo.test dnsmasq[321018]: exiting on receipt of SIGTERM
Oct 14 10:05:43 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:43.654 2 INFO neutron.agent.securitygroups_rpc [None req-11cd63f8-6cae-4a7d-afd4-04eaf1ea16cd e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:43 np0005486759.ooo.test systemd[1]: libpod-f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771.scope: Deactivated successfully.
Oct 14 10:05:43 np0005486759.ooo.test podman[321091]: 2025-10-14 10:05:43.716658291 +0000 UTC m=+0.045334664 container died f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff396185-31d8-403a-a897-35f3e2f7e01d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:43 np0005486759.ooo.test podman[321091]: 2025-10-14 10:05:43.758644822 +0000 UTC m=+0.087321175 container remove f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff396185-31d8-403a-a897-35f3e2f7e01d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:43Z|00214|binding|INFO|Releasing lport e3a2fcfb-eef5-47e8-8364-7bbeff183052 from this chassis (sb_readonly=0)
Oct 14 10:05:43 np0005486759.ooo.test kernel: device tape3a2fcfb-ee left promiscuous mode
Oct 14 10:05:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:43.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:43Z|00215|binding|INFO|Setting lport e3a2fcfb-eef5-47e8-8364-7bbeff183052 down in Southbound
Oct 14 10:05:43 np0005486759.ooo.test systemd[1]: libpod-conmon-f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771.scope: Deactivated successfully.
Oct 14 10:05:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:43.777 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec4:f67e/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-ff396185-31d8-403a-a897-35f3e2f7e01d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff396185-31d8-403a-a897-35f3e2f7e01d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26de2128d7d44d46983310bb259305ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=990129ee-b905-4335-8009-1d92c4b4554d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=e3a2fcfb-eef5-47e8-8364-7bbeff183052) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:43.778 183328 INFO neutron.agent.ovn.metadata.agent [-] Port e3a2fcfb-eef5-47e8-8364-7bbeff183052 in datapath ff396185-31d8-403a-a897-35f3e2f7e01d unbound from our chassis
Oct 14 10:05:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:43.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:43.780 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff396185-31d8-403a-a897-35f3e2f7e01d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:43.781 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[108eaca0-00f7-4a03-b44b-6a59785773b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-bbcbb13a71b2ea9617b84162f25154ddcddb16473052eaaf9f607f6227cdf100-merged.mount: Deactivated successfully.
Oct 14 10:05:43 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f7fa99d85d7039056e41c4643334609f5fa5df3ac7b06120e119ee6b50d8b771-userdata-shm.mount: Deactivated successfully.
Oct 14 10:05:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:43.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:05:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:05:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:05:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:05:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:05:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:05:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:05:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:05:44 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2dff396185\x2d31d8\x2d403a\x2da897\x2d35f3e2f7e01d.mount: Deactivated successfully.
Oct 14 10:05:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:44.014 287366 INFO neutron.agent.dhcp.agent [None req-2a36591d-d0a0-439f-9ac7-53b85c4f51ec - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:05:44 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:44.522 2 INFO neutron.agent.securitygroups_rpc [None req-3c9c9f8f-3cfd-44cc-9153-2aab29c6d043 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:44.703 287366 INFO neutron.agent.linux.ip_lib [None req-3b12d7a1-e6ba-4eaf-860b-b8245367525b - - - - - -] Device tap94ddb033-7b cannot be used as it has no MAC address
Oct 14 10:05:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:44.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:44 np0005486759.ooo.test kernel: device tap94ddb033-7b entered promiscuous mode
Oct 14 10:05:44 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436344.7266] manager: (tap94ddb033-7b): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Oct 14 10:05:44 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:44Z|00216|binding|INFO|Claiming lport 94ddb033-7b68-4611-a364-93abba56085b for this chassis.
Oct 14 10:05:44 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:44Z|00217|binding|INFO|94ddb033-7b68-4611-a364-93abba56085b: Claiming unknown
Oct 14 10:05:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:44.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:44.740 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-980f946d-2066-4c56-a997-9ce99dc83806', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-980f946d-2066-4c56-a997-9ce99dc83806', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a00be71437c46aa93ad65c658262d59', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d28d608e-4f39-45dc-bf0e-59c674c60314, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=94ddb033-7b68-4611-a364-93abba56085b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:44.743 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 94ddb033-7b68-4611-a364-93abba56085b in datapath 980f946d-2066-4c56-a997-9ce99dc83806 bound to our chassis
Oct 14 10:05:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:44.747 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port 596573dc-ef9b-46f1-94f2-0b30b7d99bef IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:05:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:44.747 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 980f946d-2066-4c56-a997-9ce99dc83806, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:05:44 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:44.749 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[3340c0fd-2a6e-4037-9626-b2de5a0ab69c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:44.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:44 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:44Z|00218|binding|INFO|Setting lport 94ddb033-7b68-4611-a364-93abba56085b ovn-installed in OVS
Oct 14 10:05:44 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:44Z|00219|binding|INFO|Setting lport 94ddb033-7b68-4611-a364-93abba56085b up in Southbound
Oct 14 10:05:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:44.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap94ddb033-7b: No such device
Oct 14 10:05:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:44.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:45.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:45 np0005486759.ooo.test podman[321191]: 2025-10-14 10:05:45.364321335 +0000 UTC m=+0.043450876 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 10:05:45 np0005486759.ooo.test podman[321231]: 
Oct 14 10:05:45 np0005486759.ooo.test podman[321231]: 2025-10-14 10:05:45.532180446 +0000 UTC m=+0.062422615 container create 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:45 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:45.534 2 INFO neutron.agent.securitygroups_rpc [None req-0ffc520b-7520-4476-9942-2c29e649afb6 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:45 np0005486759.ooo.test systemd[1]: Started libpod-conmon-6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193.scope.
Oct 14 10:05:45 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:05:45 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee08651271b4fc2776944ff502bc264202af6c754f4f786e349f67222249bd38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:05:45 np0005486759.ooo.test podman[321231]: 2025-10-14 10:05:45.5929484 +0000 UTC m=+0.123190559 container init 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:45 np0005486759.ooo.test podman[321231]: 2025-10-14 10:05:45.598600362 +0000 UTC m=+0.128842521 container start 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:05:45 np0005486759.ooo.test podman[321231]: 2025-10-14 10:05:45.502228053 +0000 UTC m=+0.032470242 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq[321250]: started, version 2.85 cachesize 150
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq[321250]: DNS service limited to local subnets
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq[321250]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq[321250]: warning: no upstream servers configured
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq-dhcp[321250]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 0 addresses
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:45.666 287366 INFO neutron.agent.dhcp.agent [None req-a3628f0e-f668-48ad-b8b1-f507ce76c8d5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:44Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7aed60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7ae640>], id=67561450-5384-4aae-b471-aac29334816f, ip_allocation=immediate, mac_address=fa:16:3e:27:82:6c, name=tempest-AllowedAddressPairTestJSON-1314859379, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:41Z, description=, dns_domain=, id=980f946d-2066-4c56-a997-9ce99dc83806, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-461773809, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19490, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['50e5e092-bc7c-40c0-a767-39624498db48'], tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:42Z, vlan_transparent=None, network_id=980f946d-2066-4c56-a997-9ce99dc83806, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['84c95655-fba8-47b6-bbda-854b4ef04515'], standard_attr_id=1723, status=DOWN, tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:44Z on network 980f946d-2066-4c56-a997-9ce99dc83806
Oct 14 10:05:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:45.846 287366 INFO neutron.agent.dhcp.agent [None req-75c48c05-360b-428d-acc3-ab1cf2308d45 - - - - - -] DHCP configuration for ports {'6e5f8c8a-addb-4764-a233-607d955b4833'} is completed
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 1 addresses
Oct 14 10:05:45 np0005486759.ooo.test podman[321268]: 2025-10-14 10:05:45.907350641 +0000 UTC m=+0.054069380 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:45 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:45 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:45.921 2 INFO neutron.agent.securitygroups_rpc [None req-38861387-cb77-4d43-8376-f7eaa7789d32 44fda180099d4024ad942a0c70fafe77 26de2128d7d44d46983310bb259305ba - - default default] Security group member updated ['1d17bda2-b0cf-4089-bdf8-8b6724a420c4']
Oct 14 10:05:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:46.324 287366 INFO neutron.agent.dhcp.agent [None req-c7d0199f-852e-426b-8ddd-04fd1a7ffe70 - - - - - -] DHCP configuration for ports {'67561450-5384-4aae-b471-aac29334816f'} is completed
Oct 14 10:05:46 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:46.674 2 INFO neutron.agent.securitygroups_rpc [None req-640fe84e-b499-4d33-93c1-d774d0113277 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:46.802 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c83a0>], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:45Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d5580>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c8be0>], id=5b5e375b-5280-4c6b-a6be-82531eeb975b, ip_allocation=immediate, mac_address=fa:16:3e:a1:c6:7f, name=tempest-AllowedAddressPairTestJSON-1955542461, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:41Z, description=, dns_domain=, id=980f946d-2066-4c56-a997-9ce99dc83806, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-461773809, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19490, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['50e5e092-bc7c-40c0-a767-39624498db48'], tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:42Z, vlan_transparent=None, network_id=980f946d-2066-4c56-a997-9ce99dc83806, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['84c95655-fba8-47b6-bbda-854b4ef04515'], standard_attr_id=1732, status=DOWN, tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:46Z on network 980f946d-2066-4c56-a997-9ce99dc83806
Oct 14 10:05:46 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 2 addresses
Oct 14 10:05:46 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:46 np0005486759.ooo.test podman[321308]: 2025-10-14 10:05:46.997007273 +0000 UTC m=+0.052095530 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:46 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:05:47 np0005486759.ooo.test podman[321323]: 2025-10-14 10:05:47.118433197 +0000 UTC m=+0.085330364 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:05:47 np0005486759.ooo.test podman[321323]: 2025-10-14 10:05:47.127225506 +0000 UTC m=+0.094122723 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:05:47 np0005486759.ooo.test podman[321324]: 2025-10-14 10:05:47.097076906 +0000 UTC m=+0.064430807 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 14 10:05:47 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:47.207 287366 INFO neutron.agent.dhcp.agent [None req-a544fa38-24ed-42ae-889b-a005c237463e - - - - - -] DHCP configuration for ports {'5b5e375b-5280-4c6b-a6be-82531eeb975b'} is completed
Oct 14 10:05:47 np0005486759.ooo.test podman[321325]: 2025-10-14 10:05:47.217707406 +0000 UTC m=+0.179679753 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:05:47 np0005486759.ooo.test podman[321325]: 2025-10-14 10:05:47.231250719 +0000 UTC m=+0.193223026 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:05:47 np0005486759.ooo.test podman[321324]: 2025-10-14 10:05:47.281569184 +0000 UTC m=+0.248923075 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:05:47 np0005486759.ooo.test podman[321326]: 2025-10-14 10:05:47.182919425 +0000 UTC m=+0.143140978 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: tmp-crun.CkINxC.mount: Deactivated successfully.
Oct 14 10:05:47 np0005486759.ooo.test podman[321326]: 2025-10-14 10:05:47.368382632 +0000 UTC m=+0.328604175 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 14 10:05:47 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:05:47 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:47.676 2 INFO neutron.agent.securitygroups_rpc [None req-bdac2720-e2c0-4f65-9490-cbd87fbb312f e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:48 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:48.066 2 INFO neutron.agent.securitygroups_rpc [None req-ed8dcd2a-ca68-4524-b3a9-0026d79781ec fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:05:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:48.070 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:47Z, description=, device_id=3b267fe5-d0e5-42a8-932d-6d0a436d6b05, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f2400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7f27c0>], id=6b3496f9-e9f9-4f73-9f61-6890790e85ab, ip_allocation=immediate, mac_address=fa:16:3e:8d:c9:de, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:33Z, description=, dns_domain=, id=088bccb5-24b9-4ee0-b0b7-043a59f5de32, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-787701176, port_security_enabled=True, project_id=79966a36fd094eb0adde863a9489901f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1645, status=ACTIVE, subnets=['5dfd4e30-9709-4eac-9218-2224be68d607'], tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:34Z, vlan_transparent=None, network_id=088bccb5-24b9-4ee0-b0b7-043a59f5de32, port_security_enabled=False, project_id=79966a36fd094eb0adde863a9489901f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1742, status=DOWN, tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:47Z on network 088bccb5-24b9-4ee0-b0b7-043a59f5de32
Oct 14 10:05:48 np0005486759.ooo.test podman[321421]: 2025-10-14 10:05:48.245022535 +0000 UTC m=+0.050000816 container kill b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:05:48 np0005486759.ooo.test dnsmasq[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/addn_hosts - 1 addresses
Oct 14 10:05:48 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/host
Oct 14 10:05:48 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/opts
Oct 14 10:05:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:48.478 287366 INFO neutron.agent.dhcp.agent [None req-7b6a887c-4410-4443-bc76-471407290261 - - - - - -] DHCP configuration for ports {'6b3496f9-e9f9-4f73-9f61-6890790e85ab'} is completed
Oct 14 10:05:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:48.655 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:47Z, description=, device_id=9b60b994-9044-4800-b0f9-cc237b518197, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5feb20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5feac0>], id=081d55cc-ec71-455c-8b1b-968561c6d793, ip_allocation=immediate, mac_address=fa:16:3e:68:2c:98, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1744, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:05:48Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:05:48 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:05:48 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:48 np0005486759.ooo.test podman[321459]: 2025-10-14 10:05:48.832162526 +0000 UTC m=+0.059367452 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 10:05:48 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:48 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:48.886 2 INFO neutron.agent.securitygroups_rpc [None req-a520b2a7-ce00-4bf5-a9c0-72f9e81769d4 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:48 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:48.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:49 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:49.063 287366 INFO neutron.agent.dhcp.agent [None req-bd4b7b83-1a90-42f7-bd38-eb075e5d2491 - - - - - -] DHCP configuration for ports {'081d55cc-ec71-455c-8b1b-968561c6d793'} is completed
Oct 14 10:05:49 np0005486759.ooo.test podman[321498]: 2025-10-14 10:05:49.225695191 +0000 UTC m=+0.060206758 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 14 10:05:49 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 1 addresses
Oct 14 10:05:49 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:49 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53438 DF PROTO=TCP SPT=34346 DPT=9102 SEQ=1289649417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA5A8380000000001030307) 
Oct 14 10:05:49 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:49.855 2 INFO neutron.agent.securitygroups_rpc [None req-0015e24f-f378-493a-9321-a555202dbe45 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:50.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:50 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:50.223 2 INFO neutron.agent.securitygroups_rpc [None req-22df02c5-8d63-4c8e-97ad-5e95419e0604 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:50 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:50.265 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:50Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec61a820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec61a6a0>], id=f2d07f14-45da-4be7-b227-d7808757372c, ip_allocation=immediate, mac_address=fa:16:3e:52:c4:50, name=tempest-AllowedAddressPairTestJSON-1016434460, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:41Z, description=, dns_domain=, id=980f946d-2066-4c56-a997-9ce99dc83806, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-461773809, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19490, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['50e5e092-bc7c-40c0-a767-39624498db48'], tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:42Z, vlan_transparent=None, network_id=980f946d-2066-4c56-a997-9ce99dc83806, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['84c95655-fba8-47b6-bbda-854b4ef04515'], standard_attr_id=1754, status=DOWN, tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:50Z on network 980f946d-2066-4c56-a997-9ce99dc83806
Oct 14 10:05:50 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 2 addresses
Oct 14 10:05:50 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:50 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:50 np0005486759.ooo.test podman[321535]: 2025-10-14 10:05:50.467186644 +0000 UTC m=+0.056288298 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:05:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53439 DF PROTO=TCP SPT=34346 DPT=9102 SEQ=1289649417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA5AC410000000001030307) 
Oct 14 10:05:50 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:50.774 287366 INFO neutron.agent.dhcp.agent [None req-c35b8c9e-77cd-401b-9354-6b4c20aa0841 - - - - - -] DHCP configuration for ports {'f2d07f14-45da-4be7-b227-d7808757372c'} is completed
Oct 14 10:05:50 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:50.820 2 INFO neutron.agent.securitygroups_rpc [None req-0df6d1b0-d511-4a60-9760-2c9026cddfa6 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:50 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:50.911 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:47Z, description=, device_id=3b267fe5-d0e5-42a8-932d-6d0a436d6b05, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec618040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec618070>], id=6b3496f9-e9f9-4f73-9f61-6890790e85ab, ip_allocation=immediate, mac_address=fa:16:3e:8d:c9:de, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:33Z, description=, dns_domain=, id=088bccb5-24b9-4ee0-b0b7-043a59f5de32, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-787701176, port_security_enabled=True, project_id=79966a36fd094eb0adde863a9489901f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1645, status=ACTIVE, subnets=['5dfd4e30-9709-4eac-9218-2224be68d607'], tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:34Z, vlan_transparent=None, network_id=088bccb5-24b9-4ee0-b0b7-043a59f5de32, port_security_enabled=False, project_id=79966a36fd094eb0adde863a9489901f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1742, status=DOWN, tags=[], tenant_id=79966a36fd094eb0adde863a9489901f, updated_at=2025-10-14T10:05:47Z on network 088bccb5-24b9-4ee0-b0b7-043a59f5de32
Oct 14 10:05:51 np0005486759.ooo.test dnsmasq[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/addn_hosts - 1 addresses
Oct 14 10:05:51 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/host
Oct 14 10:05:51 np0005486759.ooo.test podman[321571]: 2025-10-14 10:05:51.104120364 +0000 UTC m=+0.062951921 container kill b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:05:51 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/opts
Oct 14 10:05:51 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:51.513 287366 INFO neutron.agent.dhcp.agent [None req-f7b9ec70-f145-4472-98c0-82a518b191b4 - - - - - -] DHCP configuration for ports {'6b3496f9-e9f9-4f73-9f61-6890790e85ab'} is completed
Oct 14 10:05:51 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:51.871 2 INFO neutron.agent.securitygroups_rpc [None req-1002c515-021c-46d2-a29e-ec24b9db67e1 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:05:52 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:52.010 2 INFO neutron.agent.securitygroups_rpc [None req-1002c515-021c-46d2-a29e-ec24b9db67e1 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:05:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53440 DF PROTO=TCP SPT=34346 DPT=9102 SEQ=1289649417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA5B4410000000001030307) 
Oct 14 10:05:53 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:53.064 2 INFO neutron.agent.securitygroups_rpc [None req-c7119749-ded4-4ad3-a26f-35bba1b62f29 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:05:53 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:53.579 2 INFO neutron.agent.securitygroups_rpc [None req-e6464181-3796-4be5-9e62-2b682722b118 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:53 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 1 addresses
Oct 14 10:05:53 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:53 np0005486759.ooo.test podman[321607]: 2025-10-14 10:05:53.833125507 +0000 UTC m=+0.044988324 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:53 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:53 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:53.864 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:53.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:53 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:53.866 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:05:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:53.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:54 np0005486759.ooo.test dnsmasq[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/addn_hosts - 0 addresses
Oct 14 10:05:54 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/host
Oct 14 10:05:54 np0005486759.ooo.test dnsmasq-dhcp[320550]: read /var/lib/neutron/dhcp/088bccb5-24b9-4ee0-b0b7-043a59f5de32/opts
Oct 14 10:05:54 np0005486759.ooo.test podman[321643]: 2025-10-14 10:05:54.027767294 +0000 UTC m=+0.038959109 container kill b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 10:05:54 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:54Z|00220|binding|INFO|Releasing lport 861a2076-8663-4c08-b722-5fccfccef2b4 from this chassis (sb_readonly=0)
Oct 14 10:05:54 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:05:54Z|00221|binding|INFO|Setting lport 861a2076-8663-4c08-b722-5fccfccef2b4 down in Southbound
Oct 14 10:05:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:54.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:54 np0005486759.ooo.test kernel: device tap861a2076-86 left promiscuous mode
Oct 14 10:05:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:54.149 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-088bccb5-24b9-4ee0-b0b7-043a59f5de32', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-088bccb5-24b9-4ee0-b0b7-043a59f5de32', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79966a36fd094eb0adde863a9489901f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4d10b8ef-6175-448f-9156-671086675b82, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=861a2076-8663-4c08-b722-5fccfccef2b4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:05:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:54.151 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 861a2076-8663-4c08-b722-5fccfccef2b4 in datapath 088bccb5-24b9-4ee0-b0b7-043a59f5de32 unbound from our chassis
Oct 14 10:05:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:54.152 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 088bccb5-24b9-4ee0-b0b7-043a59f5de32 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:05:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:54.153 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[59449e70-19b5-4133-b2da-e51878a1ccd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:05:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:54.154 2 INFO neutron.agent.securitygroups_rpc [None req-3dcc20a9-f290-4800-8c16-06e374056277 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:05:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:54.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:54.171 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:05:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:54.172 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:05:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:54.173 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:05:54 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:05:54 np0005486759.ooo.test podman[321682]: 2025-10-14 10:05:54.509760188 +0000 UTC m=+0.060379974 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:05:54 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:54 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:54 np0005486759.ooo.test systemd[1]: tmp-crun.wJ6r7c.mount: Deactivated successfully.
Oct 14 10:05:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:05:54 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:05:54 np0005486759.ooo.test podman[321695]: 2025-10-14 10:05:54.644152878 +0000 UTC m=+0.105248102 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64)
Oct 14 10:05:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:54.656 2 INFO neutron.agent.securitygroups_rpc [None req-35caed98-e753-477e-a24a-7bb7b67b04e3 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:54.658 2 INFO neutron.agent.securitygroups_rpc [None req-305c963e-a1d0-4dc4-871a-118a38b244a3 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:54 np0005486759.ooo.test podman[321695]: 2025-10-14 10:05:54.679782845 +0000 UTC m=+0.140878089 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 14 10:05:54 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:05:54 np0005486759.ooo.test sshd[321734]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:05:54 np0005486759.ooo.test podman[321719]: 2025-10-14 10:05:54.773331268 +0000 UTC m=+0.127913683 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:54 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:54.792 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:54Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec784580>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed0bfdf0>], id=68788650-db3a-4799-bff0-a3c3674f9292, ip_allocation=immediate, mac_address=fa:16:3e:9f:0f:8d, name=tempest-AllowedAddressPairTestJSON-1879034163, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:41Z, description=, dns_domain=, id=980f946d-2066-4c56-a997-9ce99dc83806, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-461773809, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19490, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['50e5e092-bc7c-40c0-a767-39624498db48'], tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:42Z, vlan_transparent=None, network_id=980f946d-2066-4c56-a997-9ce99dc83806, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['84c95655-fba8-47b6-bbda-854b4ef04515'], standard_attr_id=1764, status=DOWN, tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:54Z on network 980f946d-2066-4c56-a997-9ce99dc83806
Oct 14 10:05:54 np0005486759.ooo.test sshd[321734]: error: kex_exchange_identification: Connection closed by remote host
Oct 14 10:05:54 np0005486759.ooo.test sshd[321734]: Connection closed by 196.251.114.29 port 51824
Oct 14 10:05:54 np0005486759.ooo.test podman[321719]: 2025-10-14 10:05:54.874650759 +0000 UTC m=+0.229233174 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 14 10:05:54 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:05:55 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 2 addresses
Oct 14 10:05:55 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:55 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:55 np0005486759.ooo.test podman[321766]: 2025-10-14 10:05:55.007066169 +0000 UTC m=+0.057922218 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009)
Oct 14 10:05:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:55.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:55 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:55.272 287366 INFO neutron.agent.dhcp.agent [None req-c365478b-fb66-4573-8498-0b59ef68e35d - - - - - -] DHCP configuration for ports {'68788650-db3a-4799-bff0-a3c3674f9292'} is completed
Oct 14 10:05:55 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:55.560 2 INFO neutron.agent.securitygroups_rpc [None req-e7068338-b9cf-41e3-913e-99cc19eb5d5c e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53441 DF PROTO=TCP SPT=34346 DPT=9102 SEQ=1289649417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA5C4010000000001030307) 
Oct 14 10:05:56 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:05:56.868 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:05:57 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:57.176 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:56Z, description=, device_id=b0c4cefc-9498-4a5f-8494-a55bb8fc724a, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec624f40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c87c0>], id=6a809a4e-1168-41bf-bae2-08efdd803cbf, ip_allocation=immediate, mac_address=fa:16:3e:df:21:94, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1770, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:05:56Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:05:57 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:05:57 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:05:57 np0005486759.ooo.test podman[321805]: 2025-10-14 10:05:57.426144415 +0000 UTC m=+0.063476897 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:05:57 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:05:57 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:57.464 2 INFO neutron.agent.securitygroups_rpc [None req-d8ed8a19-15a7-46ae-bc97-55aed581a51e bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:57 np0005486759.ooo.test dnsmasq[320706]: exiting on receipt of SIGTERM
Oct 14 10:05:57 np0005486759.ooo.test podman[321840]: 2025-10-14 10:05:57.606564109 +0000 UTC m=+0.051131631 container kill daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 10:05:57 np0005486759.ooo.test systemd[1]: libpod-daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6.scope: Deactivated successfully.
Oct 14 10:05:57 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:57.653 287366 INFO neutron.agent.dhcp.agent [None req-42b5ef9d-7341-4d64-9e08-afe33e55aa99 - - - - - -] DHCP configuration for ports {'6a809a4e-1168-41bf-bae2-08efdd803cbf'} is completed
Oct 14 10:05:57 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 1 addresses
Oct 14 10:05:57 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:57 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:57 np0005486759.ooo.test podman[321879]: 2025-10-14 10:05:57.704015922 +0000 UTC m=+0.061338272 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:05:57 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:57.776 2 INFO neutron.agent.securitygroups_rpc [None req-1d065575-6c40-4537-a1ef-41b56a6e9b5e fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:05:57 np0005486759.ooo.test podman[321869]: 2025-10-14 10:05:57.791874583 +0000 UTC m=+0.168806972 container died daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 10:05:57 np0005486759.ooo.test podman[321869]: 2025-10-14 10:05:57.833399999 +0000 UTC m=+0.210332348 container cleanup daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:05:57 np0005486759.ooo.test systemd[1]: libpod-conmon-daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6.scope: Deactivated successfully.
Oct 14 10:05:57 np0005486759.ooo.test podman[321870]: 2025-10-14 10:05:57.86191438 +0000 UTC m=+0.228037589 container remove daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-499c3c38-1e44-4c6c-ad9b-74f94e7028d0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:05:58 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:58.084 2 INFO neutron.agent.securitygroups_rpc [None req-2f742139-41ab-4bb6-b881-96da60f6c9a6 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:05:58 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:58.362 287366 INFO neutron.agent.dhcp.agent [None req-187fa81e-bb51-4305-9ad2-cf97ac6b63be - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:05:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-6650d53946a035da6a32a6f142e82c6cf1e8f40af95638d2349f1f5ce44162ac-merged.mount: Deactivated successfully.
Oct 14 10:05:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-daea9a4aa844c6242a214a88644d6021dab7749fcadd77bc95764de93fed45f6-userdata-shm.mount: Deactivated successfully.
Oct 14 10:05:58 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d499c3c38\x2d1e44\x2d4c6c\x2dad9b\x2d74f94e7028d0.mount: Deactivated successfully.
Oct 14 10:05:58 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:58.774 2 INFO neutron.agent.securitygroups_rpc [None req-a465a34d-fb51-434e-aad0-f07fe28d0e25 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:05:58 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:05:58.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:05:59 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:05:59.449 2 INFO neutron.agent.securitygroups_rpc [None req-593ae95f-b1c3-47fe-b125-9d8fc7246947 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:05:59 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:05:59.519 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:05:58Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec683e80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec683d60>], id=1fcdaf11-71e8-483a-a6e2-d3fd696fc51f, ip_allocation=immediate, mac_address=fa:16:3e:20:a7:b7, name=tempest-AllowedAddressPairTestJSON-1602252553, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:41Z, description=, dns_domain=, id=980f946d-2066-4c56-a997-9ce99dc83806, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-461773809, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19490, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['50e5e092-bc7c-40c0-a767-39624498db48'], tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:42Z, vlan_transparent=None, network_id=980f946d-2066-4c56-a997-9ce99dc83806, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['84c95655-fba8-47b6-bbda-854b4ef04515'], standard_attr_id=1781, status=DOWN, tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:58Z on network 980f946d-2066-4c56-a997-9ce99dc83806
Oct 14 10:05:59 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 2 addresses
Oct 14 10:05:59 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:05:59 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:05:59 np0005486759.ooo.test podman[321934]: 2025-10-14 10:05:59.775727483 +0000 UTC m=+0.066258633 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:06:00 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:00.080 287366 INFO neutron.agent.dhcp.agent [None req-08656eaa-43a7-4d69-b32a-3d92153258be - - - - - -] DHCP configuration for ports {'1fcdaf11-71e8-483a-a6e2-d3fd696fc51f'} is completed
Oct 14 10:06:00 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:00.106 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:00 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:00.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:00 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:00.340 2 INFO neutron.agent.securitygroups_rpc [None req-b409906d-13e4-4b2c-8538-f5936c743197 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:01 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:01Z|00222|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:06:01 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:01.223 2 INFO neutron.agent.securitygroups_rpc [None req-584612f9-96a7-444a-b5cb-5bb46ad7b77f bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:06:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:01.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:01 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:01.432 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:00Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5ba280>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5ba130>], id=411db94d-3351-43bd-a698-8f24a258af25, ip_allocation=immediate, mac_address=fa:16:3e:bd:4c:56, name=tempest-AllowedAddressPairTestJSON-60936842, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:05:41Z, description=, dns_domain=, id=980f946d-2066-4c56-a997-9ce99dc83806, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-461773809, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19490, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['50e5e092-bc7c-40c0-a767-39624498db48'], tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:05:42Z, vlan_transparent=None, network_id=980f946d-2066-4c56-a997-9ce99dc83806, port_security_enabled=True, project_id=4a00be71437c46aa93ad65c658262d59, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['84c95655-fba8-47b6-bbda-854b4ef04515'], standard_attr_id=1784, status=DOWN, tags=[], tenant_id=4a00be71437c46aa93ad65c658262d59, updated_at=2025-10-14T10:06:00Z on network 980f946d-2066-4c56-a997-9ce99dc83806
Oct 14 10:06:01 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 3 addresses
Oct 14 10:06:01 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:06:01 np0005486759.ooo.test podman[321971]: 2025-10-14 10:06:01.621061527 +0000 UTC m=+0.040941850 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 10:06:01 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:06:01 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:01.825 287366 INFO neutron.agent.dhcp.agent [None req-19266067-2f5d-45cf-81b4-dc41e6f18a30 - - - - - -] DHCP configuration for ports {'411db94d-3351-43bd-a698-8f24a258af25'} is completed
Oct 14 10:06:02 np0005486759.ooo.test dnsmasq[320550]: exiting on receipt of SIGTERM
Oct 14 10:06:02 np0005486759.ooo.test podman[322010]: 2025-10-14 10:06:02.15286894 +0000 UTC m=+0.107210452 container kill b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:06:02 np0005486759.ooo.test systemd[1]: libpod-b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990.scope: Deactivated successfully.
Oct 14 10:06:02 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:06:02 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:02 np0005486759.ooo.test podman[322036]: 2025-10-14 10:06:02.188822237 +0000 UTC m=+0.071951186 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 14 10:06:02 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:02 np0005486759.ooo.test systemd[1]: tmp-crun.1VOHlb.mount: Deactivated successfully.
Oct 14 10:06:02 np0005486759.ooo.test podman[322050]: 2025-10-14 10:06:02.239380409 +0000 UTC m=+0.062629611 container died b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:06:02 np0005486759.ooo.test podman[322050]: 2025-10-14 10:06:02.276975646 +0000 UTC m=+0.100224768 container remove b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-088bccb5-24b9-4ee0-b0b7-043a59f5de32, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:06:02 np0005486759.ooo.test systemd[1]: libpod-conmon-b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990.scope: Deactivated successfully.
Oct 14 10:06:02 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:02.584 287366 INFO neutron.agent.dhcp.agent [None req-5f1e97a3-d265-41b9-b86b-4252a0d4e045 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:02 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:02.596 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-e2d49ee205966d34f85f889204e0ca2e1924b88f7b584bcf89f2e9a30765a329-merged.mount: Deactivated successfully.
Oct 14 10:06:02 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9f0a6714b5f0e3b18c1be6b05d35dd350a82685ccf4d58f2c924cee756ce990-userdata-shm.mount: Deactivated successfully.
Oct 14 10:06:02 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d088bccb5\x2d24b9\x2d4ee0\x2db0b7\x2d043a59f5de32.mount: Deactivated successfully.
Oct 14 10:06:03 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:03.530 2 INFO neutron.agent.securitygroups_rpc [None req-17ae0919-c680-413e-a4cb-b9a29e033677 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:03 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:03.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:04 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:06:04 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:04.340 2 INFO neutron.agent.securitygroups_rpc [None req-b1570957-a549-4a53-a2bc-3afdc7a0203a 44fda180099d4024ad942a0c70fafe77 26de2128d7d44d46983310bb259305ba - - default default] Security group member updated ['1d17bda2-b0cf-4089-bdf8-8b6724a420c4']
Oct 14 10:06:04 np0005486759.ooo.test podman[322086]: 2025-10-14 10:06:04.415562546 +0000 UTC m=+0.083805167 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 10:06:04 np0005486759.ooo.test podman[322086]: 2025-10-14 10:06:04.424249291 +0000 UTC m=+0.092491892 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 14 10:06:04 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:06:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:04.494 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:04.981 287366 INFO neutron.agent.linux.ip_lib [None req-a8d00ee0-f4de-4b7e-a24c-cdae53446ff3 - - - - - -] Device tap5dc52f2b-a5 cannot be used as it has no MAC address
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test kernel: device tap5dc52f2b-a5 entered promiscuous mode
Oct 14 10:06:05 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436365.0596] manager: (tap5dc52f2b-a5): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Oct 14 10:06:05 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:05Z|00223|binding|INFO|Claiming lport 5dc52f2b-a554-4cd5-a532-8689d984ef19 for this chassis.
Oct 14 10:06:05 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:05Z|00224|binding|INFO|5dc52f2b-a554-4cd5-a532-8689d984ef19: Claiming unknown
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test systemd-udevd[322115]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:05.089 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:04Z, description=, device_id=e9dcb4a8-794f-4e33-9f24-592b95faaf41, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec748040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec698cd0>], id=2620f6a3-8f02-4dd1-a139-f90259e83905, ip_allocation=immediate, mac_address=fa:16:3e:32:5b:bd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1792, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:06:04Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:06:05 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:05.089 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f80bd80798c4b65b9ca3457716b0229', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f1e61eac-28ba-4788-bacd-5bc624203f5c, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=5dc52f2b-a554-4cd5-a532-8689d984ef19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:05 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:05.091 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 5dc52f2b-a554-4cd5-a532-8689d984ef19 in datapath a344373a-cb0d-4dbe-9d18-78d6ba8e73c4 bound to our chassis
Oct 14 10:06:05 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:05.093 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a344373a-cb0d-4dbe-9d18-78d6ba8e73c4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:06:05 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:05.094 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[02756c8f-e446-47c2-a13e-ffe2ab6a4171]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:05Z|00225|binding|INFO|Setting lport 5dc52f2b-a554-4cd5-a532-8689d984ef19 ovn-installed in OVS
Oct 14 10:06:05 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:05Z|00226|binding|INFO|Setting lport 5dc52f2b-a554-4cd5-a532-8689d984ef19 up in Southbound
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5dc52f2b-a5: No such device
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:05.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:05 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:05.322 2 INFO neutron.agent.securitygroups_rpc [None req-950e2da7-a9c4-4232-992e-619c984d68a5 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:06:05 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:06:05 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:05 np0005486759.ooo.test podman[322161]: 2025-10-14 10:06:05.335903083 +0000 UTC m=+0.051145182 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:05 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:05 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:05.506 2 INFO neutron.agent.securitygroups_rpc [None req-7dcc006f-d3e9-49a7-9267-84fd40ea1218 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:05 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:05.615 287366 INFO neutron.agent.dhcp.agent [None req-5267d924-5087-4327-9bbb-178006b7bb42 - - - - - -] DHCP configuration for ports {'2620f6a3-8f02-4dd1-a139-f90259e83905'} is completed
Oct 14 10:06:05 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 2 addresses
Oct 14 10:06:05 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:06:05 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:06:05 np0005486759.ooo.test podman[322214]: 2025-10-14 10:06:05.659080991 +0000 UTC m=+0.059550807 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:06:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:06:05 np0005486759.ooo.test systemd[1]: tmp-crun.lEgyqJ.mount: Deactivated successfully.
Oct 14 10:06:05 np0005486759.ooo.test podman[322233]: 2025-10-14 10:06:05.780734792 +0000 UTC m=+0.089256333 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:06:05 np0005486759.ooo.test podman[322233]: 2025-10-14 10:06:05.792358608 +0000 UTC m=+0.100880139 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:06:05 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:06:05 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:05.968 2 INFO neutron.agent.securitygroups_rpc [None req-dbb7a938-0d90-42fa-b919-1dd09dc6d15a 44fda180099d4024ad942a0c70fafe77 26de2128d7d44d46983310bb259305ba - - default default] Security group member updated ['1d17bda2-b0cf-4089-bdf8-8b6724a420c4']
Oct 14 10:06:05 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:05.990 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:06 np0005486759.ooo.test podman[322282]: 
Oct 14 10:06:06 np0005486759.ooo.test podman[322282]: 2025-10-14 10:06:06.030565454 +0000 UTC m=+0.091082540 container create 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 10:06:06 np0005486759.ooo.test systemd[1]: Started libpod-conmon-2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba.scope.
Oct 14 10:06:06 np0005486759.ooo.test podman[322282]: 2025-10-14 10:06:05.978189436 +0000 UTC m=+0.038706572 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:06:06 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:06:06 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b1b1820e5094ddfa4b11ff05727acf01a3f040a89b631a1d0808b47f858cb6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:06:06 np0005486759.ooo.test podman[322282]: 2025-10-14 10:06:06.098448905 +0000 UTC m=+0.158965951 container init 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:06:06 np0005486759.ooo.test podman[322282]: 2025-10-14 10:06:06.106914173 +0000 UTC m=+0.167431229 container start 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[322300]: started, version 2.85 cachesize 150
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[322300]: DNS service limited to local subnets
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[322300]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[322300]: warning: no upstream servers configured
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[322300]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/addn_hosts - 0 addresses
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/host
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/opts
Oct 14 10:06:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:06.158 287366 INFO neutron.agent.dhcp.agent [None req-a8d00ee0-f4de-4b7e-a24c-cdae53446ff3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:04Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec799790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec799550>], id=1390c940-da84-43bb-8269-e1ebce4f3a36, ip_allocation=immediate, mac_address=fa:16:3e:6e:f5:9f, name=tempest-PortsIpV6TestJSON-1813871855, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:06:00Z, description=, dns_domain=, id=a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-627074281, port_security_enabled=True, project_id=7f80bd80798c4b65b9ca3457716b0229, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18662, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1782, status=ACTIVE, subnets=['7491351d-1b29-475f-bf07-432ebdb7e03c'], tags=[], tenant_id=7f80bd80798c4b65b9ca3457716b0229, updated_at=2025-10-14T10:06:02Z, vlan_transparent=None, network_id=a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, port_security_enabled=True, project_id=7f80bd80798c4b65b9ca3457716b0229, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1793, status=DOWN, tags=[], tenant_id=7f80bd80798c4b65b9ca3457716b0229, updated_at=2025-10-14T10:06:05Z on network a344373a-cb0d-4dbe-9d18-78d6ba8e73c4
Oct 14 10:06:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:06.303 287366 INFO neutron.agent.dhcp.agent [None req-dc4d3b66-7d4a-4895-8d54-bab7f1bcddd5 - - - - - -] DHCP configuration for ports {'5278faa8-b83e-42f0-bc57-c07260fd6db3'} is completed
Oct 14 10:06:06 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:06.355 2 INFO neutron.agent.securitygroups_rpc [None req-fe290446-2fc1-41a0-a4d1-87c681802754 bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/addn_hosts - 1 addresses
Oct 14 10:06:06 np0005486759.ooo.test podman[322318]: 2025-10-14 10:06:06.363767099 +0000 UTC m=+0.067422249 container kill 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/host
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/opts
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 1 addresses
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:06:06 np0005486759.ooo.test podman[322356]: 2025-10-14 10:06:06.58410275 +0000 UTC m=+0.059772634 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/addn_hosts - 0 addresses
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/host
Oct 14 10:06:06 np0005486759.ooo.test dnsmasq-dhcp[322300]: read /var/lib/neutron/dhcp/a344373a-cb0d-4dbe-9d18-78d6ba8e73c4/opts
Oct 14 10:06:06 np0005486759.ooo.test podman[322386]: 2025-10-14 10:06:06.680932195 +0000 UTC m=+0.059615201 container kill 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:06.718 287366 INFO neutron.agent.dhcp.agent [None req-f3d9fa4e-0d81-4e24-9cd2-26f2de08de00 - - - - - -] DHCP configuration for ports {'1390c940-da84-43bb-8269-e1ebce4f3a36'} is completed
Oct 14 10:06:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:06.744 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:07 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:07.105 2 INFO neutron.agent.securitygroups_rpc [None req-57fda8a0-04b2-418a-8305-58b621f15bab bc75d80895a64874b331df89e7ceb9bd 4a00be71437c46aa93ad65c658262d59 - - default default] Security group member updated ['84c95655-fba8-47b6-bbda-854b4ef04515']
Oct 14 10:06:07 np0005486759.ooo.test dnsmasq[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/addn_hosts - 0 addresses
Oct 14 10:06:07 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/host
Oct 14 10:06:07 np0005486759.ooo.test dnsmasq-dhcp[321250]: read /var/lib/neutron/dhcp/980f946d-2066-4c56-a997-9ce99dc83806/opts
Oct 14 10:06:07 np0005486759.ooo.test podman[322429]: 2025-10-14 10:06:07.372103299 +0000 UTC m=+0.064447767 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:06:07 np0005486759.ooo.test podman[322460]: 2025-10-14 10:06:07.502618071 +0000 UTC m=+0.058121715 container kill 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 10:06:07 np0005486759.ooo.test dnsmasq[322300]: exiting on receipt of SIGTERM
Oct 14 10:06:07 np0005486759.ooo.test systemd[1]: libpod-2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba.scope: Deactivated successfully.
Oct 14 10:06:07 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:07.521 2 INFO neutron.agent.securitygroups_rpc [None req-706c9965-8369-47d4-8328-c95fd04e19f7 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:07 np0005486759.ooo.test podman[322478]: 2025-10-14 10:06:07.570173341 +0000 UTC m=+0.058078193 container died 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:06:07 np0005486759.ooo.test podman[322478]: 2025-10-14 10:06:07.60093071 +0000 UTC m=+0.088835472 container cleanup 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:06:07 np0005486759.ooo.test systemd[1]: libpod-conmon-2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba.scope: Deactivated successfully.
Oct 14 10:06:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-71b1b1820e5094ddfa4b11ff05727acf01a3f040a89b631a1d0808b47f858cb6-merged.mount: Deactivated successfully.
Oct 14 10:06:07 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba-userdata-shm.mount: Deactivated successfully.
Oct 14 10:06:07 np0005486759.ooo.test podman[322485]: 2025-10-14 10:06:07.660682152 +0000 UTC m=+0.134523935 container remove 2e6a247763977c8ddfd85cde7e870aa2925ffed736d18e4e1c3f7d72959316ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:06:07 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:07Z|00227|binding|INFO|Releasing lport 5dc52f2b-a554-4cd5-a532-8689d984ef19 from this chassis (sb_readonly=0)
Oct 14 10:06:07 np0005486759.ooo.test kernel: device tap5dc52f2b-a5 left promiscuous mode
Oct 14 10:06:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:07.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:07 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:07Z|00228|binding|INFO|Setting lport 5dc52f2b-a554-4cd5-a532-8689d984ef19 down in Southbound
Oct 14 10:06:07 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:07.725 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a344373a-cb0d-4dbe-9d18-78d6ba8e73c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f80bd80798c4b65b9ca3457716b0229', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f1e61eac-28ba-4788-bacd-5bc624203f5c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=5dc52f2b-a554-4cd5-a532-8689d984ef19) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:07 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:07.727 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 5dc52f2b-a554-4cd5-a532-8689d984ef19 in datapath a344373a-cb0d-4dbe-9d18-78d6ba8e73c4 unbound from our chassis
Oct 14 10:06:07 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:07.729 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a344373a-cb0d-4dbe-9d18-78d6ba8e73c4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:06:07 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:07.730 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[c69c7e41-6e04-4cce-a524-62835eb55aa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:07.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test dnsmasq[320897]: exiting on receipt of SIGTERM
Oct 14 10:06:08 np0005486759.ooo.test podman[322525]: 2025-10-14 10:06:08.12376246 +0000 UTC m=+0.066441198 container kill 4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dd3b5582-1d48-49b6-b016-99de2f2b1a36, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: libpod-4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec.scope: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2da344373a\x2dcb0d\x2d4dbe\x2d9d18\x2d78d6ba8e73c4.mount: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:08.162 287366 INFO neutron.agent.dhcp.agent [None req-d0e79167-7a44-4b2e-8820-87fc2aaa5634 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:08.164 287366 INFO neutron.agent.dhcp.agent [None req-d0e79167-7a44-4b2e-8820-87fc2aaa5634 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:08 np0005486759.ooo.test podman[322539]: 2025-10-14 10:06:08.19917526 +0000 UTC m=+0.059522707 container died 4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dd3b5582-1d48-49b6-b016-99de2f2b1a36, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:08 np0005486759.ooo.test podman[322539]: 2025-10-14 10:06:08.232322092 +0000 UTC m=+0.092669489 container cleanup 4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dd3b5582-1d48-49b6-b016-99de2f2b1a36, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: libpod-conmon-4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec.scope: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test podman[322541]: 2025-10-14 10:06:08.283226805 +0000 UTC m=+0.135992781 container remove 4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dd3b5582-1d48-49b6-b016-99de2f2b1a36, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:08 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:08Z|00229|binding|INFO|Releasing lport 6bfcee1c-22b8-411a-9e77-f54280874243 from this chassis (sb_readonly=0)
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:08Z|00230|binding|INFO|Setting lport 6bfcee1c-22b8-411a-9e77-f54280874243 down in Southbound
Oct 14 10:06:08 np0005486759.ooo.test kernel: device tap6bfcee1c-22 left promiscuous mode
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.306 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-dd3b5582-1d48-49b6-b016-99de2f2b1a36', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd3b5582-1d48-49b6-b016-99de2f2b1a36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26de2128d7d44d46983310bb259305ba', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c591649e-af66-4427-bcca-cd29f880129a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=6bfcee1c-22b8-411a-9e77-f54280874243) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.308 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 6bfcee1c-22b8-411a-9e77-f54280874243 in datapath dd3b5582-1d48-49b6-b016-99de2f2b1a36 unbound from our chassis
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.310 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network dd3b5582-1d48-49b6-b016-99de2f2b1a36 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.311 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[bd307d0b-53c1-42ce-ba25-30638b78fa4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:08.520 287366 INFO neutron.agent.dhcp.agent [None req-1ff6b7c9-96a6-468a-9df9-e6c5822fbc59 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:08.523 287366 INFO neutron.agent.dhcp.agent [None req-1ff6b7c9-96a6-468a-9df9-e6c5822fbc59 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:08 np0005486759.ooo.test dnsmasq[321250]: exiting on receipt of SIGTERM
Oct 14 10:06:08 np0005486759.ooo.test podman[322587]: 2025-10-14 10:06:08.551342333 +0000 UTC m=+0.073251595 container kill 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: libpod-6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193.scope: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:08Z|00231|binding|INFO|Removing iface tap94ddb033-7b ovn-installed in OVS
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.578 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 596573dc-ef9b-46f1-94f2-0b30b7d99bef with type ""
Oct 14 10:06:08 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:08Z|00232|binding|INFO|Removing lport 94ddb033-7b68-4611-a364-93abba56085b ovn-installed in OVS
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.580 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-980f946d-2066-4c56-a997-9ce99dc83806', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-980f946d-2066-4c56-a997-9ce99dc83806', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4a00be71437c46aa93ad65c658262d59', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d28d608e-4f39-45dc-bf0e-59c674c60314, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=94ddb033-7b68-4611-a364-93abba56085b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.583 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 94ddb033-7b68-4611-a364-93abba56085b in datapath 980f946d-2066-4c56-a997-9ce99dc83806 unbound from our chassis
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.587 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 980f946d-2066-4c56-a997-9ce99dc83806, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:08 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:08.588 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[4423c020-8d3d-4642-94c3-31063cd06d0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:08 np0005486759.ooo.test podman[322601]: 2025-10-14 10:06:08.627002321 +0000 UTC m=+0.052699908 container died 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:08.651 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-ee08651271b4fc2776944ff502bc264202af6c754f4f786e349f67222249bd38-merged.mount: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193-userdata-shm.mount: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-9afa860c30fd5093a283c43ce2d04a186eef830d50acdbcea2c9cae6daac72b8-merged.mount: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f84413250008c45f6f0c26aa5b45934c1c4fb533fd40724e0b338249271a1ec-userdata-shm.mount: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2ddd3b5582\x2d1d48\x2d49b6\x2db016\x2d99de2f2b1a36.mount: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test podman[322601]: 2025-10-14 10:06:08.677939336 +0000 UTC m=+0.103636873 container remove 6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980f946d-2066-4c56-a997-9ce99dc83806, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: libpod-conmon-6584963c95bbafb938a35d532eaad49a9fe57c35b46e948577c1151ffafd6193.scope: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test kernel: device tap94ddb033-7b left promiscuous mode
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d980f946d\x2d2066\x2d4c56\x2da997\x2d9ce99dc83806.mount: Deactivated successfully.
Oct 14 10:06:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:08.742 287366 INFO neutron.agent.dhcp.agent [None req-3e41c7a5-04cd-4c1c-b267-a3f3cad774a6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:08.743 287366 INFO neutron.agent.dhcp.agent [None req-3e41c7a5-04cd-4c1c-b267-a3f3cad774a6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:08 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:06:08 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:08 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:08 np0005486759.ooo.test podman[322643]: 2025-10-14 10:06:08.820067641 +0000 UTC m=+0.052828043 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:06:08 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:08Z|00233|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:08 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:08.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:09 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:09.001 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:09 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:09.520 2 INFO neutron.agent.securitygroups_rpc [None req-88268be2-f086-453e-9fcc-312ab5a654c7 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:09 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:09.691 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:09Z, description=, device_id=e5e49013-5c53-4e9e-b661-dd3bd4ef71c4, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec768c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7688e0>], id=50dc027f-cdbb-4fbe-8e26-bfc8ccb95dc0, ip_allocation=immediate, mac_address=fa:16:3e:5e:89:3a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1799, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:06:09Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:06:09 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:09.833 2 INFO neutron.agent.securitygroups_rpc [None req-331288ba-92f0-4d75-8290-7375d5b3dd28 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:09 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:06:09 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:09 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:09 np0005486759.ooo.test podman[322682]: 2025-10-14 10:06:09.904020058 +0000 UTC m=+0.062924410 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:10.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:10 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:10.181 287366 INFO neutron.agent.dhcp.agent [None req-122706e2-79b1-4c96-bfa0-0bc97ad7e868 - - - - - -] DHCP configuration for ports {'50dc027f-cdbb-4fbe-8e26-bfc8ccb95dc0'} is completed
Oct 14 10:06:10 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:10.314 2 INFO neutron.agent.securitygroups_rpc [None req-2144e6e1-6a42-4a06-b40f-3d87e4dbefa7 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:10 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:10.790 2 INFO neutron.agent.securitygroups_rpc [None req-b150ab2a-473a-4508-b9ec-8f8981f07942 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:11 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:11.738 2 INFO neutron.agent.securitygroups_rpc [None req-d80d0f4c-409b-49e9-b927-e836a876ef1d fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:12 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:12.165 2 INFO neutron.agent.securitygroups_rpc [None req-17287f4b-b47a-482d-aad5-46cb036a1126 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:06:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:06:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:06:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:06:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:06:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16693 "" "Go-http-client/1.1"
Oct 14 10:06:12 np0005486759.ooo.test podman[322720]: 2025-10-14 10:06:12.369510221 +0000 UTC m=+0.096071772 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:12 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:06:12 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:12 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:12 np0005486759.ooo.test systemd[1]: tmp-crun.KUpXNb.mount: Deactivated successfully.
Oct 14 10:06:12 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:12.892 2 INFO neutron.agent.securitygroups_rpc [None req-b1b524b4-fec7-4960-b8ba-e24ddc66df22 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:13 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:13.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:06:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:06:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:06:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:06:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:06:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:06:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:06:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:15.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:15 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:15.747 2 INFO neutron.agent.securitygroups_rpc [None req-183048ea-52d7-424a-80a3-f3d4211af3b8 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:15 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:15.836 2 INFO neutron.agent.securitygroups_rpc [None req-0399a708-2a61-44ed-a93b-a068adeee61e 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['d5008811-1afd-43f8-9841-f9af842c9b22']
Oct 14 10:06:16 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:16.118 2 INFO neutron.agent.securitygroups_rpc [None req-02ce22e8-631b-4bec-bb56-41a64af242fc 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['d5008811-1afd-43f8-9841-f9af842c9b22']
Oct 14 10:06:16 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:16.768 2 INFO neutron.agent.securitygroups_rpc [None req-c41e7be0-76d9-4d2e-8515-ba8eac79300b fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:17.305 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:17.306 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:06:17 np0005486759.ooo.test podman[322743]: 2025-10-14 10:06:17.467254395 +0000 UTC m=+0.087407128 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:17 np0005486759.ooo.test podman[322743]: 2025-10-14 10:06:17.47299167 +0000 UTC m=+0.093144403 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:06:17 np0005486759.ooo.test podman[322741]: 2025-10-14 10:06:17.442453208 +0000 UTC m=+0.072748000 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:06:17 np0005486759.ooo.test podman[322742]: 2025-10-14 10:06:17.508991238 +0000 UTC m=+0.131309837 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 14 10:06:17 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:17.510 2 INFO neutron.agent.securitygroups_rpc [None req-e4010c66-12be-4aec-8599-81acfff442e6 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:17 np0005486759.ooo.test podman[322742]: 2025-10-14 10:06:17.520265882 +0000 UTC m=+0.142584491 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:06:17 np0005486759.ooo.test podman[322741]: 2025-10-14 10:06:17.575692203 +0000 UTC m=+0.205987035 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:06:17 np0005486759.ooo.test podman[322790]: 2025-10-14 10:06:17.578278602 +0000 UTC m=+0.111240315 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:17 np0005486759.ooo.test podman[322790]: 2025-10-14 10:06:17.65790129 +0000 UTC m=+0.190862973 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true)
Oct 14 10:06:17 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:06:17 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:17.826 2 INFO neutron.agent.securitygroups_rpc [None req-7f490dec-bb66-40a6-b708-bfee5bb655bd 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:18.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:18.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:18.243 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:18.285 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Triggering sync for uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 14 10:06:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:18.286 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "4408214d-dae5-4452-92e9-eb4abd6589d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:06:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:18.286 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:06:18 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:18.315 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "4408214d-dae5-4452-92e9-eb4abd6589d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:06:18 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:18.418 2 INFO neutron.agent.securitygroups_rpc [None req-ccbea7ad-c90b-4e41-b496-c2ba9f195249 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:18 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:18.487 2 INFO neutron.agent.securitygroups_rpc [None req-5eb2e37b-fa52-4968-9d75-60e0b3d9c999 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:18 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:18.728 2 INFO neutron.agent.securitygroups_rpc [None req-2a5dcfe0-b15f-4b88-9a84-c40b918e2a32 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:19 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:19.162 2 INFO neutron.agent.securitygroups_rpc [None req-301a74d3-bbcb-4e94-8379-0529ca1edce2 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.218 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.219 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.219 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.220 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.306 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.383 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.385 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:06:19 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:19.412 2 INFO neutron.agent.securitygroups_rpc [None req-06081f2e-6b3d-4c8c-ba83-18207a58df2e 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.439 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.440 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:06:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42436 DF PROTO=TCP SPT=56768 DPT=9102 SEQ=3082571195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA61D680000000001030307) 
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.516 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.517 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.566 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:06:19 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:19.598 2 INFO neutron.agent.securitygroups_rpc [None req-239832dd-3bf3-499c-ae8e-9ddbf0852c69 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.786 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.788 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12278MB free_disk=386.6773567199707GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.788 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.788 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.865 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.866 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.866 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.930 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.945 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.947 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:06:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:19.947 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:06:20 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:20.114 2 INFO neutron.agent.securitygroups_rpc [None req-6394c57a-a61f-4693-a58b-2777d6c5f935 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:20.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42437 DF PROTO=TCP SPT=56768 DPT=9102 SEQ=3082571195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA621820000000001030307) 
Oct 14 10:06:20 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:20.713 2 INFO neutron.agent.securitygroups_rpc [None req-99b09343-82cf-47c1-ba86-b89a7a1c739d 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:20.948 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:21 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:21.174 2 INFO neutron.agent.securitygroups_rpc [None req-3241e7a8-2a0c-4eda-8506-2c96a1321111 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:21 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:21.775 2 INFO neutron.agent.securitygroups_rpc [None req-025f4af4-0811-4604-bc14-1cfc4687d3a8 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8137d955-144c-480b-9cc2-5402016e0cd3']
Oct 14 10:06:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:22.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:22.189 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:06:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42438 DF PROTO=TCP SPT=56768 DPT=9102 SEQ=3082571195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA629810000000001030307) 
Oct 14 10:06:22 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:22.929 2 INFO neutron.agent.securitygroups_rpc [None req-34759cc9-d63e-4db6-888c-26ca53de45fb fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:23.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:23.191 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:06:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:23.191 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:06:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:24.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:24.049 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:06:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:24.050 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:06:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:24.050 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:06:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:24.051 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:06:24 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:24.138 2 INFO neutron.agent.securitygroups_rpc [None req-db2a71ad-3ff8-48dd-bb4a-0a9bb6c057de 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['98d2dbec-6094-4f3c-9616-cfef7e86b577']
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.457 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9d4002b-ed1a-4051-91b5-d0d0d539b8b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.454072', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '70f522ba-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': '9aad1471fe7b6b1256ffb1a8cdafe7cf42cc95ed497edb4e3ef183df431855dd'}]}, 'timestamp': '2025-10-14 10:06:24.457887', '_unique_id': '7f373cd69b434d2582c299564853ab54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.459 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.460 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.488 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 67767064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.488 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 492064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4f76c5f-054f-485a-96b9-5105bad6ea7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67767064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.460900', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '70f9dc6a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '0c8e435f46bb19b9cd05080d19592826b3b44a0f40869e0f0de0af3cdf2df9aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.460900', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '70f9efa2-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '6d3298f124de6f3173f8318fdc9b83f39aec53d148e8ee5eaef7bd5b7b351897'}]}, 'timestamp': '2025-10-14 10:06:24.489270', '_unique_id': '21b221f59044439dacee250d2c399767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.490 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.491 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fffc974-2716-491b-bbff-97eeb816b986', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.491702', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '70fa6086-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': 'd6237b28714f915594254fc025e7c788c3ea3e40a4a65d93f64556f46b8ec074'}]}, 'timestamp': '2025-10-14 10:06:24.492213', '_unique_id': '60dcc8aeb69e4012b7be9a8c247c8aa0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.493 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.494 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '230dad4d-4e37-4d1f-a098-436f2b8cbd3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.494803', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '70fadcdc-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': '1a386376a819e4a50ae04b233b3c03e46797738d66d8e344851a5c3cc9abdf16'}]}, 'timestamp': '2025-10-14 10:06:24.495365', '_unique_id': '20dc9d1c024743cc92b0864f5c7f57ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.496 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.511 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.512 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c16799b-3e8e-4b09-b172-b328d47825e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.497546', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '70fd6f7e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.692638516, 'message_signature': 'e866c52c2a001abb4e4e7bd3b8e1056061f48f54f8d714bcb9ad46e389f89abc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.497546', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '70fd80c2-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.692638516, 'message_signature': '5c7f759989534585358a995f295ea7c3a6a45d1ec8dbb226d9df49222531145e'}]}, 'timestamp': '2025-10-14 10:06:24.512636', '_unique_id': 'de834713dc094977b73648bca66333bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.513 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.514 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.515 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adff0f9c-a1c4-438b-b1c2-27e497885d82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.514982', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '70fdedf0-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': '9cfd74a45306c08e3170257bbbdb28261a9e0cff9c592211fab7c6a536233788'}]}, 'timestamp': '2025-10-14 10:06:24.515513', '_unique_id': '3fdf7496e2f24cb1ab442f56e930e41e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.516 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.517 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.517 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b4d648a-0469-4835-927a-510e4365451a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.517710', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '70fe5844-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': '2c049314473ea4cb289d0f14df495cbd1e25de8957044c8a0f6827ec290065ca'}]}, 'timestamp': '2025-10-14 10:06:24.518226', '_unique_id': 'ba688e821abd4ca3aed5c706792f13c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.520 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.520 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.520 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7507207e-1b58-423d-b945-2643fc346e22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.520445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '70fec2fc-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.692638516, 'message_signature': '894d1c3cb1e30bd3587647362d157c0e842f5fa920b7881a8e770848b6e21c39'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.520445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '70fed51c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.692638516, 'message_signature': '432ea6220f4a1b5f11ee441265919579237c10e36a0987410412223d884647be'}]}, 'timestamp': '2025-10-14 10:06:24.521352', '_unique_id': '2d02ea9f8a9f4006849606d101995934'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.522 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.523 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.523 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.524 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '665f90c2-2c4e-44c5-8e67-35a25f9c4d1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.523820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '70ff481c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '0c248a1be1a566796341a8164c6319cd01fef92df3c66730c8766518cac72bc5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.523820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '70ff58ca-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '8ee5e2769315548601d55db8b0df7e7eb96ba24d3c9fe6c8a8441e7c62fce02f'}]}, 'timestamp': '2025-10-14 10:06:24.524722', '_unique_id': '69e80c12c5a04369b6c41dc2d403cda9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.525 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.526 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 8721 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29181fa8-82c5-432c-b816-32248d724486', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8721, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.527135', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '70ffe1e6-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': '3bf87067b95d411ae87fe2a3033ffb3d0b6f2bdc7f32eeb2947d30e7db3e4279'}]}, 'timestamp': '2025-10-14 10:06:24.528296', '_unique_id': '04513427a65d4ff4bde780611f65d355'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.530 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c0bc6fb-ea94-479d-86c7-e4c9e12ad625', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.530648', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '710051d0-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': '7661ef33b761615d10593c1506eeaabf1112ab5017ea11fcf8c4c5d00dfb0372'}]}, 'timestamp': '2025-10-14 10:06:24.531164', '_unique_id': '9af71a1914db4aaaa45d77d4903de1e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.533 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3054244-377b-462b-8f3c-5ff18dcda610', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.533376', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7100bc24-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': 'dc9ced048c2e70027b51303232bd6da7fc2162a258c483795b6d417a497130e6'}]}, 'timestamp': '2025-10-14 10:06:24.533850', '_unique_id': '05726b1483df441988d993f666f59704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.535 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.536 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.536 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '749d3650-3ead-4b34-8761-95c9f4ebf73d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.536113', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71012718-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '326311ad002e9e2a5f0e6c40e927dfea9deb226171ea2fd308a879e5dcdaf5f6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.536113', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7101379e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '3b7e11fe8824bf3cf967aa49bc95032ea973341be3a0929ef00f0c041160cb25'}]}, 'timestamp': '2025-10-14 10:06:24.537012', '_unique_id': '80e34bb88baf4d18807157df7e86feca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.537 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.539 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31326208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.539 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a94b986-0e67-4913-b924-d15d65663ebe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31326208, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.539234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7101a26a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.692638516, 'message_signature': '99e269f3c4d3e67d09b0ec3facbbf86879720c6acd9cc32439ed55d26ed3950f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 
'timestamp': '2025-10-14T10:06:24.539234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7101ae9a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.692638516, 'message_signature': '8a8c5dcbbe5ccf34c60675d3f0eaae673f9d40d451890d40e3f57f1ffc5dac40'}]}, 'timestamp': '2025-10-14 10:06:24.539947', '_unique_id': 'f242166582a14f3990f0c1afa868e5e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.541 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.541 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '777af410-45f4-46e0-9f0c-f113b88ab9f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.541685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7101fd78-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': 'd16a18c61eb8b95b5ecba3a2e8997624d34d5aa77c01c9d2c3934dbe1075b0e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.541685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '710208d6-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '0867da489bf29b10e4156cdf39b4286dd614f00ac05ecfe16bb3d07e288a96d7'}]}, 'timestamp': '2025-10-14 10:06:24.542256', '_unique_id': 'b2e2b168a51c4f66992467dd2adfa2c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.543 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.543 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 739626512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.543 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 60612298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe43dec4-af61-49c0-9e57-dbdd31229e44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739626512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.543644', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '710249ae-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': '2b1b5da4a81d49078ff8c3ed3c51aaf36e607f281519491e63f663e5b831801c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60612298, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.543644', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '710254da-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': 'a0d780525308d7765f1f065d0f80084ede5235a6971f0502d1f9e21b315e6d69'}]}, 'timestamp': '2025-10-14 10:06:24.544202', '_unique_id': 'a0ada37e0411477bac8e05eb08fc9aae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.545 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.545 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.545 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21bb2ab0-d61d-4749-8128-280a16da7a8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:06:24.545632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7102976a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': 'fffc58f7751717e4747e042c4c4abec2056226343475e7fa64442066754378a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:06:24.545632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7102a2be-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.656014519, 'message_signature': 'a0d6f8b8933ccee77a9059719f278047dc818ae46a9d57e1d25ccd0a5e25b95f'}]}, 'timestamp': '2025-10-14 10:06:24.546196', '_unique_id': 'f03f3ab2175e4dc7afb1efbaa0327ea8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.547 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.562 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 13140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e92f0585-5d51-4396-a11d-f45609fa6b4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13140000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:06:24.547579', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7105306a-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.757617888, 'message_signature': 'f9c94b182bde132bafd7883cca9213f0e348aff4b12d490bad0f2ea6a07d84ac'}]}, 'timestamp': '2025-10-14 10:06:24.562947', '_unique_id': '422b76ff2ad644349e47b4e6e9213929'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.563 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.564 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.564 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4c41dee-640b-4ecd-a287-028411b919e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.564438', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '710575fc-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': '198a36eb3e5f5972e321eede091147006a394d9f94ca51b4ae481917a08a7a53'}]}, 'timestamp': '2025-10-14 10:06:24.564728', '_unique_id': '322b970141794a158cab0c8f8618f2d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d1b2db9-bf7b-49e8-b394-44bce1d22051', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:06:24.566066', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '7105b580-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.649160009, 'message_signature': 'bc1d5b5a6b44dcaf91b5f130d33bfcb60ff99131afd70fe10b2f2033535db419'}]}, 'timestamp': '2025-10-14 10:06:24.566354', '_unique_id': '5f6b59179efc4931b8f8668481a8bd75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.566 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.567 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.567 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 51.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd05d5190-f2d5-4346-b5e2-34537c0bbf4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.8203125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:06:24.567666', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7105f3ec-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12404.757617888, 'message_signature': '6791f027c4b5d252f5603847560f9ae98c78b83e589941f82216534cfcc66b95'}]}, 'timestamp': '2025-10-14 10:06:24.567943', '_unique_id': 'aa6b553e98ff488a9a15c0462e91d3a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:06:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:06:24.568 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:06:24 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:24.672 2 INFO neutron.agent.securitygroups_rpc [None req-588c2505-9045-4928-92b9-72c394fc508e fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:25.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:06:25 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:06:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:25.401 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:06:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:25.425 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:06:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:25.425 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:06:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:25.426 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:06:25 np0005486759.ooo.test podman[322833]: 2025-10-14 10:06:25.457304199 +0000 UTC m=+0.083632233 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:06:25 np0005486759.ooo.test podman[322833]: 2025-10-14 10:06:25.50260174 +0000 UTC m=+0.128929774 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Oct 14 10:06:25 np0005486759.ooo.test systemd[1]: tmp-crun.BrGRZ1.mount: Deactivated successfully.
Oct 14 10:06:25 np0005486759.ooo.test podman[322834]: 2025-10-14 10:06:25.511429801 +0000 UTC m=+0.137284829 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350)
Oct 14 10:06:25 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:06:25 np0005486759.ooo.test podman[322834]: 2025-10-14 10:06:25.554516655 +0000 UTC m=+0.180371673 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Oct 14 10:06:25 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:06:26 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:26.149 2 INFO neutron.agent.securitygroups_rpc [None req-39682e85-0827-423f-9729-41406de5313d fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:26 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:26.470 2 INFO neutron.agent.securitygroups_rpc [None req-528ec089-a50e-4b8d-a066-9d4ef691f002 211859164e31414eada81a29d3456b68 0090dd78428a41f1af5efb7163c92dad - - default default] Security group member updated ['4148a097-20c8-4213-aae7-42cb023d1444']
Oct 14 10:06:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42439 DF PROTO=TCP SPT=56768 DPT=9102 SEQ=3082571195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA639410000000001030307) 
Oct 14 10:06:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:27.048 2 INFO neutron.agent.securitygroups_rpc [None req-e8f27e3a-38dc-4b12-9fcb-949997d013ee 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8a6513ae-1542-40ec-ae31-0c828ae32ba5']
Oct 14 10:06:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:27.314 2 INFO neutron.agent.securitygroups_rpc [None req-c6b47bb0-7331-44d5-be77-f94efc2bf7f1 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:27.460 2 INFO neutron.agent.securitygroups_rpc [None req-95719bdb-ea1f-43ed-84f7-2c29200fd7d2 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8a6513ae-1542-40ec-ae31-0c828ae32ba5']
Oct 14 10:06:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:27.729 2 INFO neutron.agent.securitygroups_rpc [None req-a04d7337-e343-4ea8-aaab-7142ae1947d7 211859164e31414eada81a29d3456b68 0090dd78428a41f1af5efb7163c92dad - - default default] Security group member updated ['4148a097-20c8-4213-aae7-42cb023d1444']
Oct 14 10:06:28 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:28.408 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 2001:db8::f816:3eff:fe04:74c2'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:28 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:28.411 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:06:28 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:28.415 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:28 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:28.416 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c3f338-e024-4084-8104-b1e4a4e36937]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:29.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:30.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:30 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:30.705 2 INFO neutron.agent.securitygroups_rpc [None req-b439c60d-fd24-41ad-8e46-605fc24d5881 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:31 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:31.692 2 INFO neutron.agent.securitygroups_rpc [None req-0f07315e-49e7-4b86-97bc-85057f13f5b5 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['a9a0b3a9-7dd4-4d18-82ee-7e7a746aaac8']
Oct 14 10:06:33 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:33.083 2 INFO neutron.agent.securitygroups_rpc [None req-01c53ab3-b7e9-45a2-8982-a035c6ac3e4a e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:33 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:33.560 2 INFO neutron.agent.securitygroups_rpc [None req-0a6ddb82-3bff-4973-a76f-b831df095bf0 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['a9a0b3a9-7dd4-4d18-82ee-7e7a746aaac8']
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:34.373 287366 INFO neutron.agent.linux.ip_lib [None req-a0067b0e-3b84-4b18-9345-ffbd69c07778 - - - - - -] Device tapdbf78059-3d cannot be used as it has no MAC address
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test kernel: device tapdbf78059-3d entered promiscuous mode
Oct 14 10:06:34 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:34Z|00234|binding|INFO|Claiming lport dbf78059-3dc1-4517-a0cd-5132fccfc0db for this chassis.
Oct 14 10:06:34 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:34Z|00235|binding|INFO|dbf78059-3dc1-4517-a0cd-5132fccfc0db: Claiming unknown
Oct 14 10:06:34 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436394.4029] manager: (tapdbf78059-3d): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test systemd-udevd[322884]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:34Z|00236|binding|INFO|Setting lport dbf78059-3dc1-4517-a0cd-5132fccfc0db ovn-installed in OVS
Oct 14 10:06:34 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:34Z|00237|binding|INFO|Setting lport dbf78059-3dc1-4517-a0cd-5132fccfc0db up in Southbound
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:34.416 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-796f0bd6-8bbf-4716-8fc6-e2c483ebd936', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-796f0bd6-8bbf-4716-8fc6-e2c483ebd936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f80bd80798c4b65b9ca3457716b0229', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a955b76-1ea0-47d5-a535-efbc2063c865, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=dbf78059-3dc1-4517-a0cd-5132fccfc0db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:34 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:34.418 183328 INFO neutron.agent.ovn.metadata.agent [-] Port dbf78059-3dc1-4517-a0cd-5132fccfc0db in datapath 796f0bd6-8bbf-4716-8fc6-e2c483ebd936 bound to our chassis
Oct 14 10:06:34 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:34.421 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 796f0bd6-8bbf-4716-8fc6-e2c483ebd936 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:06:34 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:34.422 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[e514874d-6650-422f-8ffc-209cb802280f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:34.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:34 np0005486759.ooo.test podman[322890]: 2025-10-14 10:06:34.515468056 +0000 UTC m=+0.061284228 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 14 10:06:34 np0005486759.ooo.test podman[322890]: 2025-10-14 10:06:34.525200714 +0000 UTC m=+0.071016906 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:06:34 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:06:35 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:35.088 2 INFO neutron.agent.securitygroups_rpc [None req-5e2fa0ee-0a84-4759-9ac3-efe681b3869f fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:35 np0005486759.ooo.test podman[322958]: 
Oct 14 10:06:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:35.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:35 np0005486759.ooo.test podman[322958]: 2025-10-14 10:06:35.22982749 +0000 UTC m=+0.084211941 container create fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 10:06:35 np0005486759.ooo.test systemd[1]: Started libpod-conmon-fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f.scope.
Oct 14 10:06:35 np0005486759.ooo.test podman[322958]: 2025-10-14 10:06:35.180170338 +0000 UTC m=+0.034554849 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:06:35 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:06:35 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c20e6809d4e42fc7c5a5b5976cb9be12474f85e3132042ccdb3e15dec2985c4c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:06:35 np0005486759.ooo.test podman[322958]: 2025-10-14 10:06:35.30133657 +0000 UTC m=+0.155721011 container init fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:06:35 np0005486759.ooo.test podman[322958]: 2025-10-14 10:06:35.310525141 +0000 UTC m=+0.164909582 container start fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq[322976]: started, version 2.85 cachesize 150
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq[322976]: DNS service limited to local subnets
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq[322976]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq[322976]: warning: no upstream servers configured
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq-dhcp[322976]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/addn_hosts - 0 addresses
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/host
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/opts
Oct 14 10:06:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:35.370 287366 INFO neutron.agent.dhcp.agent [None req-a0067b0e-3b84-4b18-9345-ffbd69c07778 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:34Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec62b550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec618550>], id=b287e5ca-f9a3-4c48-921f-8e2e12603287, ip_allocation=immediate, mac_address=fa:16:3e:28:f6:78, name=tempest-PortsIpV6TestJSON-79657903, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:06:31Z, description=, dns_domain=, id=796f0bd6-8bbf-4716-8fc6-e2c483ebd936, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1312227957, port_security_enabled=True, project_id=7f80bd80798c4b65b9ca3457716b0229, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35839, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1903, status=ACTIVE, subnets=['933010b6-8c3a-45d1-b7bd-3672bee366b1'], tags=[], tenant_id=7f80bd80798c4b65b9ca3457716b0229, updated_at=2025-10-14T10:06:33Z, vlan_transparent=None, network_id=796f0bd6-8bbf-4716-8fc6-e2c483ebd936, port_security_enabled=True, project_id=7f80bd80798c4b65b9ca3457716b0229, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['201bb6d0-74bd-4bb0-a611-31f2c81fb03d'], standard_attr_id=1912, status=DOWN, tags=[], tenant_id=7f80bd80798c4b65b9ca3457716b0229, updated_at=2025-10-14T10:06:34Z on network 796f0bd6-8bbf-4716-8fc6-e2c483ebd936
Oct 14 10:06:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:35.481 287366 INFO neutron.agent.dhcp.agent [None req-3654705f-75fb-47d3-a4d5-1bc5afe82144 - - - - - -] DHCP configuration for ports {'33d46e5d-40ef-4a66-949a-96e57d62fdba'} is completed
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/addn_hosts - 1 addresses
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/host
Oct 14 10:06:35 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/opts
Oct 14 10:06:35 np0005486759.ooo.test podman[322995]: 2025-10-14 10:06:35.550560724 +0000 UTC m=+0.065208408 container kill fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:06:35 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:35.557 2 INFO neutron.agent.securitygroups_rpc [None req-2d9ebed7-0cb7-4a46-922c-778aeee36ca4 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8617bccf-18bd-4c69-b406-32e2821f3600']
Oct 14 10:06:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:35.830 287366 INFO neutron.agent.dhcp.agent [None req-cfaeedf3-30ef-4832-8a9e-da73bf79b502 - - - - - -] DHCP configuration for ports {'b287e5ca-f9a3-4c48-921f-8e2e12603287'} is completed
Oct 14 10:06:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:06:36 np0005486759.ooo.test podman[323017]: 2025-10-14 10:06:36.453272057 +0000 UTC m=+0.078075471 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:06:36 np0005486759.ooo.test podman[323017]: 2025-10-14 10:06:36.487366723 +0000 UTC m=+0.112170197 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 10:06:36 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:06:36 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:36.588 2 INFO neutron.agent.securitygroups_rpc [None req-10f38270-f52a-4ad1-89b7-75e440db30b6 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8617bccf-18bd-4c69-b406-32e2821f3600']
Oct 14 10:06:37 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:37.305 2 INFO neutron.agent.securitygroups_rpc [None req-56d4bb39-00dd-45f7-9abe-976bab643206 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8617bccf-18bd-4c69-b406-32e2821f3600']
Oct 14 10:06:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:37.433 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:37.434 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:06:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:37.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:37.560 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 2001:db8::f816:3eff:fe04:74c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:37.563 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:06:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:37.566 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:37.567 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[846a8de6-6e64-4c3a-bdd8-76d71e28b3ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:38 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:38.356 2 INFO neutron.agent.securitygroups_rpc [None req-ac81ae5a-eced-4e74-91e0-8bca4a42c051 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8617bccf-18bd-4c69-b406-32e2821f3600']
Oct 14 10:06:38 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:38.656 2 INFO neutron.agent.securitygroups_rpc [None req-058b96f8-a0d2-429b-bbaf-65af9664628a 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8617bccf-18bd-4c69-b406-32e2821f3600']
Oct 14 10:06:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:39.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:39 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:39.705 2 INFO neutron.agent.securitygroups_rpc [None req-b938d810-1069-40e4-a6c2-a88d77702d8d 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['8617bccf-18bd-4c69-b406-32e2821f3600']
Oct 14 10:06:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:39.787 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:34Z, description=, device_id=d0a6af9d-11e7-4414-b947-80813f9c5013, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec698640>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec698400>], id=b287e5ca-f9a3-4c48-921f-8e2e12603287, ip_allocation=immediate, mac_address=fa:16:3e:28:f6:78, name=tempest-PortsIpV6TestJSON-79657903, network_id=796f0bd6-8bbf-4716-8fc6-e2c483ebd936, port_security_enabled=True, project_id=7f80bd80798c4b65b9ca3457716b0229, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['201bb6d0-74bd-4bb0-a611-31f2c81fb03d'], standard_attr_id=1912, status=DOWN, tags=[], tenant_id=7f80bd80798c4b65b9ca3457716b0229, updated_at=2025-10-14T10:06:36Z on network 796f0bd6-8bbf-4716-8fc6-e2c483ebd936
Oct 14 10:06:39 np0005486759.ooo.test dnsmasq[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/addn_hosts - 1 addresses
Oct 14 10:06:39 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/host
Oct 14 10:06:39 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/opts
Oct 14 10:06:39 np0005486759.ooo.test podman[323058]: 2025-10-14 10:06:39.927092262 +0000 UTC m=+0.037800488 container kill fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:06:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:40.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:40 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:40.226 287366 INFO neutron.agent.dhcp.agent [None req-96c66516-6335-43b8-90c5-ae175146096f - - - - - -] DHCP configuration for ports {'b287e5ca-f9a3-4c48-921f-8e2e12603287'} is completed
Oct 14 10:06:40 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:40.925 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 2001:db8::f816:3eff:fe04:74c2'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:40 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:40.927 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:06:40 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:40.930 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:40 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:40.932 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[5ccb8b13-1a73-43e2-b68a-ca483885e7b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:41 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:41.437 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:06:41 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:41.522 2 INFO neutron.agent.securitygroups_rpc [None req-0b5ac04b-50cb-494f-910c-78492cb7924a 734a9182ff24445488cbc10c942d394d 6a962a74429a4a5cab459f3e55af2d9d - - default default] Security group rule updated ['1d6b2ef0-c52a-401d-88b5-2f7ba9ac788f']
Oct 14 10:06:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:41.558 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:41Z, description=, device_id=20b057a8-3953-4aba-801e-2d7d0536aae6, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec624c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec624f70>], id=4851e494-f07d-415a-bc90-0dabd871454e, ip_allocation=immediate, mac_address=fa:16:3e:6b:c5:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1929, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:06:41Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:06:41 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:06:41 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:41 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:41 np0005486759.ooo.test podman[323095]: 2025-10-14 10:06:41.773018149 +0000 UTC m=+0.055684787 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:41.973 287366 INFO neutron.agent.dhcp.agent [None req-914a08f1-4c49-47be-8990-f5d104f3bcca - - - - - -] DHCP configuration for ports {'4851e494-f07d-415a-bc90-0dabd871454e'} is completed
Oct 14 10:06:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:42.049 287366 INFO neutron.agent.linux.ip_lib [None req-71c0470b-69a6-427b-8b59-1eddd884a6b3 - - - - - -] Device tap6b4f10d6-ca cannot be used as it has no MAC address
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test kernel: device tap6b4f10d6-ca entered promiscuous mode
Oct 14 10:06:42 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436402.0838] manager: (tap6b4f10d6-ca): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Oct 14 10:06:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:42Z|00238|binding|INFO|Claiming lport 6b4f10d6-ca0d-4141-8116-23360feb3d7a for this chassis.
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:42Z|00239|binding|INFO|6b4f10d6-ca0d-4141-8116-23360feb3d7a: Claiming unknown
Oct 14 10:06:42 np0005486759.ooo.test systemd-udevd[323125]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.098 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '710bf5711484449682775e44dbb1ee9d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19d413a5-8f35-45fb-9e09-a83f24d08850, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=6b4f10d6-ca0d-4141-8116-23360feb3d7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.101 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 6b4f10d6-ca0d-4141-8116-23360feb3d7a in datapath e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8 bound to our chassis
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.104 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port 11478412-0559-4724-a6b4-b5e5fa17b66f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.106 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.108 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[436cf710-588e-4f91-ac70-e6ab4e6f9d39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:42Z|00240|binding|INFO|Setting lport 6b4f10d6-ca0d-4141-8116-23360feb3d7a ovn-installed in OVS
Oct 14 10:06:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:42Z|00241|binding|INFO|Setting lport 6b4f10d6-ca0d-4141-8116-23360feb3d7a up in Southbound
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:42.186 2 INFO neutron.agent.securitygroups_rpc [None req-a9af4e13-0e0e-425d-8b7d-d101ede69f7b fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:06:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:06:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:06:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 132490 "" "Go-http-client/1.1"
Oct 14 10:06:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:06:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17171 "" "Go-http-client/1.1"
Oct 14 10:06:42 np0005486759.ooo.test dnsmasq[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/addn_hosts - 0 addresses
Oct 14 10:06:42 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/host
Oct 14 10:06:42 np0005486759.ooo.test podman[323160]: 2025-10-14 10:06:42.492117488 +0000 UTC m=+0.063268040 container kill fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 10:06:42 np0005486759.ooo.test dnsmasq-dhcp[322976]: read /var/lib/neutron/dhcp/796f0bd6-8bbf-4716-8fc6-e2c483ebd936/opts
Oct 14 10:06:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:42Z|00242|binding|INFO|Releasing lport dbf78059-3dc1-4517-a0cd-5132fccfc0db from this chassis (sb_readonly=0)
Oct 14 10:06:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:42Z|00243|binding|INFO|Setting lport dbf78059-3dc1-4517-a0cd-5132fccfc0db down in Southbound
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test kernel: device tapdbf78059-3d left promiscuous mode
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.741 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-796f0bd6-8bbf-4716-8fc6-e2c483ebd936', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-796f0bd6-8bbf-4716-8fc6-e2c483ebd936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f80bd80798c4b65b9ca3457716b0229', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a955b76-1ea0-47d5-a535-efbc2063c865, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=dbf78059-3dc1-4517-a0cd-5132fccfc0db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.743 183328 INFO neutron.agent.ovn.metadata.agent [-] Port dbf78059-3dc1-4517-a0cd-5132fccfc0db in datapath 796f0bd6-8bbf-4716-8fc6-e2c483ebd936 unbound from our chassis
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.745 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 796f0bd6-8bbf-4716-8fc6-e2c483ebd936 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:06:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:42.746 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[4d8d5835-046c-4c9d-94fc-3a9108977e5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:42.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:42 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:42.972 2 INFO neutron.agent.securitygroups_rpc [None req-e72ff2c0-01b6-4d53-97c5-6768b3f207fc e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:43 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:43.006 2 INFO neutron.agent.securitygroups_rpc [None req-a0602186-e5a7-42fd-b435-fbfe06c45ed4 480400984d38484dbe29bb9c2def7428 710bf5711484449682775e44dbb1ee9d - - default default] Security group member updated ['9769d304-4be1-458d-b9d3-56f64b754316']
Oct 14 10:06:43 np0005486759.ooo.test podman[323219]: 2025-10-14 10:06:43.189894943 +0000 UTC m=+0.094105093 container create fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:06:43 np0005486759.ooo.test systemd[1]: Started libpod-conmon-fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9.scope.
Oct 14 10:06:43 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:06:43 np0005486759.ooo.test podman[323219]: 2025-10-14 10:06:43.147139064 +0000 UTC m=+0.051349244 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:06:43 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3824ee141cc8d6815a003395c22a6eea5bb77daea1511a4f59faf8488137bfb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:06:43 np0005486759.ooo.test podman[323219]: 2025-10-14 10:06:43.262061714 +0000 UTC m=+0.166271864 container init fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:06:43 np0005486759.ooo.test systemd[1]: tmp-crun.SNOoSA.mount: Deactivated successfully.
Oct 14 10:06:43 np0005486759.ooo.test podman[323219]: 2025-10-14 10:06:43.277405344 +0000 UTC m=+0.181615504 container start fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq[323237]: started, version 2.85 cachesize 150
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq[323237]: DNS service limited to local subnets
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq[323237]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq[323237]: warning: no upstream servers configured
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq-dhcp[323237]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/addn_hosts - 0 addresses
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/host
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/opts
Oct 14 10:06:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:43.604 287366 INFO neutron.agent.dhcp.agent [None req-9d575395-ddb9-4477-91fa-c0e1ee68184a - - - - - -] DHCP configuration for ports {'6e215cae-6b10-4f51-932a-2be08d29b97c'} is completed
Oct 14 10:06:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:43.725 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:42Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5fe1f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5fe970>], id=cda54e30-3942-4d38-90dc-cf7befc65bbf, ip_allocation=immediate, mac_address=fa:16:3e:66:03:63, name=tempest-RoutersTest-650192705, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:06:39Z, description=, dns_domain=, id=e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-465683405, port_security_enabled=True, project_id=710bf5711484449682775e44dbb1ee9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44131, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1920, status=ACTIVE, subnets=['78565c5d-ea94-4e9b-b34d-b1b28e4a50f2'], tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:06:40Z, vlan_transparent=None, network_id=e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, port_security_enabled=True, project_id=710bf5711484449682775e44dbb1ee9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9769d304-4be1-458d-b9d3-56f64b754316'], standard_attr_id=1934, status=DOWN, tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:06:42Z on network e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/addn_hosts - 1 addresses
Oct 14 10:06:43 np0005486759.ooo.test podman[323255]: 2025-10-14 10:06:43.935173294 +0000 UTC m=+0.055656347 container kill fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/host
Oct 14 10:06:43 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/opts
Oct 14 10:06:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:06:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:06:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:06:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:06:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:06:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:06:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:44.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:44 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:44.243 287366 INFO neutron.agent.dhcp.agent [None req-f3e043a9-97ec-4fd5-a647-c587580fa826 - - - - - -] DHCP configuration for ports {'cda54e30-3942-4d38-90dc-cf7befc65bbf'} is completed
Oct 14 10:06:45 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:45.134 2 INFO neutron.agent.securitygroups_rpc [None req-768be07d-1b36-45c2-8c45-a016e28668e6 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:45 np0005486759.ooo.test dnsmasq[322976]: exiting on receipt of SIGTERM
Oct 14 10:06:45 np0005486759.ooo.test podman[323291]: 2025-10-14 10:06:45.149875494 +0000 UTC m=+0.065346203 container kill fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:06:45 np0005486759.ooo.test systemd[1]: libpod-fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f.scope: Deactivated successfully.
Oct 14 10:06:45 np0005486759.ooo.test podman[323305]: 2025-10-14 10:06:45.23007565 +0000 UTC m=+0.061625778 container died fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:06:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:45.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:45 np0005486759.ooo.test systemd[1]: tmp-crun.bp8vLE.mount: Deactivated successfully.
Oct 14 10:06:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f-userdata-shm.mount: Deactivated successfully.
Oct 14 10:06:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-c20e6809d4e42fc7c5a5b5976cb9be12474f85e3132042ccdb3e15dec2985c4c-merged.mount: Deactivated successfully.
Oct 14 10:06:45 np0005486759.ooo.test podman[323305]: 2025-10-14 10:06:45.28067344 +0000 UTC m=+0.112223518 container cleanup fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:06:45 np0005486759.ooo.test systemd[1]: libpod-conmon-fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f.scope: Deactivated successfully.
Oct 14 10:06:45 np0005486759.ooo.test podman[323307]: 2025-10-14 10:06:45.31655941 +0000 UTC m=+0.144231189 container remove fdb99be9ffb35ff0197598f6c2a636d6dec486dd61d48235f3daf6da1bb1ba1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-796f0bd6-8bbf-4716-8fc6-e2c483ebd936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:45.706 287366 INFO neutron.agent.dhcp.agent [None req-ec3efb28-4d0c-4108-9b66-d8016b0c5a23 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:45.707 287366 INFO neutron.agent.dhcp.agent [None req-ec3efb28-4d0c-4108-9b66-d8016b0c5a23 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:46.131 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:46 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d796f0bd6\x2d8bbf\x2d4716\x2d8fc6\x2de2c483ebd936.mount: Deactivated successfully.
Oct 14 10:06:46 np0005486759.ooo.test podman[323350]: 2025-10-14 10:06:46.675433837 +0000 UTC m=+0.050507728 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:06:46 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:06:46 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:46 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:46 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:46Z|00244|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:06:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:46.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:06:48 np0005486759.ooo.test podman[323370]: 2025-10-14 10:06:48.443128938 +0000 UTC m=+0.067420307 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: tmp-crun.KWOZc1.mount: Deactivated successfully.
Oct 14 10:06:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:48.498 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:42Z, description=, device_id=3abdf7cc-fa8b-4cdf-878b-0ddc69270170, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c4eb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7c49d0>], id=cda54e30-3942-4d38-90dc-cf7befc65bbf, ip_allocation=immediate, mac_address=fa:16:3e:66:03:63, name=tempest-RoutersTest-650192705, network_id=e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, port_security_enabled=True, project_id=710bf5711484449682775e44dbb1ee9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['9769d304-4be1-458d-b9d3-56f64b754316'], standard_attr_id=1934, status=DOWN, tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:06:45Z on network e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8
Oct 14 10:06:48 np0005486759.ooo.test podman[323378]: 2025-10-14 10:06:48.507093577 +0000 UTC m=+0.113491177 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:06:48 np0005486759.ooo.test podman[323377]: 2025-10-14 10:06:48.462153941 +0000 UTC m=+0.075477804 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:06:48 np0005486759.ooo.test podman[323370]: 2025-10-14 10:06:48.5293884 +0000 UTC m=+0.153679749 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:06:48 np0005486759.ooo.test podman[323378]: 2025-10-14 10:06:48.541695068 +0000 UTC m=+0.148092598 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:06:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:48.545 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:47Z, description=, device_id=425e2f09-a177-4566-9dfb-d04118f6f3e8, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ecffb640>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ecffb0a0>], id=fdbb1fe7-d943-4dbf-a5d9-f4346a743952, ip_allocation=immediate, mac_address=fa:16:3e:76:ac:3f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1938, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:06:48Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:06:48 np0005486759.ooo.test podman[323377]: 2025-10-14 10:06:48.591918725 +0000 UTC m=+0.205242588 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:06:48 np0005486759.ooo.test podman[323371]: 2025-10-14 10:06:48.689078932 +0000 UTC m=+0.304989024 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 10:06:48 np0005486759.ooo.test podman[323371]: 2025-10-14 10:06:48.702533664 +0000 UTC m=+0.318443786 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:48 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:06:48 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:06:48 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:48 np0005486759.ooo.test podman[323478]: 2025-10-14 10:06:48.796028198 +0000 UTC m=+0.059897915 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:06:48 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:48 np0005486759.ooo.test dnsmasq[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/addn_hosts - 1 addresses
Oct 14 10:06:48 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/host
Oct 14 10:06:48 np0005486759.ooo.test podman[323492]: 2025-10-14 10:06:48.870938233 +0000 UTC m=+0.064873968 container kill fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:48 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/opts
Oct 14 10:06:49 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:49.039 287366 INFO neutron.agent.dhcp.agent [None req-6621fe09-2604-474e-ad36-fc8d5902a6c0 - - - - - -] DHCP configuration for ports {'fdbb1fe7-d943-4dbf-a5d9-f4346a743952'} is completed
Oct 14 10:06:49 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:49.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:49 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:49.311 287366 INFO neutron.agent.dhcp.agent [None req-f4e5bae1-c16a-4a03-9f25-25387b795e1e - - - - - -] DHCP configuration for ports {'cda54e30-3942-4d38-90dc-cf7befc65bbf'} is completed
Oct 14 10:06:49 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:49.472 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:49 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:49.474 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:06:49 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:49.478 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:49 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:49.479 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d5dd1dec-6c89-4b9e-af66-97aaa53da365]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65010 DF PROTO=TCP SPT=50080 DPT=9102 SEQ=960714738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA692980000000001030307) 
Oct 14 10:06:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:50.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:50 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:50.305 2 INFO neutron.agent.securitygroups_rpc [None req-09062f01-f142-49dd-87fd-a27b15f1ce7c 480400984d38484dbe29bb9c2def7428 710bf5711484449682775e44dbb1ee9d - - default default] Security group member updated ['9769d304-4be1-458d-b9d3-56f64b754316']
Oct 14 10:06:50 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:50.313 2 INFO neutron.agent.securitygroups_rpc [None req-002bafe9-87c1-40ee-9b56-287c31664a9a fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['e480e181-aceb-44b5-b4a3-fbacc157cbb1']
Oct 14 10:06:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65011 DF PROTO=TCP SPT=50080 DPT=9102 SEQ=960714738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA696810000000001030307) 
Oct 14 10:06:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:50.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:50 np0005486759.ooo.test dnsmasq[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/addn_hosts - 0 addresses
Oct 14 10:06:50 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/host
Oct 14 10:06:50 np0005486759.ooo.test dnsmasq-dhcp[323237]: read /var/lib/neutron/dhcp/e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8/opts
Oct 14 10:06:50 np0005486759.ooo.test podman[323539]: 2025-10-14 10:06:50.57647471 +0000 UTC m=+0.060849976 container kill fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:06:50 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:50Z|00245|binding|INFO|Releasing lport 6b4f10d6-ca0d-4141-8116-23360feb3d7a from this chassis (sb_readonly=0)
Oct 14 10:06:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:50.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:50 np0005486759.ooo.test kernel: device tap6b4f10d6-ca left promiscuous mode
Oct 14 10:06:50 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:50Z|00246|binding|INFO|Setting lport 6b4f10d6-ca0d-4141-8116-23360feb3d7a down in Southbound
Oct 14 10:06:50 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:50.753 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '710bf5711484449682775e44dbb1ee9d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19d413a5-8f35-45fb-9e09-a83f24d08850, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=6b4f10d6-ca0d-4141-8116-23360feb3d7a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:50 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:50.756 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 6b4f10d6-ca0d-4141-8116-23360feb3d7a in datapath e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8 unbound from our chassis
Oct 14 10:06:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:50.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:50 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:50.760 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:50.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:50 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:50.761 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a99da1a8-3660-4d01-8fad-1ac9aca4dd4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:51 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:51.263 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:51Z, description=, device_id=ee1b047f-5e09-442f-a097-905d91b2ab88, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7218b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5d67c0>], id=af12d28a-abd5-4e25-82f0-fe03317166fd, ip_allocation=immediate, mac_address=fa:16:3e:5c:22:90, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1951, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:06:51Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:06:51 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:06:51 np0005486759.ooo.test podman[323578]: 2025-10-14 10:06:51.442949122 +0000 UTC m=+0.048895958 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 10:06:51 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:51 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:51 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:51.661 287366 INFO neutron.agent.dhcp.agent [None req-e21507c2-be1c-4e8e-a51c-ae695d5c117a - - - - - -] DHCP configuration for ports {'af12d28a-abd5-4e25-82f0-fe03317166fd'} is completed
Oct 14 10:06:51 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:51.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:52 np0005486759.ooo.test dnsmasq[323237]: exiting on receipt of SIGTERM
Oct 14 10:06:52 np0005486759.ooo.test podman[323615]: 2025-10-14 10:06:52.392276134 +0000 UTC m=+0.058908536 container kill fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 14 10:06:52 np0005486759.ooo.test systemd[1]: libpod-fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9.scope: Deactivated successfully.
Oct 14 10:06:52 np0005486759.ooo.test podman[323630]: 2025-10-14 10:06:52.442133201 +0000 UTC m=+0.037615523 container died fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9-userdata-shm.mount: Deactivated successfully.
Oct 14 10:06:52 np0005486759.ooo.test podman[323630]: 2025-10-14 10:06:52.529525958 +0000 UTC m=+0.125008230 container cleanup fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:52 np0005486759.ooo.test systemd[1]: libpod-conmon-fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9.scope: Deactivated successfully.
Oct 14 10:06:52 np0005486759.ooo.test podman[323631]: 2025-10-14 10:06:52.551135081 +0000 UTC m=+0.141553218 container remove fb4708fd38a5702badc870fb694b0651f2e36267759ad95703c886e48d5e3dc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e619eb27-6c3d-4dbc-bd46-b9d5b83b04f8, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:06:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65012 DF PROTO=TCP SPT=50080 DPT=9102 SEQ=960714738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA69E810000000001030307) 
Oct 14 10:06:52 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:52.873 287366 INFO neutron.agent.dhcp.agent [None req-8dafe671-966f-4b01-bbb1-5c81708fa355 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:52 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:52.962 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:53 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:53.037 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:53 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:53.039 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:06:53 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:53.043 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:53 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:53.044 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d03da276-199f-4ca1-903a-d52433542755]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:53 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-f3824ee141cc8d6815a003395c22a6eea5bb77daea1511a4f59faf8488137bfb-merged.mount: Deactivated successfully.
Oct 14 10:06:53 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2de619eb27\x2d6c3d\x2d4dbc\x2dbd46\x2db9d5b83b04f8.mount: Deactivated successfully.
Oct 14 10:06:53 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:53.546 2 INFO neutron.agent.securitygroups_rpc [None req-3f897996-5504-4389-b26a-4333502f9906 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['e480e181-aceb-44b5-b4a3-fbacc157cbb1', 'a1f1b917-35e8-4bd6-a1eb-3732a9e3b0dd']
Oct 14 10:06:53 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:53.610 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:06:53 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:06:53 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:53 np0005486759.ooo.test podman[323674]: 2025-10-14 10:06:53.847023017 +0000 UTC m=+0.056586104 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:06:53 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:53 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:53Z|00247|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:06:53 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:53.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:54.172 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:06:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:54.173 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:06:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:54.174 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:06:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:54.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:54.336 2 INFO neutron.agent.securitygroups_rpc [None req-73d2e92a-3b61-421b-8bb6-b89165f8f8aa fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['a1f1b917-35e8-4bd6-a1eb-3732a9e3b0dd']
Oct 14 10:06:54 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:54.493 2 INFO neutron.agent.securitygroups_rpc [None req-35b506ed-24dd-4932-bb46-8fb5f66e5322 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:55 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:06:55 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:55 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:55 np0005486759.ooo.test podman[323709]: 2025-10-14 10:06:55.143128802 +0000 UTC m=+0.062228607 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:55 np0005486759.ooo.test systemd[1]: tmp-crun.8uDAuS.mount: Deactivated successfully.
Oct 14 10:06:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:55.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:55 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:06:55Z|00248|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:06:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:55.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:06:55 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:55.700 2 INFO neutron.agent.securitygroups_rpc [None req-1afb8856-c6d4-4a7b-af63-50d65c9e7094 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:06:56 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:56.008 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:06:55Z, description=, device_id=ab14f8f4-de19-4d9c-aae5-b1098c72edb7, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec653d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6535e0>], id=9440c7d5-870a-4613-b553-512f248cd9fb, ip_allocation=immediate, mac_address=fa:16:3e:f8:71:37, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1959, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:06:55Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:06:56 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:06:56 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:56 np0005486759.ooo.test podman[323746]: 2025-10-14 10:06:56.200296416 +0000 UTC m=+0.040098689 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:06:56 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:06:56 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:06:56 np0005486759.ooo.test systemd[1]: tmp-crun.jFPAfh.mount: Deactivated successfully.
Oct 14 10:06:56 np0005486759.ooo.test podman[323759]: 2025-10-14 10:06:56.28955613 +0000 UTC m=+0.066980513 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:06:56 np0005486759.ooo.test podman[323760]: 2025-10-14 10:06:56.339022416 +0000 UTC m=+0.112776606 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Oct 14 10:06:56 np0005486759.ooo.test podman[323760]: 2025-10-14 10:06:56.355310765 +0000 UTC m=+0.129064965 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Oct 14 10:06:56 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:06:56 np0005486759.ooo.test podman[323759]: 2025-10-14 10:06:56.406244925 +0000 UTC m=+0.183669228 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009)
Oct 14 10:06:56 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:06:56 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:06:56.480 287366 INFO neutron.agent.dhcp.agent [None req-a774179d-b395-4d0c-a6f0-417c3a7c3f95 - - - - - -] DHCP configuration for ports {'9440c7d5-870a-4613-b553-512f248cd9fb'} is completed
Oct 14 10:06:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65013 DF PROTO=TCP SPT=50080 DPT=9102 SEQ=960714738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA6AE410000000001030307) 
Oct 14 10:06:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:58.436 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:06:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:58.438 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:06:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:58.441 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:06:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:06:58.441 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[96798120-437d-4e3b-932c-480745286075]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:06:58 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:06:58.589 2 INFO neutron.agent.securitygroups_rpc [None req-24d697f3-68bf-4828-9637-b1c51f6751e0 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['0c0b38f6-04ac-4a8e-8456-627602cf6e66']
Oct 14 10:06:59 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:06:59 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:06:59 np0005486759.ooo.test podman[323824]: 2025-10-14 10:06:59.188644429 +0000 UTC m=+0.059334208 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:06:59 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:06:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:06:59.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:00 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:00.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:01 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:01.183 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:00Z, description=, device_id=752357aa-4c60-4f2f-9d02-04777e34ddb2, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec81a1c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec81ad90>], id=2310a59a-4c31-4c30-afde-7666bad21357, ip_allocation=immediate, mac_address=fa:16:3e:50:ad:6f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1979, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:07:00Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:07:01 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:07:01 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:01 np0005486759.ooo.test podman[323862]: 2025-10-14 10:07:01.456051138 +0000 UTC m=+0.044187315 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:07:01 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:01 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:01.781 287366 INFO neutron.agent.dhcp.agent [None req-9b41ce38-d174-4b39-b5b6-8b837e9aac74 - - - - - -] DHCP configuration for ports {'2310a59a-4c31-4c30-afde-7666bad21357'} is completed
Oct 14 10:07:01 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:01.786 2 INFO neutron.agent.securitygroups_rpc [None req-6ad1228b-a289-4ec7-a917-4fe8db24269c fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['5e7c8ff6-435f-4d44-aafd-bdc3cca336b0', 'fc0dc8f7-d045-49f2-bd7c-9fab6a62bfb5', '0c0b38f6-04ac-4a8e-8456-627602cf6e66']
Oct 14 10:07:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:01.870 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:01.871 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:07:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:01.873 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:07:01 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:01.874 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[863184ec-7878-4071-9db7-3accfe34a753]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:02 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:02.947 2 INFO neutron.agent.securitygroups_rpc [None req-44dcd68f-7e2b-4047-b56b-63e3e1f4f7b8 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['5e7c8ff6-435f-4d44-aafd-bdc3cca336b0', 'fc0dc8f7-d045-49f2-bd7c-9fab6a62bfb5']
Oct 14 10:07:03 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:03.953 2 INFO neutron.agent.securitygroups_rpc [None req-338da9b0-c7d0-4810-9c97-b585bc8587d8 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:04.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:04 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:04.922 2 INFO neutron.agent.securitygroups_rpc [None req-d2ef5c12-f882-4a14-b6df-69de5c6d9c81 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:05.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:05 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:07:05 np0005486759.ooo.test podman[323887]: 2025-10-14 10:07:05.496065017 +0000 UTC m=+0.088171102 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:07:05 np0005486759.ooo.test podman[323887]: 2025-10-14 10:07:05.503207416 +0000 UTC m=+0.095313511 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 14 10:07:05 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:07:05 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:07:05 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:05 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:05 np0005486759.ooo.test podman[323920]: 2025-10-14 10:07:05.651667613 +0000 UTC m=+0.056007756 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:06.837 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:06Z, description=, device_id=7d402485-8e6c-44f0-9092-f339d7bb183c, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec653460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7a8910>], id=fd79fee1-65f5-467e-8903-de73b08ec7b7, ip_allocation=immediate, mac_address=fa:16:3e:a2:6a:e9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1983, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:07:06Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:07:07 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:07:07 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:07 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:07 np0005486759.ooo.test podman[323956]: 2025-10-14 10:07:07.004820935 +0000 UTC m=+0.031174316 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0)
Oct 14 10:07:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:07:07 np0005486759.ooo.test podman[323971]: 2025-10-14 10:07:07.068463095 +0000 UTC m=+0.042841883 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 10:07:07 np0005486759.ooo.test podman[323971]: 2025-10-14 10:07:07.0771233 +0000 UTC m=+0.051502078 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 10:07:07 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:07:07 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:07.113 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:06Z, description=, device_id=4146daaf-1b8c-438d-bcc1-84b6cf031197, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec618eb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec618bb0>], id=c8fceec8-7a35-4a4f-be37-c22fa3de0713, ip_allocation=immediate, mac_address=fa:16:3e:20:45:e5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1984, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:07:06Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:07:07 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:07.168 287366 INFO neutron.agent.dhcp.agent [None req-7514741a-bd70-41cc-8f23-35d9c5197cf0 - - - - - -] DHCP configuration for ports {'fd79fee1-65f5-467e-8903-de73b08ec7b7'} is completed
Oct 14 10:07:07 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:07.254 2 INFO neutron.agent.securitygroups_rpc [None req-b979e336-75cb-4164-8760-7675cea7db11 fff35d3b4f2d4c2a958a3120126257a8 7f80bd80798c4b65b9ca3457716b0229 - - default default] Security group member updated ['201bb6d0-74bd-4bb0-a611-31f2c81fb03d']
Oct 14 10:07:07 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:07:07 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:07 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:07 np0005486759.ooo.test podman[324017]: 2025-10-14 10:07:07.323857608 +0000 UTC m=+0.063037201 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:07:07 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:07.530 287366 INFO neutron.agent.dhcp.agent [None req-9f40d28c-4fb7-48b9-ad51-68f3bd19901a - - - - - -] DHCP configuration for ports {'c8fceec8-7a35-4a4f-be37-c22fa3de0713'} is completed
Oct 14 10:07:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:09.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:10.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:10 np0005486759.ooo.test sshd[324038]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:07:10 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:10.705 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 2001:db8::f816:3eff:fe04:74c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 10.100.0.2 2001:db8::f816:3eff:fe04:74c2'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:10 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:10.709 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:07:10 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:10.711 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:07:10 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:10.713 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[7f595e8b-5c6d-49cc-b815-cb5cdceb951d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:07:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:07:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:07:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:07:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:07:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16700 "" "Go-http-client/1.1"
Oct 14 10:07:12 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:12.487 2 INFO neutron.agent.securitygroups_rpc [None req-ee283b61-8794-4393-a78f-0a9552be2454 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:12 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:07:12 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:12 np0005486759.ooo.test podman[324056]: 2025-10-14 10:07:12.710757477 +0000 UTC m=+0.050536409 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:12 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:13 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:13.715 2 INFO neutron.agent.securitygroups_rpc [None req-f4165a81-523d-4301-828c-e7f2d40001af e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:07:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:07:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:07:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:07:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:07:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:07:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:07:14 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:07:14 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:14 np0005486759.ooo.test podman[324093]: 2025-10-14 10:07:14.148692936 +0000 UTC m=+0.056454580 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:14 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:14.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:14.729 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:14Z, description=, device_id=39d03fb4-52cd-4dcc-8f53-91b5bdbcb479, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec60c1f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec60c430>], id=a24be4c7-1632-49bd-8ffb-8f1e179b4dae, ip_allocation=immediate, mac_address=fa:16:3e:90:d9:72, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1992, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:07:14Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:07:14 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:07:14 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:14 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:14 np0005486759.ooo.test podman[324130]: 2025-10-14 10:07:14.918728625 +0000 UTC m=+0.046926279 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:07:14 np0005486759.ooo.test sshd[324038]: Connection reset by 198.235.24.119 port 63332 [preauth]
Oct 14 10:07:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:15.130 287366 INFO neutron.agent.dhcp.agent [None req-8818bb3a-edd3-482f-a225-cb76c039a929 - - - - - -] DHCP configuration for ports {'a24be4c7-1632-49bd-8ffb-8f1e179b4dae'} is completed
Oct 14 10:07:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:15.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:16 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:16.114 2 INFO neutron.agent.securitygroups_rpc [None req-5bfcff17-a9fe-48c3-ac77-c8252a212687 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:16 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:16.998 2 INFO neutron.agent.securitygroups_rpc [None req-f8bb4abe-9bb6-4672-a606-35c9c08b368b e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:17 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:07:17 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:17 np0005486759.ooo.test podman[324168]: 2025-10-14 10:07:17.334001023 +0000 UTC m=+0.059639928 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:07:17 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:18 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:18.666 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:17Z, description=, device_id=42678713-f22c-48d1-acf7-4e8f90ba300e, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5d7880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5d7ac0>], id=3fda9533-dbb3-4177-9d0c-524d135fb784, ip_allocation=immediate, mac_address=fa:16:3e:fb:e6:34, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2008, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:07:17Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:07:18 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:07:18 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:18 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:18 np0005486759.ooo.test podman[324205]: 2025-10-14 10:07:18.869982165 +0000 UTC m=+0.057109841 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:07:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:07:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:07:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:07:18 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:07:18 np0005486759.ooo.test podman[324223]: 2025-10-14 10:07:18.98993665 +0000 UTC m=+0.084632044 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:07:19 np0005486759.ooo.test podman[324223]: 2025-10-14 10:07:19.002282448 +0000 UTC m=+0.096977842 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:07:19 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:07:19 np0005486759.ooo.test podman[324222]: 2025-10-14 10:07:19.057301784 +0000 UTC m=+0.151930516 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:07:19 np0005486759.ooo.test podman[324222]: 2025-10-14 10:07:19.090329545 +0000 UTC m=+0.184958247 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2)
Oct 14 10:07:19 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:07:19 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:19.106 287366 INFO neutron.agent.dhcp.agent [None req-fb59a5d0-2d3a-49dc-bcb9-e6fe36ae8907 - - - - - -] DHCP configuration for ports {'3fda9533-dbb3-4177-9d0c-524d135fb784'} is completed
Oct 14 10:07:19 np0005486759.ooo.test podman[324221]: 2025-10-14 10:07:19.042760078 +0000 UTC m=+0.140504595 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:07:19 np0005486759.ooo.test podman[324221]: 2025-10-14 10:07:19.171485581 +0000 UTC m=+0.269230058 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:07:19 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:07:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:19.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:19.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:19.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:19 np0005486759.ooo.test podman[324224]: 2025-10-14 10:07:19.094232655 +0000 UTC m=+0.182574844 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 10:07:19 np0005486759.ooo.test podman[324224]: 2025-10-14 10:07:19.227399274 +0000 UTC m=+0.315741473 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:07:19 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:07:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:19.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37325 DF PROTO=TCP SPT=57818 DPT=9102 SEQ=64755847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA707C70000000001030307) 
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.209 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.209 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.210 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.210 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.300 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.373 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.375 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.429 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.431 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.473 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.474 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37326 DF PROTO=TCP SPT=57818 DPT=9102 SEQ=64755847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA70BC10000000001030307) 
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.546 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.712 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.714 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12268MB free_disk=386.67737579345703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.714 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.714 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:07:20 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:20.715 2 INFO neutron.agent.securitygroups_rpc [None req-094ab394-1cca-4acf-99cf-c825edd3980a e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.823 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.823 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.824 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.888 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.914 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.916 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:07:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:20.917 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:07:21 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:21.289 2 INFO neutron.agent.securitygroups_rpc [None req-58a2d5d7-1505-43d5-b1c0-2f14c704ffc4 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:21 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:07:21 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:21 np0005486759.ooo.test podman[324335]: 2025-10-14 10:07:21.490850771 +0000 UTC m=+0.055130609 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:07:21 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37327 DF PROTO=TCP SPT=57818 DPT=9102 SEQ=64755847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA713C10000000001030307) 
Oct 14 10:07:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:23.913 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:23.949 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:23.950 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:07:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:23.950 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.013 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.014 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.014 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.015 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:07:24 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:24.059 2 INFO neutron.agent.securitygroups_rpc [None req-353fddb4-e791-4233-b846-3e77e0fe7238 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.696 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.719 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.720 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.721 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.722 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:07:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:24.723 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:07:24 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:24.782 2 INFO neutron.agent.securitygroups_rpc [None req-5f1cfb27-a874-4733-8111-c67815229c5c e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:25.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:26 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:26.435 2 INFO neutron.agent.securitygroups_rpc [None req-fc92577b-7393-4362-965e-c4c3a7224b31 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37328 DF PROTO=TCP SPT=57818 DPT=9102 SEQ=64755847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA723810000000001030307) 
Oct 14 10:07:26 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:26.842 2 INFO neutron.agent.securitygroups_rpc [None req-24002352-1d91-4ff8-992a-63696e177a45 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:07:27 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:07:27 np0005486759.ooo.test podman[324356]: 2025-10-14 10:07:27.437468367 +0000 UTC m=+0.064302481 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal)
Oct 14 10:07:27 np0005486759.ooo.test podman[324356]: 2025-10-14 10:07:27.450105113 +0000 UTC m=+0.076939257 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, vcs-type=git)
Oct 14 10:07:27 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:07:27 np0005486759.ooo.test podman[324355]: 2025-10-14 10:07:27.501324173 +0000 UTC m=+0.128167917 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251009, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Oct 14 10:07:27 np0005486759.ooo.test podman[324355]: 2025-10-14 10:07:27.561844926 +0000 UTC m=+0.188688610 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:27 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:07:28 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:28.812 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:28.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:28 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:28.815 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:07:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:29.143 287366 INFO neutron.agent.linux.ip_lib [None req-6f235ec2-1de5-4a9e-b7a1-f4f10e1f924e - - - - - -] Device tapd9d8ab7a-57 cannot be used as it has no MAC address
Oct 14 10:07:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:29.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:29 np0005486759.ooo.test kernel: device tapd9d8ab7a-57 entered promiscuous mode
Oct 14 10:07:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:29Z|00249|binding|INFO|Claiming lport d9d8ab7a-5731-437c-ac73-0b31e88134b9 for this chassis.
Oct 14 10:07:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:29Z|00250|binding|INFO|d9d8ab7a-5731-437c-ac73-0b31e88134b9: Claiming unknown
Oct 14 10:07:29 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436449.1747] manager: (tapd9d8ab7a-57): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Oct 14 10:07:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:29.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:29 np0005486759.ooo.test systemd-udevd[324411]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:07:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:29.181 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-c0dad70c-e24d-4510-839c-2316ed4bb5b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0dad70c-e24d-4510-839c-2316ed4bb5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '710bf5711484449682775e44dbb1ee9d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3dbf6691-01d3-4d01-b1a4-a9b810617859, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=d9d8ab7a-5731-437c-ac73-0b31e88134b9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:29.183 183328 INFO neutron.agent.ovn.metadata.agent [-] Port d9d8ab7a-5731-437c-ac73-0b31e88134b9 in datapath c0dad70c-e24d-4510-839c-2316ed4bb5b1 bound to our chassis
Oct 14 10:07:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:29.185 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0dad70c-e24d-4510-839c-2316ed4bb5b1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:07:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:29.186 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[7a29f30c-ae37-4a87-96f6-92aa17dc3e3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:29Z|00251|binding|INFO|Setting lport d9d8ab7a-5731-437c-ac73-0b31e88134b9 ovn-installed in OVS
Oct 14 10:07:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:29Z|00252|binding|INFO|Setting lport d9d8ab7a-5731-437c-ac73-0b31e88134b9 up in Southbound
Oct 14 10:07:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:29.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:29.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapd9d8ab7a-57: No such device
Oct 14 10:07:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:29.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:29.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:29.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:30 np0005486759.ooo.test podman[324482]: 
Oct 14 10:07:30 np0005486759.ooo.test podman[324482]: 2025-10-14 10:07:30.112146982 +0000 UTC m=+0.087611856 container create e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:07:30 np0005486759.ooo.test systemd[1]: Started libpod-conmon-e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840.scope.
Oct 14 10:07:30 np0005486759.ooo.test podman[324482]: 2025-10-14 10:07:30.068221486 +0000 UTC m=+0.043686380 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:07:30 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:07:30 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/121ac324a543daa0c9d65579c3ea18cb526e200b666b3c4c2f37d8401b135f14/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:07:30 np0005486759.ooo.test podman[324482]: 2025-10-14 10:07:30.185917871 +0000 UTC m=+0.161382755 container init e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0)
Oct 14 10:07:30 np0005486759.ooo.test podman[324482]: 2025-10-14 10:07:30.194506454 +0000 UTC m=+0.169971358 container start e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq[324500]: started, version 2.85 cachesize 150
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq[324500]: DNS service limited to local subnets
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq[324500]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq[324500]: warning: no upstream servers configured
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq-dhcp[324500]: DHCP, static leases only on 10.101.0.0, lease time 1d
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/addn_hosts - 0 addresses
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/host
Oct 14 10:07:30 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/opts
Oct 14 10:07:30 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:30.296 287366 INFO neutron.agent.dhcp.agent [None req-9a7a01ed-c8bf-4364-83d3-084eb67a6126 - - - - - -] DHCP configuration for ports {'2937c270-40ee-46b1-b4d4-4269acf6b8a6'} is completed
Oct 14 10:07:30 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:30.410 2 INFO neutron.agent.securitygroups_rpc [None req-b7c850eb-4703-492b-bd9b-7919fe1dd49a e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:30.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:31 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:31.124 2 INFO neutron.agent.securitygroups_rpc [None req-ede035d5-8887-4850-83fb-8d729b6116c4 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:31.701 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:31Z, description=, device_id=9adcf0e9-5a69-4f9e-9f60-62e9f4081030, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ecfea880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec61a4c0>], id=0463beeb-3c9e-4a86-996e-e8aacef5efc3, ip_allocation=immediate, mac_address=fa:16:3e:29:19:d2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:07:27Z, description=, dns_domain=, id=c0dad70c-e24d-4510-839c-2316ed4bb5b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1671029793, port_security_enabled=True, project_id=710bf5711484449682775e44dbb1ee9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16570, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2030, status=ACTIVE, subnets=['ff81dcc9-28cb-4984-820f-32cca70c12e1'], tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:28Z, vlan_transparent=None, network_id=c0dad70c-e24d-4510-839c-2316ed4bb5b1, port_security_enabled=False, project_id=710bf5711484449682775e44dbb1ee9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2050, status=DOWN, tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:31Z on network c0dad70c-e24d-4510-839c-2316ed4bb5b1
Oct 14 10:07:31 np0005486759.ooo.test dnsmasq[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/addn_hosts - 1 addresses
Oct 14 10:07:31 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/host
Oct 14 10:07:31 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/opts
Oct 14 10:07:31 np0005486759.ooo.test podman[324516]: 2025-10-14 10:07:31.912047448 +0000 UTC m=+0.056239083 container kill e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 10:07:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:32.161 287366 INFO neutron.agent.dhcp.agent [None req-af3ed79e-cdd8-446b-92ff-06ae288a010d - - - - - -] DHCP configuration for ports {'0463beeb-3c9e-4a86-996e-e8aacef5efc3'} is completed
Oct 14 10:07:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:32.551 287366 INFO neutron.agent.linux.ip_lib [None req-e152efbb-a9f2-44a3-971f-de3ca81d8017 - - - - - -] Device tap5cc90713-bb cannot be used as it has no MAC address
Oct 14 10:07:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:32.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:32 np0005486759.ooo.test kernel: device tap5cc90713-bb entered promiscuous mode
Oct 14 10:07:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:32Z|00253|binding|INFO|Claiming lport 5cc90713-bbb8-49a2-aaa0-b34382492925 for this chassis.
Oct 14 10:07:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:32Z|00254|binding|INFO|5cc90713-bbb8-49a2-aaa0-b34382492925: Claiming unknown
Oct 14 10:07:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:32.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:32 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436452.5794] manager: (tap5cc90713-bb): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Oct 14 10:07:32 np0005486759.ooo.test systemd-udevd[324547]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:07:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:32.591 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-8f12702f-3688-496d-8efe-d1c8c773af06', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f12702f-3688-496d-8efe-d1c8c773af06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd795491f5c1d4cb0bdc37f8eea30c3f8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ef3c7bc-8a64-447f-b6c8-4b356ce46062, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=5cc90713-bbb8-49a2-aaa0-b34382492925) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:32.593 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 5cc90713-bbb8-49a2-aaa0-b34382492925 in datapath 8f12702f-3688-496d-8efe-d1c8c773af06 bound to our chassis
Oct 14 10:07:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:32.595 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f12702f-3688-496d-8efe-d1c8c773af06 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:07:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:32.596 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[38c04da0-6003-494a-b393-05c625781809]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:32.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:32Z|00255|binding|INFO|Setting lport 5cc90713-bbb8-49a2-aaa0-b34382492925 ovn-installed in OVS
Oct 14 10:07:32 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:32Z|00256|binding|INFO|Setting lport 5cc90713-bbb8-49a2-aaa0-b34382492925 up in Southbound
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:32.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5cc90713-bb: No such device
Oct 14 10:07:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:32.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:32.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:32 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:32.817 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:07:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:33.341 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:31Z, description=, device_id=9adcf0e9-5a69-4f9e-9f60-62e9f4081030, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec594820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec594100>], id=0463beeb-3c9e-4a86-996e-e8aacef5efc3, ip_allocation=immediate, mac_address=fa:16:3e:29:19:d2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:07:27Z, description=, dns_domain=, id=c0dad70c-e24d-4510-839c-2316ed4bb5b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1671029793, port_security_enabled=True, project_id=710bf5711484449682775e44dbb1ee9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16570, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2030, status=ACTIVE, subnets=['ff81dcc9-28cb-4984-820f-32cca70c12e1'], tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:28Z, vlan_transparent=None, network_id=c0dad70c-e24d-4510-839c-2316ed4bb5b1, port_security_enabled=False, project_id=710bf5711484449682775e44dbb1ee9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2050, status=DOWN, tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:31Z on network c0dad70c-e24d-4510-839c-2316ed4bb5b1
Oct 14 10:07:33 np0005486759.ooo.test podman[324618]: 
Oct 14 10:07:33 np0005486759.ooo.test podman[324618]: 2025-10-14 10:07:33.519075358 +0000 UTC m=+0.144245251 container create 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:33 np0005486759.ooo.test podman[324618]: 2025-10-14 10:07:33.424057127 +0000 UTC m=+0.049227050 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:07:33 np0005486759.ooo.test systemd[1]: Started libpod-conmon-48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c.scope.
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/addn_hosts - 1 addresses
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/host
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/opts
Oct 14 10:07:33 np0005486759.ooo.test podman[324648]: 2025-10-14 10:07:33.56485921 +0000 UTC m=+0.070387397 container kill e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:07:33 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:07:33 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92633e4c96bd4a29a5a76734fcf701d6853232ae15965c69bf7c20fb7c693c5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:07:33 np0005486759.ooo.test podman[324618]: 2025-10-14 10:07:33.589529926 +0000 UTC m=+0.214699829 container init 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:07:33 np0005486759.ooo.test podman[324618]: 2025-10-14 10:07:33.59914051 +0000 UTC m=+0.224310403 container start 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS)
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq[324666]: started, version 2.85 cachesize 150
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq[324666]: DNS service limited to local subnets
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq[324666]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq[324666]: warning: no upstream servers configured
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq-dhcp[324666]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/addn_hosts - 0 addresses
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/host
Oct 14 10:07:33 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/opts
Oct 14 10:07:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:33.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:33.874 287366 INFO neutron.agent.dhcp.agent [None req-51e26f4c-9d71-4ec7-9f3a-bf0c025f94db - - - - - -] DHCP configuration for ports {'edf061e1-d560-49f1-a48d-0cebdf2b0fff', '0463beeb-3c9e-4a86-996e-e8aacef5efc3'} is completed
Oct 14 10:07:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:34.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:35 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:35.307 2 INFO neutron.agent.securitygroups_rpc [None req-2dac0d34-2f80-4bc4-be68-b05b78f92c4a e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:35.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:36 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:36.060 2 INFO neutron.agent.securitygroups_rpc [None req-3690a8dc-9653-4fff-a5a1-66bf882b954d e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:36 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:36.065 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:35Z, description=, device_id=3151e51e-75c8-43bd-9ccd-1ab1547aba71, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed0081c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7d57f0>], id=f414eacf-a8c2-4bd8-b54a-4cf62fd674c4, ip_allocation=immediate, mac_address=fa:16:3e:48:f0:e0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:07:29Z, description=, dns_domain=, id=8f12702f-3688-496d-8efe-d1c8c773af06, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1419838196, port_security_enabled=True, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53796, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2046, status=ACTIVE, subnets=['9d3a0396-6fce-4569-af6c-f970e7529bea'], tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:07:31Z, vlan_transparent=None, network_id=8f12702f-3688-496d-8efe-d1c8c773af06, port_security_enabled=False, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2064, status=DOWN, tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:07:35Z on network 8f12702f-3688-496d-8efe-d1c8c773af06
Oct 14 10:07:36 np0005486759.ooo.test dnsmasq[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/addn_hosts - 1 addresses
Oct 14 10:07:36 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/host
Oct 14 10:07:36 np0005486759.ooo.test podman[324692]: 2025-10-14 10:07:36.256760622 +0000 UTC m=+0.059176384 container kill 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:07:36 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/opts
Oct 14 10:07:36 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:07:36 np0005486759.ooo.test systemd[1]: tmp-crun.oOQLnH.mount: Deactivated successfully.
Oct 14 10:07:36 np0005486759.ooo.test podman[324707]: 2025-10-14 10:07:36.372643522 +0000 UTC m=+0.088124861 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 14 10:07:36 np0005486759.ooo.test podman[324707]: 2025-10-14 10:07:36.406354645 +0000 UTC m=+0.121835934 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:36 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:07:36 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:36.537 287366 INFO neutron.agent.dhcp.agent [None req-016e4774-33d9-4c1a-917e-16f0b834b0cf - - - - - -] DHCP configuration for ports {'f414eacf-a8c2-4bd8-b54a-4cf62fd674c4'} is completed
Oct 14 10:07:37 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:07:37 np0005486759.ooo.test podman[324730]: 2025-10-14 10:07:37.448803699 +0000 UTC m=+0.077299720 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:07:37 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:37.453 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:35Z, description=, device_id=3151e51e-75c8-43bd-9ccd-1ab1547aba71, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5ec580>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5ecd00>], id=f414eacf-a8c2-4bd8-b54a-4cf62fd674c4, ip_allocation=immediate, mac_address=fa:16:3e:48:f0:e0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:07:29Z, description=, dns_domain=, id=8f12702f-3688-496d-8efe-d1c8c773af06, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1419838196, port_security_enabled=True, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53796, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2046, status=ACTIVE, subnets=['9d3a0396-6fce-4569-af6c-f970e7529bea'], tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:07:31Z, vlan_transparent=None, network_id=8f12702f-3688-496d-8efe-d1c8c773af06, port_security_enabled=False, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2064, status=DOWN, tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:07:35Z on network 8f12702f-3688-496d-8efe-d1c8c773af06
Oct 14 10:07:37 np0005486759.ooo.test podman[324730]: 2025-10-14 10:07:37.485295027 +0000 UTC m=+0.113791008 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:07:37 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:07:37 np0005486759.ooo.test dnsmasq[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/addn_hosts - 1 addresses
Oct 14 10:07:37 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/host
Oct 14 10:07:37 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/opts
Oct 14 10:07:37 np0005486759.ooo.test podman[324771]: 2025-10-14 10:07:37.646234366 +0000 UTC m=+0.058042829 container kill 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:07:37 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:37.882 287366 INFO neutron.agent.dhcp.agent [None req-654beb60-48f4-4ede-aeb2-497d5e0f2e5e - - - - - -] DHCP configuration for ports {'f414eacf-a8c2-4bd8-b54a-4cf62fd674c4'} is completed
Oct 14 10:07:39 np0005486759.ooo.test dnsmasq[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/addn_hosts - 0 addresses
Oct 14 10:07:39 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/host
Oct 14 10:07:39 np0005486759.ooo.test podman[324809]: 2025-10-14 10:07:39.072321632 +0000 UTC m=+0.065266050 container kill 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:07:39 np0005486759.ooo.test dnsmasq-dhcp[324666]: read /var/lib/neutron/dhcp/8f12702f-3688-496d-8efe-d1c8c773af06/opts
Oct 14 10:07:39 np0005486759.ooo.test kernel: device tap5cc90713-bb left promiscuous mode
Oct 14 10:07:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:39Z|00257|binding|INFO|Releasing lport 5cc90713-bbb8-49a2-aaa0-b34382492925 from this chassis (sb_readonly=0)
Oct 14 10:07:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:39Z|00258|binding|INFO|Setting lport 5cc90713-bbb8-49a2-aaa0-b34382492925 down in Southbound
Oct 14 10:07:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:39.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.297 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-8f12702f-3688-496d-8efe-d1c8c773af06', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f12702f-3688-496d-8efe-d1c8c773af06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd795491f5c1d4cb0bdc37f8eea30c3f8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ef3c7bc-8a64-447f-b6c8-4b356ce46062, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=5cc90713-bbb8-49a2-aaa0-b34382492925) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.299 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 5cc90713-bbb8-49a2-aaa0-b34382492925 in datapath 8f12702f-3688-496d-8efe-d1c8c773af06 unbound from our chassis
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.301 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f12702f-3688-496d-8efe-d1c8c773af06 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.303 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[a30d39aa-2dad-4cc3-9213-83344a79c87c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:39.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:39.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.783 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:74:c2 2001:db8:0:1:f816:3eff:fe04:74c2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3728b3ee-a3aa-4d06-969e-85cd7ad44fa6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1e57820-cc57-4e14-acb6-c1eb8865310f) old=Port_Binding(mac=['fa:16:3e:04:74:c2 2001:db8::f816:3eff:fe04:74c2'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:74c2/64', 'neutron:device_id': 'ovnmeta-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7611020a-b462-493c-ab89-50ce86f0c0fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5b15a8a02c49648cdd9088cd23ebb4', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.786 183328 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1e57820-cc57-4e14-acb6-c1eb8865310f in datapath 7611020a-b462-493c-ab89-50ce86f0c0fc updated
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.789 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7611020a-b462-493c-ab89-50ce86f0c0fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:07:39 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:39.790 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3a6dd6-cc96-4cb7-b120-c47b9747675d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:40 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:40.431 2 INFO neutron.agent.securitygroups_rpc [None req-3c0f4bf8-908e-46b5-afc6-3bfcb8991389 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:40.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:41 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:41.319 2 INFO neutron.agent.securitygroups_rpc [None req-ba896242-93bb-4a4d-80de-df5d994ad4a9 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:41 np0005486759.ooo.test dnsmasq[324666]: exiting on receipt of SIGTERM
Oct 14 10:07:41 np0005486759.ooo.test systemd[1]: libpod-48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c.scope: Deactivated successfully.
Oct 14 10:07:41 np0005486759.ooo.test podman[324848]: 2025-10-14 10:07:41.823558342 +0000 UTC m=+0.060885476 container kill 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:07:41 np0005486759.ooo.test podman[324860]: 2025-10-14 10:07:41.889394779 +0000 UTC m=+0.053069587 container died 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:07:41 np0005486759.ooo.test systemd[1]: tmp-crun.Jw27Ib.mount: Deactivated successfully.
Oct 14 10:07:41 np0005486759.ooo.test podman[324860]: 2025-10-14 10:07:41.924157334 +0000 UTC m=+0.087832102 container cleanup 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009)
Oct 14 10:07:41 np0005486759.ooo.test systemd[1]: libpod-conmon-48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c.scope: Deactivated successfully.
Oct 14 10:07:41 np0005486759.ooo.test podman[324862]: 2025-10-14 10:07:41.974564638 +0000 UTC m=+0.132178060 container remove 48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f12702f-3688-496d-8efe-d1c8c773af06, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true)
Oct 14 10:07:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:07:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:07:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:07:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 132497 "" "Go-http-client/1.1"
Oct 14 10:07:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:07:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17162 "" "Go-http-client/1.1"
Oct 14 10:07:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:42.581 287366 INFO neutron.agent.dhcp.agent [None req-b39551c1-337c-4f9a-82ce-c230460983f7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:07:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:42.595 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:07:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-92633e4c96bd4a29a5a76734fcf701d6853232ae15965c69bf7c20fb7c693c5c-merged.mount: Deactivated successfully.
Oct 14 10:07:42 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48d8f65b730eefdd0deb69dbb18e489a3595ae4b0c14142ca5998094499f0b6c-userdata-shm.mount: Deactivated successfully.
Oct 14 10:07:42 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d8f12702f\x2d3688\x2d496d\x2d8efe\x2dd1c8c773af06.mount: Deactivated successfully.
Oct 14 10:07:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:42.973 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:07:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:43Z|00259|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:07:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:43.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:43 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:43.881 287366 INFO neutron.agent.linux.ip_lib [None req-b40fdaf0-04b6-4389-a2ff-2aaf1a64587b - - - - - -] Device tapcfdaf625-40 cannot be used as it has no MAC address
Oct 14 10:07:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:43.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:43 np0005486759.ooo.test kernel: device tapcfdaf625-40 entered promiscuous mode
Oct 14 10:07:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:43Z|00260|binding|INFO|Claiming lport cfdaf625-40ea-47f3-af7b-c345aee8a3a3 for this chassis.
Oct 14 10:07:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:43.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:43Z|00261|binding|INFO|cfdaf625-40ea-47f3-af7b-c345aee8a3a3: Claiming unknown
Oct 14 10:07:43 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436463.9121] manager: (tapcfdaf625-40): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Oct 14 10:07:43 np0005486759.ooo.test systemd-udevd[324899]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:07:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:43.929 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-5a1489f9-472c-4912-96de-235eac5cddeb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a1489f9-472c-4912-96de-235eac5cddeb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '710bf5711484449682775e44dbb1ee9d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9939699d-c2e3-4698-92f1-83a00cfd9bab, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=cfdaf625-40ea-47f3-af7b-c345aee8a3a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:43.931 183328 INFO neutron.agent.ovn.metadata.agent [-] Port cfdaf625-40ea-47f3-af7b-c345aee8a3a3 in datapath 5a1489f9-472c-4912-96de-235eac5cddeb bound to our chassis
Oct 14 10:07:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:43.932 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5a1489f9-472c-4912-96de-235eac5cddeb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:07:43 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:43.936 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[052e8cab-38ed-4f38-98b2-c6366afc2245]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:43.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:43Z|00262|binding|INFO|Setting lport cfdaf625-40ea-47f3-af7b-c345aee8a3a3 ovn-installed in OVS
Oct 14 10:07:43 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:43Z|00263|binding|INFO|Setting lport cfdaf625-40ea-47f3-af7b-c345aee8a3a3 up in Southbound
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:43.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:43.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapcfdaf625-40: No such device
Oct 14 10:07:43 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:43.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:07:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:07:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:07:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:07:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:07:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:07:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:07:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:07:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:44.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:44.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:44 np0005486759.ooo.test podman[324970]: 
Oct 14 10:07:44 np0005486759.ooo.test podman[324970]: 2025-10-14 10:07:44.874822063 +0000 UTC m=+0.080005552 container create 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0)
Oct 14 10:07:44 np0005486759.ooo.test systemd[1]: Started libpod-conmon-1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90.scope.
Oct 14 10:07:44 np0005486759.ooo.test systemd[1]: tmp-crun.zDiKsU.mount: Deactivated successfully.
Oct 14 10:07:44 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:07:44 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/354160576dfec1fbe856c00446aa8a2f8eefb8697b1e4a4e4e60b982b1c46f38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:07:44 np0005486759.ooo.test podman[324970]: 2025-10-14 10:07:44.835269451 +0000 UTC m=+0.040452960 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:07:44 np0005486759.ooo.test podman[324970]: 2025-10-14 10:07:44.94393882 +0000 UTC m=+0.149122299 container init 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009)
Oct 14 10:07:44 np0005486759.ooo.test podman[324970]: 2025-10-14 10:07:44.953119681 +0000 UTC m=+0.158303170 container start 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq[324989]: started, version 2.85 cachesize 150
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq[324989]: DNS service limited to local subnets
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq[324989]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq[324989]: warning: no upstream servers configured
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq-dhcp[324989]: DHCP, static leases only on 10.103.0.0, lease time 1d
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/addn_hosts - 0 addresses
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/host
Oct 14 10:07:44 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/opts
Oct 14 10:07:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:45.116 287366 INFO neutron.agent.dhcp.agent [None req-d614d773-0354-4936-870c-643ca987ec20 - - - - - -] DHCP configuration for ports {'53fc835a-02fd-428f-9fe1-723ab45ed370'} is completed
Oct 14 10:07:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:45.395 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:45Z, description=, device_id=9adcf0e9-5a69-4f9e-9f60-62e9f4081030, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec590460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec590610>], id=f9ab1cb5-85de-49ed-865c-b9bfdc7950f0, ip_allocation=immediate, mac_address=fa:16:3e:5b:c7:69, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:07:40Z, description=, dns_domain=, id=5a1489f9-472c-4912-96de-235eac5cddeb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1199762238, port_security_enabled=True, project_id=710bf5711484449682775e44dbb1ee9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3529, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2071, status=ACTIVE, subnets=['f55299a5-8c19-4943-933a-efc2b90027e8'], tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:42Z, vlan_transparent=None, network_id=5a1489f9-472c-4912-96de-235eac5cddeb, port_security_enabled=False, project_id=710bf5711484449682775e44dbb1ee9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2082, status=DOWN, tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:45Z on network 5a1489f9-472c-4912-96de-235eac5cddeb
Oct 14 10:07:45 np0005486759.ooo.test dnsmasq[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/addn_hosts - 1 addresses
Oct 14 10:07:45 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/host
Oct 14 10:07:45 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/opts
Oct 14 10:07:45 np0005486759.ooo.test podman[325008]: 2025-10-14 10:07:45.598391468 +0000 UTC m=+0.055402188 container kill 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:07:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:45.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:45.848 287366 INFO neutron.agent.dhcp.agent [None req-7cd811bf-9ea2-4a95-8eff-2544f631d28a - - - - - -] DHCP configuration for ports {'f9ab1cb5-85de-49ed-865c-b9bfdc7950f0'} is completed
Oct 14 10:07:46 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:46.016 2 INFO neutron.agent.securitygroups_rpc [None req-2c06f099-b3c9-467c-8bdf-8489a4e18aa9 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:46 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:46.350 2 INFO neutron.agent.securitygroups_rpc [None req-4dddf0b2-d5e6-4f23-808f-78aa8db42213 e1c090c8694c447b95f0e7dad1e1047c fe5b15a8a02c49648cdd9088cd23ebb4 - - default default] Security group member updated ['cd03e991-2cd4-466f-a3db-acfce81ca1be']
Oct 14 10:07:46 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:46.548 2 INFO neutron.agent.securitygroups_rpc [None req-3caaab92-dd27-49ea-9ec3-28e3a067a698 be51b1781f0540818bc435f8d0cc527b d795491f5c1d4cb0bdc37f8eea30c3f8 - - default default] Security group member updated ['62f89a8f-1b5c-4a5e-a7ba-04cfc0de2960']
Oct 14 10:07:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:46.557 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:45Z, description=, device_id=9adcf0e9-5a69-4f9e-9f60-62e9f4081030, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec59c1c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec61e520>], id=f9ab1cb5-85de-49ed-865c-b9bfdc7950f0, ip_allocation=immediate, mac_address=fa:16:3e:5b:c7:69, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:07:40Z, description=, dns_domain=, id=5a1489f9-472c-4912-96de-235eac5cddeb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1199762238, port_security_enabled=True, project_id=710bf5711484449682775e44dbb1ee9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3529, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2071, status=ACTIVE, subnets=['f55299a5-8c19-4943-933a-efc2b90027e8'], tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:42Z, vlan_transparent=None, network_id=5a1489f9-472c-4912-96de-235eac5cddeb, port_security_enabled=False, project_id=710bf5711484449682775e44dbb1ee9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2082, status=DOWN, tags=[], tenant_id=710bf5711484449682775e44dbb1ee9d, updated_at=2025-10-14T10:07:45Z on network 5a1489f9-472c-4912-96de-235eac5cddeb
Oct 14 10:07:46 np0005486759.ooo.test dnsmasq[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/addn_hosts - 1 addresses
Oct 14 10:07:46 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/host
Oct 14 10:07:46 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/opts
Oct 14 10:07:46 np0005486759.ooo.test podman[325043]: 2025-10-14 10:07:46.77527355 +0000 UTC m=+0.058869364 container kill 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:46 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:46.996 287366 INFO neutron.agent.dhcp.agent [None req-3f191cb5-b9c2-4d1b-a96b-3a0c4a79ee51 - - - - - -] DHCP configuration for ports {'f9ab1cb5-85de-49ed-865c-b9bfdc7950f0'} is completed
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:07:49 np0005486759.ooo.test podman[325065]: 2025-10-14 10:07:49.463916553 +0000 UTC m=+0.090187835 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:07:49 np0005486759.ooo.test podman[325065]: 2025-10-14 10:07:49.49845907 +0000 UTC m=+0.124730342 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:07:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30269 DF PROTO=TCP SPT=46098 DPT=9102 SEQ=2114472104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA77CF70000000001030307) 
Oct 14 10:07:49 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:49.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:07:49 np0005486759.ooo.test podman[325068]: 2025-10-14 10:07:49.574951023 +0000 UTC m=+0.191717153 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:07:49 np0005486759.ooo.test podman[325067]: 2025-10-14 10:07:49.624194852 +0000 UTC m=+0.244035306 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 14 10:07:49 np0005486759.ooo.test podman[325068]: 2025-10-14 10:07:49.637876021 +0000 UTC m=+0.254642151 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Oct 14 10:07:49 np0005486759.ooo.test podman[325067]: 2025-10-14 10:07:49.641633746 +0000 UTC m=+0.261474200 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009)
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:07:49 np0005486759.ooo.test podman[325066]: 2025-10-14 10:07:49.53955915 +0000 UTC m=+0.162861680 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 14 10:07:49 np0005486759.ooo.test podman[325066]: 2025-10-14 10:07:49.719801061 +0000 UTC m=+0.343103601 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:07:49 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:07:50 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:07:50.262 2 INFO neutron.agent.securitygroups_rpc [None req-2a99b282-2b20-49f3-bf9c-bf312793e89d be51b1781f0540818bc435f8d0cc527b d795491f5c1d4cb0bdc37f8eea30c3f8 - - default default] Security group member updated ['62f89a8f-1b5c-4a5e-a7ba-04cfc0de2960']
Oct 14 10:07:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30270 DF PROTO=TCP SPT=46098 DPT=9102 SEQ=2114472104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA781010000000001030307) 
Oct 14 10:07:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:50.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:51 np0005486759.ooo.test dnsmasq[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/addn_hosts - 0 addresses
Oct 14 10:07:51 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/host
Oct 14 10:07:51 np0005486759.ooo.test dnsmasq-dhcp[324989]: read /var/lib/neutron/dhcp/5a1489f9-472c-4912-96de-235eac5cddeb/opts
Oct 14 10:07:51 np0005486759.ooo.test podman[325162]: 2025-10-14 10:07:51.483019524 +0000 UTC m=+0.053962804 container kill 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:51 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:51Z|00264|binding|INFO|Releasing lport cfdaf625-40ea-47f3-af7b-c345aee8a3a3 from this chassis (sb_readonly=0)
Oct 14 10:07:51 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:51Z|00265|binding|INFO|Setting lport cfdaf625-40ea-47f3-af7b-c345aee8a3a3 down in Southbound
Oct 14 10:07:51 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:51.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:51 np0005486759.ooo.test kernel: device tapcfdaf625-40 left promiscuous mode
Oct 14 10:07:51 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:51.691 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-5a1489f9-472c-4912-96de-235eac5cddeb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a1489f9-472c-4912-96de-235eac5cddeb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '710bf5711484449682775e44dbb1ee9d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9939699d-c2e3-4698-92f1-83a00cfd9bab, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=cfdaf625-40ea-47f3-af7b-c345aee8a3a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:51 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:51.693 183328 INFO neutron.agent.ovn.metadata.agent [-] Port cfdaf625-40ea-47f3-af7b-c345aee8a3a3 in datapath 5a1489f9-472c-4912-96de-235eac5cddeb unbound from our chassis
Oct 14 10:07:51 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:51.697 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5a1489f9-472c-4912-96de-235eac5cddeb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:07:51 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:51.698 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e7705d-7d8e-44f5-9d7c-e2c476fec2e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:51 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:51.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30271 DF PROTO=TCP SPT=46098 DPT=9102 SEQ=2114472104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA789010000000001030307) 
Oct 14 10:07:52 np0005486759.ooo.test dnsmasq[324989]: exiting on receipt of SIGTERM
Oct 14 10:07:52 np0005486759.ooo.test podman[325203]: 2025-10-14 10:07:52.806759645 +0000 UTC m=+0.054029696 container kill 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009)
Oct 14 10:07:52 np0005486759.ooo.test systemd[1]: libpod-1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90.scope: Deactivated successfully.
Oct 14 10:07:52 np0005486759.ooo.test podman[325217]: 2025-10-14 10:07:52.861508231 +0000 UTC m=+0.039333355 container died 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:07:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90-userdata-shm.mount: Deactivated successfully.
Oct 14 10:07:52 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-354160576dfec1fbe856c00446aa8a2f8eefb8697b1e4a4e4e60b982b1c46f38-merged.mount: Deactivated successfully.
Oct 14 10:07:52 np0005486759.ooo.test podman[325217]: 2025-10-14 10:07:52.910912135 +0000 UTC m=+0.088737229 container remove 1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a1489f9-472c-4912-96de-235eac5cddeb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:07:52 np0005486759.ooo.test systemd[1]: libpod-conmon-1225f10d96b29144671b2c93d141f332f617cf1e6fc68a3d85a891fcc63bdb90.scope: Deactivated successfully.
Oct 14 10:07:52 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d5a1489f9\x2d472c\x2d4912\x2d96de\x2d235eac5cddeb.mount: Deactivated successfully.
Oct 14 10:07:52 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:52.975 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:07:53 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:53.566 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:07:53 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:53Z|00266|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:07:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:54.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:54.173 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:07:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:54.174 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:07:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:54.175 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:07:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:54.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:55.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:56 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:56Z|00267|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:07:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:56.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30272 DF PROTO=TCP SPT=46098 DPT=9102 SEQ=2114472104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA798C10000000001030307) 
Oct 14 10:07:57 np0005486759.ooo.test dnsmasq[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/addn_hosts - 0 addresses
Oct 14 10:07:57 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/host
Oct 14 10:07:57 np0005486759.ooo.test dnsmasq-dhcp[324500]: read /var/lib/neutron/dhcp/c0dad70c-e24d-4510-839c-2316ed4bb5b1/opts
Oct 14 10:07:57 np0005486759.ooo.test podman[325260]: 2025-10-14 10:07:57.756571055 +0000 UTC m=+0.056192923 container kill e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3)
Oct 14 10:07:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:07:57 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:07:57 np0005486759.ooo.test podman[325274]: 2025-10-14 10:07:57.861616252 +0000 UTC m=+0.080139055 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Oct 14 10:07:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:57.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:57 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:57Z|00268|binding|INFO|Releasing lport d9d8ab7a-5731-437c-ac73-0b31e88134b9 from this chassis (sb_readonly=0)
Oct 14 10:07:57 np0005486759.ooo.test kernel: device tapd9d8ab7a-57 left promiscuous mode
Oct 14 10:07:57 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:57Z|00269|binding|INFO|Setting lport d9d8ab7a-5731-437c-ac73-0b31e88134b9 down in Southbound
Oct 14 10:07:57 np0005486759.ooo.test podman[325273]: 2025-10-14 10:07:57.912117189 +0000 UTC m=+0.130125437 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 10:07:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:57.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:57 np0005486759.ooo.test podman[325274]: 2025-10-14 10:07:57.928816421 +0000 UTC m=+0.147339174 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Oct 14 10:07:57 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:07:57 np0005486759.ooo.test podman[325273]: 2025-10-14 10:07:57.973294453 +0000 UTC m=+0.191302681 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:07:57 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:07:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:58.093 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-c0dad70c-e24d-4510-839c-2316ed4bb5b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0dad70c-e24d-4510-839c-2316ed4bb5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '710bf5711484449682775e44dbb1ee9d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3dbf6691-01d3-4d01-b1a4-a9b810617859, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=d9d8ab7a-5731-437c-ac73-0b31e88134b9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:07:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:58.097 183328 INFO neutron.agent.ovn.metadata.agent [-] Port d9d8ab7a-5731-437c-ac73-0b31e88134b9 in datapath c0dad70c-e24d-4510-839c-2316ed4bb5b1 unbound from our chassis
Oct 14 10:07:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:58.100 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0dad70c-e24d-4510-839c-2316ed4bb5b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:07:58 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:07:58.101 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0dda93-3f07-4095-82f0-a4fc963a0a69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:07:58 np0005486759.ooo.test podman[325345]: 2025-10-14 10:07:58.852147856 +0000 UTC m=+0.057310237 container kill e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 14 10:07:58 np0005486759.ooo.test dnsmasq[324500]: exiting on receipt of SIGTERM
Oct 14 10:07:58 np0005486759.ooo.test systemd[1]: libpod-e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840.scope: Deactivated successfully.
Oct 14 10:07:58 np0005486759.ooo.test podman[325358]: 2025-10-14 10:07:58.924528493 +0000 UTC m=+0.056589114 container died e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:07:58 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840-userdata-shm.mount: Deactivated successfully.
Oct 14 10:07:58 np0005486759.ooo.test podman[325358]: 2025-10-14 10:07:58.955938845 +0000 UTC m=+0.087999396 container cleanup e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:07:58 np0005486759.ooo.test systemd[1]: libpod-conmon-e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840.scope: Deactivated successfully.
Oct 14 10:07:59 np0005486759.ooo.test podman[325359]: 2025-10-14 10:07:58.999803929 +0000 UTC m=+0.127594190 container remove e9649fe9263332bf5da7e062f6c2c4a7366db2a6e133afc88883f408d364c840 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dad70c-e24d-4510-839c-2316ed4bb5b1, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:07:59 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:59.066 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:07:59 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:59.261 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:07:59 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:07:59Z|00270|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:07:59 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:59.441 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:07:59Z, description=, device_id=470b8128-c483-4733-8d33-a692539168cb, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec787400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec654ca0>], id=f109614e-ac0c-476c-b57f-4de93e90ec71, ip_allocation=immediate, mac_address=fa:16:3e:d8:74:07, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2105, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:07:59Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:07:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:59.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:07:59.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:07:59 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:07:59 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:07:59 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:07:59 np0005486759.ooo.test podman[325404]: 2025-10-14 10:07:59.68994459 +0000 UTC m=+0.061342520 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:07:59 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-121ac324a543daa0c9d65579c3ea18cb526e200b666b3c4c2f37d8401b135f14-merged.mount: Deactivated successfully.
Oct 14 10:07:59 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2dc0dad70c\x2de24d\x2d4510\x2d839c\x2d2316ed4bb5b1.mount: Deactivated successfully.
Oct 14 10:07:59 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:07:59.934 287366 INFO neutron.agent.dhcp.agent [None req-0500cdff-9da2-4bef-b257-b0695e40dd50 - - - - - -] DHCP configuration for ports {'f109614e-ac0c-476c-b57f-4de93e90ec71'} is completed
Oct 14 10:08:00 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:00.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:01 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:01.492 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:08:01Z, description=, device_id=b136eb9d-67db-4329-9a5f-83c085cc3421, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec61ad60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed0081c0>], id=d16a51b1-2833-44e0-b148-2871c6ac0929, ip_allocation=immediate, mac_address=fa:16:3e:74:e1:80, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2107, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:08:01Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:08:01 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:08:01 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:08:01 np0005486759.ooo.test podman[325443]: 2025-10-14 10:08:01.796055808 +0000 UTC m=+0.057218295 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:08:01 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:08:02 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:02.037 287366 INFO neutron.agent.dhcp.agent [None req-06f609ea-1bd7-4b92-9e5c-222a1cc0ee43 - - - - - -] DHCP configuration for ports {'d16a51b1-2833-44e0-b148-2871c6ac0929'} is completed
Oct 14 10:08:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:04.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:05.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:07 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:08:07 np0005486759.ooo.test systemd[1]: tmp-crun.o1FvZC.mount: Deactivated successfully.
Oct 14 10:08:07 np0005486759.ooo.test podman[325464]: 2025-10-14 10:08:07.443391445 +0000 UTC m=+0.071608175 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 14 10:08:07 np0005486759.ooo.test podman[325464]: 2025-10-14 10:08:07.452429841 +0000 UTC m=+0.080646592 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct 14 10:08:07 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:08:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:08:08 np0005486759.ooo.test podman[325480]: 2025-10-14 10:08:08.440579882 +0000 UTC m=+0.068535221 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 14 10:08:08 np0005486759.ooo.test podman[325480]: 2025-10-14 10:08:08.452244719 +0000 UTC m=+0.080200058 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 10:08:08 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:08:09 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:09.334 2 INFO neutron.agent.securitygroups_rpc [None req-4fb262fc-e43e-45c1-96be-5c97c45e97c2 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:09.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:10 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:08:10 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:08:10 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:08:10 np0005486759.ooo.test podman[325518]: 2025-10-14 10:08:10.060071253 +0000 UTC m=+0.048300931 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:08:10 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:10.143 2 INFO neutron.agent.securitygroups_rpc [None req-d3b9841e-c254-42f5-b42b-ac7f7ab0c4d5 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:10.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:08:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:08:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:08:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:08:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:08:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16694 "" "Go-http-client/1.1"
Oct 14 10:08:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:08:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:08:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:08:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:08:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:08:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:08:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:08:14 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:14.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:15.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:19.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:19.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22948 DF PROTO=TCP SPT=42800 DPT=9102 SEQ=352823874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA7F2270000000001030307) 
Oct 14 10:08:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:19.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:20 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:20.107 2 INFO neutron.agent.securitygroups_rpc [None req-c3a36032-03b8-4038-a760-e3c95b90be63 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:20.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:08:20 np0005486759.ooo.test podman[325541]: 2025-10-14 10:08:20.475995588 +0000 UTC m=+0.095373793 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:08:20 np0005486759.ooo.test podman[325541]: 2025-10-14 10:08:20.481671502 +0000 UTC m=+0.101049737 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:08:20 np0005486759.ooo.test podman[325542]: 2025-10-14 10:08:20.526039641 +0000 UTC m=+0.142050573 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 14 10:08:20 np0005486759.ooo.test podman[325542]: 2025-10-14 10:08:20.532432526 +0000 UTC m=+0.148443488 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid)
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:08:20 np0005486759.ooo.test podman[325544]: 2025-10-14 10:08:20.548802598 +0000 UTC m=+0.158116985 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:08:20 np0005486759.ooo.test podman[325544]: 2025-10-14 10:08:20.555245045 +0000 UTC m=+0.164559422 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:08:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22949 DF PROTO=TCP SPT=42800 DPT=9102 SEQ=352823874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA7F6410000000001030307) 
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:08:20 np0005486759.ooo.test podman[325543]: 2025-10-14 10:08:20.636879696 +0000 UTC m=+0.248262876 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 10:08:20 np0005486759.ooo.test podman[325543]: 2025-10-14 10:08:20.647860132 +0000 UTC m=+0.259243322 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd)
Oct 14 10:08:20 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:08:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:20.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:21.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:21.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.217 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.218 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.218 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.218 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.282 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.351 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.353 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.431 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.432 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.507 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.508 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.564 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:08:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22950 DF PROTO=TCP SPT=42800 DPT=9102 SEQ=352823874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA7FE410000000001030307) 
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.745 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.746 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12262MB free_disk=386.677433013916GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.747 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.747 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.863 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.863 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.863 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.902 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.917 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.918 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:08:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:22.919 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:08:23 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:23.897 2 INFO neutron.agent.securitygroups_rpc [None req-3a30d53d-de16-4cee-ac55-002472656610 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:23 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:23.927 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.454 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.454 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.458 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58acd1a0-488e-4cd9-a1c3-dee73d2d7616', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.455022', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b87bda84-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': '280326d4d70b748eedb16c13580d41f9ff861e55fec1c4844ff52b1b165438b5'}]}, 'timestamp': '2025-10-14 10:08:24.459056', '_unique_id': 'bbc3ba5751ce40e8bcf9720457d3dc6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.460 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.461 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.493 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 739626512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.494 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 60612298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '621e7170-9f2d-4ad0-a202-d8c191d92679', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739626512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.462004', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b88133bc-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '17979805e2095259a2133bce1d628cc7286bf73ebcdbc868e852d4b852744b99'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60612298, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.462004', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b88148e8-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '56162d6218320ae288323846ac78f0a1b6d1544690e71b7991b8be7bcc5ae918'}]}, 'timestamp': '2025-10-14 10:08:24.494531', '_unique_id': '8f57bb4b4ec141d6b5dd1de9266481f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.495 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.513 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.514 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2539ddaa-8d14-4c48-a9bd-9a0178e1fa64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.497386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8845786-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.692489852, 'message_signature': 'fabd02d1c1003675312a585ebfe4de594add85f1ef3891d8b71dfe673c4817a3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.497386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8846a8c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.692489852, 'message_signature': '89941909c9f97e23a597c01ca349feb18260c2ef4017ce233b59d891829386cc'}]}, 'timestamp': '2025-10-14 10:08:24.515088', '_unique_id': 'cd60b3f5dd92465c8f26ad1c596a092a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.516 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.517 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.518 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31326208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.518 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc8a9144-bfde-4b70-9c03-7db9130f5a38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31326208, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.517939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b884ee44-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.692489852, 'message_signature': '2fc276ed19c0ba7b5fd0d90fca79b16cc780ee5f7c6fc438b0833d685b449e64'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.517939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8850032-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.692489852, 'message_signature': '5bc7857f17921ca0e26adac6f8fbe7e2b572aa12aeedfff24b7ac97435b2e751'}]}, 'timestamp': '2025-10-14 10:08:24.518873', '_unique_id': '5809345cfc7d41aa8e41d0ae734ae9b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.519 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.521 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.521 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce0ad4d3-1952-4733-88c1-bf2044ed1efe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.521215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8856d06-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': 'bd2a76db6973423a7c92b7f33978432a042ff2a8bef3d6a2843cae01896c68b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.521215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8857e0e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '6aaf1765901d1d0c773419a238a72b9bde62c33039cdc0dae1cf5879ed36c0bc'}]}, 'timestamp': '2025-10-14 10:08:24.522125', '_unique_id': 'd6d5a4e0d00f4fd38912daea9ead88c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.523 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.524 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.524 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cd7066f-5a46-418c-a8e6-5b1014d10a4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.524523', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b885ef2e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': '52412603b74332358ea76fa9e6a0aa5dd92d3110410cafb16dbd54d5cc042bd7'}]}, 'timestamp': '2025-10-14 10:08:24.525049', '_unique_id': '7afa6217df6c48b78321607623b27316'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.526 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.527 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a42b038-7f93-4ecb-ac08-1c1be9735952', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.527276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8865996-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': 'dc945a865f4899f33388a4c64aff938a13ba0d09e84efefad3f1f7bd99ad9ee9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.527276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8866c56-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '250f792c9d955a19a79fd016c9f1943cc2cce4aa924490c6ae0276d76ad64d74'}]}, 'timestamp': '2025-10-14 10:08:24.528197', '_unique_id': '32552bbbd5fc4b0190626b848501edea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.529 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.530 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.530 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.530 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 67767064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.531 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 492064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59ed04b0-a831-4cac-898d-e1f124da93cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67767064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.530623', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b886dd12-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '68f648b83263df2826b0bd98349e7ad195e39f389190f3cffaa09e0d6522d607'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.530623', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b886eef6-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '6deb27402bb437acc7ae520c7d01d8eb65dee1530aaae95455088536c0d4b24f'}]}, 'timestamp': '2025-10-14 10:08:24.531538', '_unique_id': 'dc6f6fcf41cd45f5aa465d16987376d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.532 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.533 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 8721 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '310ef99f-0efc-45dc-862e-cb91d857f135', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8721, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.533813', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b8875c10-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': '4dc47fcfab86b0c54d68438257f1d73964c7855f940814a937aebdf11e8d5730'}]}, 'timestamp': '2025-10-14 10:08:24.534364', '_unique_id': 'a8d800b3653944b5a2e303a59a41a24a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.535 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.536 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.536 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.537 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4aa9318-ad7a-4066-9086-07625080a27f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.536721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b887ceb6-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.692489852, 'message_signature': '03a731ee58700eed49e001da56164cc16c54b1b241ee5d828b294e3eb1fc7f39'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.536721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b887e25c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.692489852, 'message_signature': '246269201377c05f5ca808f62541be89d0412c6c6a32e1b56fbc15bb8656e563'}]}, 'timestamp': '2025-10-14 10:08:24.537774', '_unique_id': '7fdcbd911612471085a25a7d76c2fb10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.540 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a23d133-c1b0-4183-acd0-18ffe05ea2b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.540164', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b88855e8-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': 'c1e27ef7dc2e775db42187204651f1a1dfef09b3fab67c08e70486692dfbc399'}]}, 'timestamp': '2025-10-14 10:08:24.540796', '_unique_id': 'a3fc33fe2634412b935fb09bc1b33d0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.541 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.543 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a43f92b-c12f-4882-8f3f-eb3b3e32f0ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.543180', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b888ca3c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': 'e26475aebfcbc3b70e9a46a3b8a36a5ad81b22b82c25f7c01be796e6526c9e93'}]}, 'timestamp': '2025-10-14 10:08:24.543771', '_unique_id': 'b6857c58b0c14594a54c5a4674eea8af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.545 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.545 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3c0726c-2290-4962-92fb-44e95a9d1e0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.545871', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b889304e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': '5d843842c991771dd7868fa99c4c5ac28d62e283f103433208caa4cad1439cd8'}]}, 'timestamp': '2025-10-14 10:08:24.546300', '_unique_id': '8d9e74d76f73424bb9353bcfd4f1b636'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.547 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.547 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.547 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.548 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.548 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd90c1dd-b5b2-4946-8923-acf9916324ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.548080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8898544-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '1cfe2d4257068d063026f16fea5bbb339f3e5f1fa6e5609e8811e92d992e0a7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.548080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8899232-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': 'fb366043c738be6a9e63317082295bc1558ed95a4042ca6097013444b9b1c1bc'}]}, 'timestamp': '2025-10-14 10:08:24.548763', '_unique_id': '5649ec67d24a41018f973d224c9a5d24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.549 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.550 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.569 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 51.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60125685-59f2-4ad0-93ac-51ea081846be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.8203125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:08:24.550504', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b88ce02c-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.764855179, 'message_signature': '3eeb16f022ec5fe0a989e243f26a7f9a04ea50b7451b7f35f8b1ae31da0bbf14'}]}, 'timestamp': '2025-10-14 10:08:24.570533', '_unique_id': 'bdb108316a284d57bc58648cde587a52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.571 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.573 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.573 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5422938a-9bbb-4636-ae91-8e20ea00e078', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.573281', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b88d5f66-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': '3cdebe0d9ae3d2aa58b4225e5875c5a2bbe405ec69f990b546fb77199e96b257'}]}, 'timestamp': '2025-10-14 10:08:24.573768', '_unique_id': '76adb2c156494f1696117d40f8bea610'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.574 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.575 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.576 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 13750000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6c9cd64-d36b-458b-8d7c-79611ed602e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13750000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:08:24.575986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b88dc8de-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.764855179, 'message_signature': 'd0c1372a79c52388d10c445e41441ed111e52aacdc1951d303cb265aaa8ca497'}]}, 'timestamp': '2025-10-14 10:08:24.576475', '_unique_id': '6aeb280186d048cc96de322d4c3bbb94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.577 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.578 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.578 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e573db8-843f-4cae-b66b-02d0f867006e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.578659', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b88e313e-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': '4c70fb1a120822f574b2082f51b0feea19d08135ff5bed38fcd23c9fa026825d'}]}, 'timestamp': '2025-10-14 10:08:24.579185', '_unique_id': '85aad9895d8d4fdd9f688cf3d8e71756'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.580 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.581 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.581 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.581 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b2046b5-682d-49ac-831d-1c33eddb20ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:08:24.581424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b88e9d04-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '48bdd168c1e704bca5f59e52b0dfac5ec6902e8d1232ce7a0ecbf5ae625d6993'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:08:24.581424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b88eb1c2-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.657114639, 'message_signature': '8aaea2fa4019900d3b528e957b6388e8f88a7b2ff11c61813cdb60cb67722b67'}]}, 'timestamp': '2025-10-14 10:08:24.582403', '_unique_id': 'e5ac5090c2864a07a90579e9d99f2024'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.583 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.584 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.584 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bab779c-05b6-4617-af23-973c6dad70f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.584665', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b88f1bee-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': '77f9a67d716e38a70a0068f5ca1193402a3f823aa9ac20cf441175a8520c12c8'}]}, 'timestamp': '2025-10-14 10:08:24.585170', '_unique_id': '25cf620b18404c46b9ff82598883f9d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.586 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.587 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.587 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11f8ca3f-f38b-4559-81b9-930b065dbb35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:08:24.587514', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': 'b88f8afc-a8e5-11f0-b515-fa163eba5220', 'monotonic_time': 12524.650132534, 'message_signature': 'a84f99922e7eaa7ad4aa55589973294154649dbfd62282c3f24f511de5253cbb'}]}, 'timestamp': '2025-10-14 10:08:24.588023', '_unique_id': '1a38b591d3d346ff861686cb7a21c53c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:08:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:08:24.589 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:08:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:24.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:24.919 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.189 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.189 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.273 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.273 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.274 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.274 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:08:25 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:25.303 2 INFO neutron.agent.securitygroups_rpc [None req-399f6b29-a9b2-4d11-b6f7-60667cf34163 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.912 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.931 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.931 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.931 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:08:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:25.932 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:08:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22951 DF PROTO=TCP SPT=42800 DPT=9102 SEQ=352823874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA80E010000000001030307) 
Oct 14 10:08:27 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:27.216 2 INFO neutron.agent.securitygroups_rpc [None req-949845d8-9351-4a7c-b845-c93a747a349d 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:27 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:27.236 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:08:28 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:08:28 np0005486759.ooo.test podman[325636]: 2025-10-14 10:08:28.452365051 +0000 UTC m=+0.080848368 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Oct 14 10:08:28 np0005486759.ooo.test podman[325636]: 2025-10-14 10:08:28.464357708 +0000 UTC m=+0.092840975 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc.)
Oct 14 10:08:28 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:08:28 np0005486759.ooo.test podman[325635]: 2025-10-14 10:08:28.519225639 +0000 UTC m=+0.147722956 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:08:28 np0005486759.ooo.test podman[325635]: 2025-10-14 10:08:28.549315011 +0000 UTC m=+0.177812328 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Oct 14 10:08:28 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:08:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:29.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:29.944 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:08:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:29.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:29.946 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:08:30 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:30.216 287366 INFO neutron.agent.linux.ip_lib [None req-d613aecb-6e64-4452-bf09-f0ec46e14ac2 - - - - - -] Device tap3e789bb4-fa cannot be used as it has no MAC address
Oct 14 10:08:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:30.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:30 np0005486759.ooo.test kernel: device tap3e789bb4-fa entered promiscuous mode
Oct 14 10:08:30 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436510.2500] manager: (tap3e789bb4-fa): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Oct 14 10:08:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:30.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:30 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:30Z|00271|binding|INFO|Claiming lport 3e789bb4-fad8-4834-9352-cd3395541c1d for this chassis.
Oct 14 10:08:30 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:30Z|00272|binding|INFO|3e789bb4-fad8-4834-9352-cd3395541c1d: Claiming unknown
Oct 14 10:08:30 np0005486759.ooo.test systemd-udevd[325690]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:08:30 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:30Z|00273|binding|INFO|Setting lport 3e789bb4-fad8-4834-9352-cd3395541c1d ovn-installed in OVS
Oct 14 10:08:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:30.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:30 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:30Z|00274|binding|INFO|Setting lport 3e789bb4-fad8-4834-9352-cd3395541c1d up in Southbound
Oct 14 10:08:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:30.322 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-eef03828-48cc-412d-af38-8078518b7ee6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eef03828-48cc-412d-af38-8078518b7ee6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '604e5517ad91430ebd9bac26d0e454e9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc27da97-986f-4f92-b9e7-2b42a83da0bd, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=3e789bb4-fad8-4834-9352-cd3395541c1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:08:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:30.323 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 3e789bb4-fad8-4834-9352-cd3395541c1d in datapath eef03828-48cc-412d-af38-8078518b7ee6 bound to our chassis
Oct 14 10:08:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:30.324 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eef03828-48cc-412d-af38-8078518b7ee6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:08:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:30.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:30 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:30.326 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[52d47874-7385-48ef-8685-e988efbab90d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:08:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:30.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:30.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:31 np0005486759.ooo.test podman[325745]: 2025-10-14 10:08:31.255304085 +0000 UTC m=+0.089291256 container create ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:08:31 np0005486759.ooo.test systemd[1]: Started libpod-conmon-ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab.scope.
Oct 14 10:08:31 np0005486759.ooo.test podman[325745]: 2025-10-14 10:08:31.216421154 +0000 UTC m=+0.050408365 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:08:31 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:08:31 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66ed934d30b89f2c1d36eaa96b18146a2faf1babafc352393b4d10072db633a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:08:31 np0005486759.ooo.test podman[325745]: 2025-10-14 10:08:31.332333744 +0000 UTC m=+0.166320905 container init ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 14 10:08:31 np0005486759.ooo.test podman[325745]: 2025-10-14 10:08:31.338919137 +0000 UTC m=+0.172906298 container start ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq[325764]: started, version 2.85 cachesize 150
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq[325764]: DNS service limited to local subnets
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq[325764]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq[325764]: warning: no upstream servers configured
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq-dhcp[325764]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/addn_hosts - 0 addresses
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/host
Oct 14 10:08:31 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/opts
Oct 14 10:08:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:31.706 287366 INFO neutron.agent.dhcp.agent [None req-fac22bbf-f48b-4069-99c8-2a843176b934 - - - - - -] DHCP configuration for ports {'4cf24b95-0cac-4da2-a85e-d765ca2823e7'} is completed
Oct 14 10:08:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:31.810 287366 INFO neutron.agent.linux.ip_lib [None req-d5e35404-d773-459e-9f18-a4c66c1d7468 - - - - - -] Device tap255d9f24-3b cannot be used as it has no MAC address
Oct 14 10:08:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:31.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:31 np0005486759.ooo.test kernel: device tap255d9f24-3b entered promiscuous mode
Oct 14 10:08:31 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:31Z|00275|binding|INFO|Claiming lport 255d9f24-3b8b-4c0a-8228-d6be76107b6d for this chassis.
Oct 14 10:08:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:31.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:31 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:31Z|00276|binding|INFO|255d9f24-3b8b-4c0a-8228-d6be76107b6d: Claiming unknown
Oct 14 10:08:31 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436511.8424] manager: (tap255d9f24-3b): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Oct 14 10:08:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:31.854 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:3::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-989cbfc7-48c7-4708-bac3-6200cb04bde0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-989cbfc7-48c7-4708-bac3-6200cb04bde0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd795491f5c1d4cb0bdc37f8eea30c3f8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54f718c9-a0ce-49b8-88a9-554b948c4cf2, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=255d9f24-3b8b-4c0a-8228-d6be76107b6d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:08:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:31.857 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 255d9f24-3b8b-4c0a-8228-d6be76107b6d in datapath 989cbfc7-48c7-4708-bac3-6200cb04bde0 bound to our chassis
Oct 14 10:08:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:31.860 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 989cbfc7-48c7-4708-bac3-6200cb04bde0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:08:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:31.861 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[c4fad3cb-210d-4f7c-abd1-b3c5b0e92144]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:08:31 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:31Z|00277|binding|INFO|Setting lport 255d9f24-3b8b-4c0a-8228-d6be76107b6d ovn-installed in OVS
Oct 14 10:08:31 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:31Z|00278|binding|INFO|Setting lport 255d9f24-3b8b-4c0a-8228-d6be76107b6d up in Southbound
Oct 14 10:08:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:31.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:31.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:31.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:32.022 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:08:31Z, description=, device_id=628b83d9-7f37-41d8-8b6f-1a68b4b716a5, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec558430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec558820>], id=ac92494d-fcda-458f-acbb-521978800bfc, ip_allocation=immediate, mac_address=fa:16:3e:61:7d:63, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T08:44:39Z, description=, dns_domain=, id=a6b02595-ce43-43c7-aca8-531937571464, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=8bf64e81a4214f9490d231a2e79ab3d8, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=15, status=ACTIVE, subnets=['bf2bf716-b9bb-4d7b-b054-64a60b3187c6'], tags=[], tenant_id=8bf64e81a4214f9490d231a2e79ab3d8, updated_at=2025-10-14T08:44:45Z, vlan_transparent=None, network_id=a6b02595-ce43-43c7-aca8-531937571464, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2168, status=DOWN, tags=[], tenant_id=, updated_at=2025-10-14T10:08:31Z on network a6b02595-ce43-43c7-aca8-531937571464
Oct 14 10:08:32 np0005486759.ooo.test podman[325807]: 2025-10-14 10:08:32.230803548 +0000 UTC m=+0.056188732 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 3 addresses
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:08:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:32.672 287366 INFO neutron.agent.dhcp.agent [None req-aa13faac-8cba-4509-b9f3-d28f1dee8923 - - - - - -] DHCP configuration for ports {'ac92494d-fcda-458f-acbb-521978800bfc'} is completed
Oct 14 10:08:32 np0005486759.ooo.test podman[325865]: 2025-10-14 10:08:32.774254266 +0000 UTC m=+0.087077779 container create 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:08:32 np0005486759.ooo.test systemd[1]: Started libpod-conmon-0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4.scope.
Oct 14 10:08:32 np0005486759.ooo.test podman[325865]: 2025-10-14 10:08:32.73195096 +0000 UTC m=+0.044806004 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:08:32 np0005486759.ooo.test systemd[1]: tmp-crun.WmCiOM.mount: Deactivated successfully.
Oct 14 10:08:32 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:08:32 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58a5378a30e0304ced41dd9a6ede44c4e94acbba43aef9937cc30eaa194f3dd1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:08:32 np0005486759.ooo.test podman[325865]: 2025-10-14 10:08:32.860345343 +0000 UTC m=+0.173168866 container init 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 14 10:08:32 np0005486759.ooo.test podman[325865]: 2025-10-14 10:08:32.870897526 +0000 UTC m=+0.183721049 container start 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq[325883]: started, version 2.85 cachesize 150
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq[325883]: DNS service limited to local subnets
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq[325883]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq[325883]: warning: no upstream servers configured
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq-dhcp[325883]: DHCPv6, static leases only on 2001:db8:3::, lease time 1d
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/addn_hosts - 0 addresses
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/host
Oct 14 10:08:32 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/opts
Oct 14 10:08:32 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:32.935 287366 INFO neutron.agent.dhcp.agent [None req-d5e35404-d773-459e-9f18-a4c66c1d7468 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:08:31Z, description=, device_id=3c35c8a4-896b-448f-a468-8cf2b1447a37, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec55d040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec558f10>], id=837f8c17-86e1-4fe5-85f3-bf9ad83a49bb, ip_allocation=immediate, mac_address=fa:16:3e:17:f5:90, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:08:29Z, description=, dns_domain=, id=989cbfc7-48c7-4708-bac3-6200cb04bde0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-557892787, port_security_enabled=True, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39966, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2161, status=ACTIVE, subnets=['66f3f665-370e-4aff-8dcb-c919917a0c78'], tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:08:30Z, vlan_transparent=None, network_id=989cbfc7-48c7-4708-bac3-6200cb04bde0, port_security_enabled=False, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2166, status=DOWN, tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:08:31Z on network 989cbfc7-48c7-4708-bac3-6200cb04bde0
Oct 14 10:08:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:33.028 287366 INFO neutron.agent.dhcp.agent [None req-c077092b-0dad-499b-afb0-21c5bc81538e - - - - - -] DHCP configuration for ports {'e42b74af-6f3c-4c98-8ee0-9f38fc385739'} is completed
Oct 14 10:08:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:33.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:33 np0005486759.ooo.test dnsmasq[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/addn_hosts - 1 addresses
Oct 14 10:08:33 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/host
Oct 14 10:08:33 np0005486759.ooo.test podman[325899]: 2025-10-14 10:08:33.126674072 +0000 UTC m=+0.060039971 container kill 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:08:33 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/opts
Oct 14 10:08:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:33.280 287366 INFO neutron.agent.dhcp.agent [None req-d5e35404-d773-459e-9f18-a4c66c1d7468 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:08:31Z, description=, device_id=3c35c8a4-896b-448f-a468-8cf2b1447a37, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5eceb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5ec5e0>], id=837f8c17-86e1-4fe5-85f3-bf9ad83a49bb, ip_allocation=immediate, mac_address=fa:16:3e:17:f5:90, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:08:29Z, description=, dns_domain=, id=989cbfc7-48c7-4708-bac3-6200cb04bde0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-557892787, port_security_enabled=True, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39966, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2161, status=ACTIVE, subnets=['66f3f665-370e-4aff-8dcb-c919917a0c78'], tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:08:30Z, vlan_transparent=None, network_id=989cbfc7-48c7-4708-bac3-6200cb04bde0, port_security_enabled=False, project_id=d795491f5c1d4cb0bdc37f8eea30c3f8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2166, status=DOWN, tags=[], tenant_id=d795491f5c1d4cb0bdc37f8eea30c3f8, updated_at=2025-10-14T10:08:31Z on network 989cbfc7-48c7-4708-bac3-6200cb04bde0
Oct 14 10:08:33 np0005486759.ooo.test podman[325939]: 2025-10-14 10:08:33.426780335 +0000 UTC m=+0.049251410 container kill 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:08:33 np0005486759.ooo.test dnsmasq[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/addn_hosts - 1 addresses
Oct 14 10:08:33 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/host
Oct 14 10:08:33 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/opts
Oct 14 10:08:33 np0005486759.ooo.test systemd[1]: tmp-crun.zRgFC6.mount: Deactivated successfully.
Oct 14 10:08:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:33.683 287366 INFO neutron.agent.dhcp.agent [None req-a7c2ad70-412e-4d6a-ab17-51f4e2f818ef - - - - - -] DHCP configuration for ports {'837f8c17-86e1-4fe5-85f3-bf9ad83a49bb'} is completed
Oct 14 10:08:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:33.825 287366 INFO neutron.agent.dhcp.agent [None req-8023659a-b8b6-420b-9262-cea71c89ba6d - - - - - -] DHCP configuration for ports {'837f8c17-86e1-4fe5-85f3-bf9ad83a49bb'} is completed
Oct 14 10:08:34 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:34.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:35.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:36 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:36.730 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:08:35Z, description=, device_id=628b83d9-7f37-41d8-8b6f-1a68b4b716a5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec721dc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7219a0>], id=785c3fd1-8b73-4e3d-8ab5-a0114709de40, ip_allocation=immediate, mac_address=fa:16:3e:ef:65:b4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:08:28Z, description=, dns_domain=, id=eef03828-48cc-412d-af38-8078518b7ee6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1692315326, port_security_enabled=True, project_id=604e5517ad91430ebd9bac26d0e454e9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3886, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2156, status=ACTIVE, subnets=['d5252669-ce20-46c8-b2b6-b9449ca9ba0b'], tags=[], tenant_id=604e5517ad91430ebd9bac26d0e454e9, updated_at=2025-10-14T10:08:29Z, vlan_transparent=None, network_id=eef03828-48cc-412d-af38-8078518b7ee6, port_security_enabled=False, project_id=604e5517ad91430ebd9bac26d0e454e9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2169, status=DOWN, tags=[], tenant_id=604e5517ad91430ebd9bac26d0e454e9, updated_at=2025-10-14T10:08:35Z on network eef03828-48cc-412d-af38-8078518b7ee6
Oct 14 10:08:36 np0005486759.ooo.test podman[325974]: 2025-10-14 10:08:36.932475936 +0000 UTC m=+0.061642919 container kill ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:08:36 np0005486759.ooo.test dnsmasq[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/addn_hosts - 1 addresses
Oct 14 10:08:36 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/host
Oct 14 10:08:36 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/opts
Oct 14 10:08:37 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:37.186 287366 INFO neutron.agent.dhcp.agent [None req-903d8e2d-c929-4bd1-a9c8-65541ed5f0ce - - - - - -] DHCP configuration for ports {'785c3fd1-8b73-4e3d-8ab5-a0114709de40'} is completed
Oct 14 10:08:37 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:37.837 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:08:35Z, description=, device_id=628b83d9-7f37-41d8-8b6f-1a68b4b716a5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec527c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ed013cd0>], id=785c3fd1-8b73-4e3d-8ab5-a0114709de40, ip_allocation=immediate, mac_address=fa:16:3e:ef:65:b4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:08:28Z, description=, dns_domain=, id=eef03828-48cc-412d-af38-8078518b7ee6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1692315326, port_security_enabled=True, project_id=604e5517ad91430ebd9bac26d0e454e9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3886, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2156, status=ACTIVE, subnets=['d5252669-ce20-46c8-b2b6-b9449ca9ba0b'], tags=[], tenant_id=604e5517ad91430ebd9bac26d0e454e9, updated_at=2025-10-14T10:08:29Z, vlan_transparent=None, network_id=eef03828-48cc-412d-af38-8078518b7ee6, port_security_enabled=False, project_id=604e5517ad91430ebd9bac26d0e454e9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2169, status=DOWN, tags=[], tenant_id=604e5517ad91430ebd9bac26d0e454e9, updated_at=2025-10-14T10:08:35Z on network eef03828-48cc-412d-af38-8078518b7ee6
Oct 14 10:08:37 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:37.948 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:08:38 np0005486759.ooo.test dnsmasq[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/addn_hosts - 0 addresses
Oct 14 10:08:38 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/host
Oct 14 10:08:38 np0005486759.ooo.test dnsmasq-dhcp[325883]: read /var/lib/neutron/dhcp/989cbfc7-48c7-4708-bac3-6200cb04bde0/opts
Oct 14 10:08:38 np0005486759.ooo.test podman[326025]: 2025-10-14 10:08:38.00427937 +0000 UTC m=+0.066704275 container kill 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, org.label-schema.build-date=20251009, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 14 10:08:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:08:38 np0005486759.ooo.test dnsmasq[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/addn_hosts - 1 addresses
Oct 14 10:08:38 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/host
Oct 14 10:08:38 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/opts
Oct 14 10:08:38 np0005486759.ooo.test podman[326039]: 2025-10-14 10:08:38.06599807 +0000 UTC m=+0.074879385 container kill ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:08:38 np0005486759.ooo.test podman[326051]: 2025-10-14 10:08:38.115245928 +0000 UTC m=+0.082077225 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 10:08:38 np0005486759.ooo.test podman[326051]: 2025-10-14 10:08:38.147489206 +0000 UTC m=+0.114320533 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:08:38 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:08:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:38Z|00279|binding|INFO|Releasing lport 255d9f24-3b8b-4c0a-8228-d6be76107b6d from this chassis (sb_readonly=0)
Oct 14 10:08:38 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:38Z|00280|binding|INFO|Setting lport 255d9f24-3b8b-4c0a-8228-d6be76107b6d down in Southbound
Oct 14 10:08:38 np0005486759.ooo.test kernel: device tap255d9f24-3b left promiscuous mode
Oct 14 10:08:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:38.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:38.266 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:3::2/64', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-989cbfc7-48c7-4708-bac3-6200cb04bde0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-989cbfc7-48c7-4708-bac3-6200cb04bde0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd795491f5c1d4cb0bdc37f8eea30c3f8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54f718c9-a0ce-49b8-88a9-554b948c4cf2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=255d9f24-3b8b-4c0a-8228-d6be76107b6d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:08:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:38.268 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 255d9f24-3b8b-4c0a-8228-d6be76107b6d in datapath 989cbfc7-48c7-4708-bac3-6200cb04bde0 unbound from our chassis
Oct 14 10:08:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:38.269 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 989cbfc7-48c7-4708-bac3-6200cb04bde0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:08:38 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:38.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:38.274 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[bdca9a42-2780-4605-b8b8-654afedaff40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:08:38 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:38.665 287366 INFO neutron.agent.dhcp.agent [None req-3af16ce6-1337-40d8-8e7e-2970e80105b1 - - - - - -] DHCP configuration for ports {'785c3fd1-8b73-4e3d-8ab5-a0114709de40'} is completed
Oct 14 10:08:38 np0005486759.ooo.test dnsmasq[325883]: exiting on receipt of SIGTERM
Oct 14 10:08:38 np0005486759.ooo.test systemd[1]: libpod-0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4.scope: Deactivated successfully.
Oct 14 10:08:38 np0005486759.ooo.test podman[326106]: 2025-10-14 10:08:38.897573114 +0000 UTC m=+0.065038253 container kill 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 10:08:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:08:38 np0005486759.ooo.test podman[326119]: 2025-10-14 10:08:38.970930031 +0000 UTC m=+0.063853097 container died 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:08:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-58a5378a30e0304ced41dd9a6ede44c4e94acbba43aef9937cc30eaa194f3dd1-merged.mount: Deactivated successfully.
Oct 14 10:08:39 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4-userdata-shm.mount: Deactivated successfully.
Oct 14 10:08:39 np0005486759.ooo.test podman[326119]: 2025-10-14 10:08:39.00905364 +0000 UTC m=+0.101976646 container cleanup 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009)
Oct 14 10:08:39 np0005486759.ooo.test systemd[1]: libpod-conmon-0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4.scope: Deactivated successfully.
Oct 14 10:08:39 np0005486759.ooo.test podman[326127]: 2025-10-14 10:08:39.060056502 +0000 UTC m=+0.132255192 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:08:39 np0005486759.ooo.test podman[326126]: 2025-10-14 10:08:39.093028262 +0000 UTC m=+0.168348159 container remove 0d815742ae8fc67fcc281f4abce23a20f844d99c222380cfb1ab3fed0e1368f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-989cbfc7-48c7-4708-bac3-6200cb04bde0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:08:39 np0005486759.ooo.test podman[326127]: 2025-10-14 10:08:39.096613722 +0000 UTC m=+0.168812392 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 10:08:39 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:08:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:39.325 287366 INFO neutron.agent.dhcp.agent [None req-c0e4e34c-29d1-4678-bb82-bee3b7265a86 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:39 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d989cbfc7\x2d48c7\x2d4708\x2dbac3\x2d6200cb04bde0.mount: Deactivated successfully.
Oct 14 10:08:39 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:39.395 2 INFO neutron.agent.securitygroups_rpc [None req-c9875360-1d9d-4abf-8b67-e3aea6e3bb63 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:39.408 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:39.433 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:08:38Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5d3b80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec5d30a0>], id=f5ddb7b6-458c-4987-9fcd-657e4afd1822, ip_allocation=immediate, mac_address=fa:16:3e:ed:3e:58, name=tempest-FloatingIPTestJSON-619546117, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:08:28Z, description=, dns_domain=, id=eef03828-48cc-412d-af38-8078518b7ee6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1692315326, port_security_enabled=True, project_id=604e5517ad91430ebd9bac26d0e454e9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3886, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2156, status=ACTIVE, subnets=['d5252669-ce20-46c8-b2b6-b9449ca9ba0b'], tags=[], tenant_id=604e5517ad91430ebd9bac26d0e454e9, updated_at=2025-10-14T10:08:29Z, vlan_transparent=None, network_id=eef03828-48cc-412d-af38-8078518b7ee6, port_security_enabled=True, project_id=604e5517ad91430ebd9bac26d0e454e9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['74838a88-8a21-49f4-8af4-233e44b05e8e'], standard_attr_id=2170, status=DOWN, tags=[], tenant_id=604e5517ad91430ebd9bac26d0e454e9, updated_at=2025-10-14T10:08:38Z on network eef03828-48cc-412d-af38-8078518b7ee6
Oct 14 10:08:39 np0005486759.ooo.test dnsmasq[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/addn_hosts - 2 addresses
Oct 14 10:08:39 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/host
Oct 14 10:08:39 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/opts
Oct 14 10:08:39 np0005486759.ooo.test podman[326189]: 2025-10-14 10:08:39.626095351 +0000 UTC m=+0.056971527 container kill ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:08:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:39.674 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:39 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:39.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:39 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:39.911 287366 INFO neutron.agent.dhcp.agent [None req-a1a55453-a54b-4d63-8ca4-cc6aaab0b76c - - - - - -] DHCP configuration for ports {'f5ddb7b6-458c-4987-9fcd-657e4afd1822'} is completed
Oct 14 10:08:39 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:39Z|00281|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:08:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:40.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:41.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:41 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:41.407 2 INFO neutron.agent.securitygroups_rpc [None req-3478cfc9-04fb-4bfc-b23c-992d1c8136f5 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:41 np0005486759.ooo.test dnsmasq[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/addn_hosts - 1 addresses
Oct 14 10:08:41 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/host
Oct 14 10:08:41 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/opts
Oct 14 10:08:41 np0005486759.ooo.test podman[326227]: 2025-10-14 10:08:41.643848242 +0000 UTC m=+0.048200927 container kill ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 14 10:08:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:08:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:08:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:08:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 132497 "" "Go-http-client/1.1"
Oct 14 10:08:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:08:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17178 "" "Go-http-client/1.1"
Oct 14 10:08:42 np0005486759.ooo.test dnsmasq[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/addn_hosts - 0 addresses
Oct 14 10:08:42 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/host
Oct 14 10:08:42 np0005486759.ooo.test dnsmasq-dhcp[325764]: read /var/lib/neutron/dhcp/eef03828-48cc-412d-af38-8078518b7ee6/opts
Oct 14 10:08:42 np0005486759.ooo.test podman[326265]: 2025-10-14 10:08:42.556077597 +0000 UTC m=+0.054548612 container kill ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:08:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:42Z|00282|binding|INFO|Releasing lport 3e789bb4-fad8-4834-9352-cd3395541c1d from this chassis (sb_readonly=0)
Oct 14 10:08:42 np0005486759.ooo.test kernel: device tap3e789bb4-fa left promiscuous mode
Oct 14 10:08:42 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:42Z|00283|binding|INFO|Setting lport 3e789bb4-fad8-4834-9352-cd3395541c1d down in Southbound
Oct 14 10:08:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:42.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:42.924 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-eef03828-48cc-412d-af38-8078518b7ee6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eef03828-48cc-412d-af38-8078518b7ee6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '604e5517ad91430ebd9bac26d0e454e9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc27da97-986f-4f92-b9e7-2b42a83da0bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=3e789bb4-fad8-4834-9352-cd3395541c1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:08:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:42.926 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 3e789bb4-fad8-4834-9352-cd3395541c1d in datapath eef03828-48cc-412d-af38-8078518b7ee6 unbound from our chassis
Oct 14 10:08:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:42.929 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eef03828-48cc-412d-af38-8078518b7ee6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:08:42 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:42.930 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[f654bd28-caad-47ab-b3be-8f4cf0537462]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:08:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:42.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:08:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:08:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:08:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:08:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:08:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:08:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:08:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:08:44 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 2 addresses
Oct 14 10:08:44 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:08:44 np0005486759.ooo.test podman[326306]: 2025-10-14 10:08:44.484912873 +0000 UTC m=+0.064054083 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3)
Oct 14 10:08:44 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:08:44 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:44.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:45 np0005486759.ooo.test dnsmasq[325764]: exiting on receipt of SIGTERM
Oct 14 10:08:45 np0005486759.ooo.test podman[326343]: 2025-10-14 10:08:45.014053843 +0000 UTC m=+0.059040110 container kill ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:08:45 np0005486759.ooo.test systemd[1]: libpod-ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab.scope: Deactivated successfully.
Oct 14 10:08:45 np0005486759.ooo.test podman[326356]: 2025-10-14 10:08:45.085508982 +0000 UTC m=+0.056467981 container died ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:08:45 np0005486759.ooo.test podman[326356]: 2025-10-14 10:08:45.115510501 +0000 UTC m=+0.086469470 container cleanup ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:08:45 np0005486759.ooo.test systemd[1]: libpod-conmon-ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab.scope: Deactivated successfully.
Oct 14 10:08:45 np0005486759.ooo.test podman[326358]: 2025-10-14 10:08:45.156921679 +0000 UTC m=+0.119698698 container remove ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eef03828-48cc-412d-af38-8078518b7ee6, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 10:08:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:45.179 287366 INFO neutron.agent.dhcp.agent [None req-8b6e6d04-1ea3-4e40-9d29-6b94272a061d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:45 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:45.354 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-66ed934d30b89f2c1d36eaa96b18146a2faf1babafc352393b4d10072db633a9-merged.mount: Deactivated successfully.
Oct 14 10:08:45 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ddad24c014742fd708ff012d3c622d2820b391d5a6866aa8b71ff0e2e79f0eab-userdata-shm.mount: Deactivated successfully.
Oct 14 10:08:45 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2deef03828\x2d48cc\x2d412d\x2daf38\x2d8078518b7ee6.mount: Deactivated successfully.
Oct 14 10:08:45 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:08:45Z|00284|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:08:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:45.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:46.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:46 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:46.972 2 INFO neutron.agent.securitygroups_rpc [None req-30e0460f-8395-43e3-9e6d-f6a42cbabf08 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:47 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:47.017 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:47 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:08:47.579 2 INFO neutron.agent.securitygroups_rpc [None req-f6564b7b-d008-435b-bdb7-6d38da49fdf7 1c8e36bc90aa4c14a11ed68a253e61aa 604e5517ad91430ebd9bac26d0e454e9 - - default default] Security group member updated ['74838a88-8a21-49f4-8af4-233e44b05e8e']
Oct 14 10:08:47 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:47.677 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:48 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:48.700 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:49 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:49.214 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61252 DF PROTO=TCP SPT=48934 DPT=9102 SEQ=4144917613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA867580000000001030307) 
Oct 14 10:08:49 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:49.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:50 np0005486759.ooo.test dnsmasq[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/addn_hosts - 1 addresses
Oct 14 10:08:50 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/host
Oct 14 10:08:50 np0005486759.ooo.test podman[326404]: 2025-10-14 10:08:50.427878557 +0000 UTC m=+0.061972119 container kill 931dc8ffb626d483efeb22132f9911402758342d1233af0a97f0e4cdcf47e507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a6b02595-ce43-43c7-aca8-531937571464, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:08:50 np0005486759.ooo.test dnsmasq-dhcp[317807]: read /var/lib/neutron/dhcp/a6b02595-ce43-43c7-aca8-531937571464/opts
Oct 14 10:08:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61253 DF PROTO=TCP SPT=48934 DPT=9102 SEQ=4144917613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA86B410000000001030307) 
Oct 14 10:08:51 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:51.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:08:51 np0005486759.ooo.test podman[326427]: 2025-10-14 10:08:51.470715323 +0000 UTC m=+0.094145896 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid)
Oct 14 10:08:51 np0005486759.ooo.test podman[326427]: 2025-10-14 10:08:51.50683832 +0000 UTC m=+0.130268903 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_managed=true, container_name=iscsid)
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:08:51 np0005486759.ooo.test podman[326428]: 2025-10-14 10:08:51.519764095 +0000 UTC m=+0.137956417 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Oct 14 10:08:51 np0005486759.ooo.test podman[326429]: 2025-10-14 10:08:51.580794685 +0000 UTC m=+0.194735586 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct 14 10:08:51 np0005486759.ooo.test podman[326429]: 2025-10-14 10:08:51.617695085 +0000 UTC m=+0.231635946 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:08:51 np0005486759.ooo.test podman[326426]: 2025-10-14 10:08:51.637660037 +0000 UTC m=+0.262063029 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:08:51 np0005486759.ooo.test podman[326428]: 2025-10-14 10:08:51.656543495 +0000 UTC m=+0.274735817 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 10:08:51 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:51.667 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:51 np0005486759.ooo.test podman[326426]: 2025-10-14 10:08:51.670058699 +0000 UTC m=+0.294461631 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:08:51 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:08:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61254 DF PROTO=TCP SPT=48934 DPT=9102 SEQ=4144917613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA873410000000001030307) 
Oct 14 10:08:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:54.174 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:08:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:54.175 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:08:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:08:54.176 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:08:54 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:54.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:56.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:08:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61255 DF PROTO=TCP SPT=48934 DPT=9102 SEQ=4144917613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA883010000000001030307) 
Oct 14 10:08:58 np0005486759.ooo.test sshd[326504]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:08:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:08:59 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:08:59 np0005486759.ooo.test podman[326507]: 2025-10-14 10:08:59.455752511 +0000 UTC m=+0.078351801 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Oct 14 10:08:59 np0005486759.ooo.test podman[326507]: 2025-10-14 10:08:59.467522952 +0000 UTC m=+0.090122282 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Oct 14 10:08:59 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:08:59 np0005486759.ooo.test sshd[326504]: Invalid user support from 78.128.112.74 port 47112
Oct 14 10:08:59 np0005486759.ooo.test podman[326506]: 2025-10-14 10:08:59.560285454 +0000 UTC m=+0.186485024 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:08:59 np0005486759.ooo.test podman[326506]: 2025-10-14 10:08:59.604327913 +0000 UTC m=+0.230527523 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:08:59 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:08:59 np0005486759.ooo.test sshd[326504]: pam_unix(sshd:auth): check pass; user unknown
Oct 14 10:08:59 np0005486759.ooo.test sshd[326504]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=78.128.112.74
Oct 14 10:08:59 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:08:59.920 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:08:59 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:08:59.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:01.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:01 np0005486759.ooo.test sshd[326504]: Failed password for invalid user support from 78.128.112.74 port 47112 ssh2
Oct 14 10:09:03 np0005486759.ooo.test sshd[326504]: Connection closed by invalid user support 78.128.112.74 port 47112 [preauth]
Oct 14 10:09:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:04.021 287366 INFO neutron.agent.linux.ip_lib [None req-5667f728-6385-4170-b1f5-141fe80c7fad - - - - - -] Device tap4f442c0d-dd cannot be used as it has no MAC address
Oct 14 10:09:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:04.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:04 np0005486759.ooo.test kernel: device tap4f442c0d-dd entered promiscuous mode
Oct 14 10:09:04 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:04Z|00285|binding|INFO|Claiming lport 4f442c0d-dd52-47cb-9b29-137b62aecee5 for this chassis.
Oct 14 10:09:04 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:04Z|00286|binding|INFO|4f442c0d-dd52-47cb-9b29-137b62aecee5: Claiming unknown
Oct 14 10:09:04 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436544.0530] manager: (tap4f442c0d-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Oct 14 10:09:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:04.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:04 np0005486759.ooo.test systemd-udevd[326560]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:09:04 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:04.067 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-98662246-a563-437e-9cd9-f826a393f21f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98662246-a563-437e-9cd9-f826a393f21f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f43a78a6bec443f087d13390d1f3b99c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=719f2b4d-fb55-4c43-81dc-7387beea6195, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=4f442c0d-dd52-47cb-9b29-137b62aecee5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:04 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:04.069 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 4f442c0d-dd52-47cb-9b29-137b62aecee5 in datapath 98662246-a563-437e-9cd9-f826a393f21f bound to our chassis
Oct 14 10:09:04 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:04.072 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 98662246-a563-437e-9cd9-f826a393f21f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:09:04 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:04.073 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[653b7a36-3f64-4e6f-9341-be1ca8a975e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:04Z|00287|binding|INFO|Setting lport 4f442c0d-dd52-47cb-9b29-137b62aecee5 ovn-installed in OVS
Oct 14 10:09:04 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:04Z|00288|binding|INFO|Setting lport 4f442c0d-dd52-47cb-9b29-137b62aecee5 up in Southbound
Oct 14 10:09:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:04.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap4f442c0d-dd: No such device
Oct 14 10:09:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:04.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:04.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:05.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:05 np0005486759.ooo.test podman[326631]: 
Oct 14 10:09:05 np0005486759.ooo.test podman[326631]: 2025-10-14 10:09:05.120351867 +0000 UTC m=+0.069813300 container create ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:05 np0005486759.ooo.test systemd[1]: Started libpod-conmon-ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981.scope.
Oct 14 10:09:05 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:09:05 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7af4af58be655bbf1943d98b8ccab93e680dbd7175d293b4102f683288ea293/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:09:05 np0005486759.ooo.test podman[326631]: 2025-10-14 10:09:05.079736293 +0000 UTC m=+0.029197706 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:09:05 np0005486759.ooo.test podman[326631]: 2025-10-14 10:09:05.182997517 +0000 UTC m=+0.132458930 container init ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 10:09:05 np0005486759.ooo.test podman[326631]: 2025-10-14 10:09:05.191427605 +0000 UTC m=+0.140889018 container start ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq[326649]: started, version 2.85 cachesize 150
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq[326649]: DNS service limited to local subnets
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq[326649]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq[326649]: warning: no upstream servers configured
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq-dhcp[326649]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/addn_hosts - 0 addresses
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/host
Oct 14 10:09:05 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/opts
Oct 14 10:09:05 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:05.329 287366 INFO neutron.agent.dhcp.agent [None req-fccf0832-a454-491a-b2e4-9bc5024819bc - - - - - -] DHCP configuration for ports {'0ab1349e-501a-43b5-9634-0c964975297d'} is completed
Oct 14 10:09:06 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:06.090 2 INFO neutron.agent.securitygroups_rpc [None req-e7a549f8-83c6-4343-9d6e-001ad62295ec 75fdb37d87144005814d129cf716923a f1d3b46bcd944371bd0ec01deccc2b4d - - default default] Security group member updated ['200589df-e7f0-4862-9086-23091c905acc']
Oct 14 10:09:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:06.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:06 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:06.306 2 INFO neutron.agent.securitygroups_rpc [None req-e7a549f8-83c6-4343-9d6e-001ad62295ec 75fdb37d87144005814d129cf716923a f1d3b46bcd944371bd0ec01deccc2b4d - - default default] Security group member updated ['200589df-e7f0-4862-9086-23091c905acc']
Oct 14 10:09:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:06.563 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:09:06Z, description=, device_id=e1d875ea-ec2c-4e30-ae69-e72e7af8b17e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec7482b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec748610>], id=ed18de4a-ca9e-435b-9685-97f0eb88b8cb, ip_allocation=immediate, mac_address=fa:16:3e:d3:ea:54, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:08:58Z, description=, dns_domain=, id=98662246-a563-437e-9cd9-f826a393f21f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1057324847, port_security_enabled=True, project_id=f43a78a6bec443f087d13390d1f3b99c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18877, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2193, status=ACTIVE, subnets=['6cce93c3-0ab8-4da4-a96b-37431f37538c'], tags=[], tenant_id=f43a78a6bec443f087d13390d1f3b99c, updated_at=2025-10-14T10:09:01Z, vlan_transparent=None, network_id=98662246-a563-437e-9cd9-f826a393f21f, port_security_enabled=False, project_id=f43a78a6bec443f087d13390d1f3b99c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2214, status=DOWN, tags=[], tenant_id=f43a78a6bec443f087d13390d1f3b99c, updated_at=2025-10-14T10:09:06Z on network 98662246-a563-437e-9cd9-f826a393f21f
Oct 14 10:09:06 np0005486759.ooo.test dnsmasq[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/addn_hosts - 1 addresses
Oct 14 10:09:06 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/host
Oct 14 10:09:06 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/opts
Oct 14 10:09:06 np0005486759.ooo.test podman[326667]: 2025-10-14 10:09:06.759035436 +0000 UTC m=+0.050878420 container kill ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:09:06 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:06.869 2 INFO neutron.agent.securitygroups_rpc [None req-a164162d-a2a7-489a-880c-252d52a53327 75fdb37d87144005814d129cf716923a f1d3b46bcd944371bd0ec01deccc2b4d - - default default] Security group member updated ['200589df-e7f0-4862-9086-23091c905acc']
Oct 14 10:09:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:06.895 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:06 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:06.971 287366 INFO neutron.agent.dhcp.agent [None req-a0096303-7d68-47bc-88c3-4b9669f51c28 - - - - - -] DHCP configuration for ports {'ed18de4a-ca9e-435b-9685-97f0eb88b8cb'} is completed
Oct 14 10:09:07 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:07.674 2 INFO neutron.agent.securitygroups_rpc [None req-c4c6615d-ff13-4952-9e09-78f68cbbda4c 75fdb37d87144005814d129cf716923a f1d3b46bcd944371bd0ec01deccc2b4d - - default default] Security group member updated ['200589df-e7f0-4862-9086-23091c905acc']
Oct 14 10:09:07 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:07.941 287366 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:09:06Z, description=, device_id=e1d875ea-ec2c-4e30-ae69-e72e7af8b17e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec748df0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec748520>], id=ed18de4a-ca9e-435b-9685-97f0eb88b8cb, ip_allocation=immediate, mac_address=fa:16:3e:d3:ea:54, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:08:58Z, description=, dns_domain=, id=98662246-a563-437e-9cd9-f826a393f21f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1057324847, port_security_enabled=True, project_id=f43a78a6bec443f087d13390d1f3b99c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18877, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2193, status=ACTIVE, subnets=['6cce93c3-0ab8-4da4-a96b-37431f37538c'], tags=[], tenant_id=f43a78a6bec443f087d13390d1f3b99c, updated_at=2025-10-14T10:09:01Z, vlan_transparent=None, network_id=98662246-a563-437e-9cd9-f826a393f21f, port_security_enabled=False, project_id=f43a78a6bec443f087d13390d1f3b99c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2214, status=DOWN, tags=[], tenant_id=f43a78a6bec443f087d13390d1f3b99c, updated_at=2025-10-14T10:09:06Z on network 98662246-a563-437e-9cd9-f826a393f21f
Oct 14 10:09:08 np0005486759.ooo.test dnsmasq[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/addn_hosts - 1 addresses
Oct 14 10:09:08 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/host
Oct 14 10:09:08 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/opts
Oct 14 10:09:08 np0005486759.ooo.test podman[326703]: 2025-10-14 10:09:08.166512382 +0000 UTC m=+0.066456607 container kill ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:09:08 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:09:08 np0005486759.ooo.test podman[326717]: 2025-10-14 10:09:08.259666515 +0000 UTC m=+0.070547892 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:08 np0005486759.ooo.test podman[326717]: 2025-10-14 10:09:08.295419941 +0000 UTC m=+0.106301378 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:09:08 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:09:08 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:08.428 287366 INFO neutron.agent.dhcp.agent [None req-6e2b5181-fbd1-4680-90ec-17ee72eed372 - - - - - -] DHCP configuration for ports {'ed18de4a-ca9e-435b-9685-97f0eb88b8cb'} is completed
Oct 14 10:09:09 np0005486759.ooo.test dnsmasq[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/addn_hosts - 0 addresses
Oct 14 10:09:09 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/host
Oct 14 10:09:09 np0005486759.ooo.test dnsmasq-dhcp[326649]: read /var/lib/neutron/dhcp/98662246-a563-437e-9cd9-f826a393f21f/opts
Oct 14 10:09:09 np0005486759.ooo.test podman[326758]: 2025-10-14 10:09:09.340705981 +0000 UTC m=+0.069623844 container kill ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:09:09 np0005486759.ooo.test podman[326772]: 2025-10-14 10:09:09.456593801 +0000 UTC m=+0.085718386 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:09:09 np0005486759.ooo.test podman[326772]: 2025-10-14 10:09:09.466544626 +0000 UTC m=+0.095669261 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:09:09 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:09:09 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:09Z|00289|binding|INFO|Releasing lport 4f442c0d-dd52-47cb-9b29-137b62aecee5 from this chassis (sb_readonly=0)
Oct 14 10:09:09 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:09Z|00290|binding|INFO|Setting lport 4f442c0d-dd52-47cb-9b29-137b62aecee5 down in Southbound
Oct 14 10:09:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:09.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:09 np0005486759.ooo.test kernel: device tap4f442c0d-dd left promiscuous mode
Oct 14 10:09:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:09.581 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-98662246-a563-437e-9cd9-f826a393f21f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98662246-a563-437e-9cd9-f826a393f21f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f43a78a6bec443f087d13390d1f3b99c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=719f2b4d-fb55-4c43-81dc-7387beea6195, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=4f442c0d-dd52-47cb-9b29-137b62aecee5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:09.583 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 4f442c0d-dd52-47cb-9b29-137b62aecee5 in datapath 98662246-a563-437e-9cd9-f826a393f21f unbound from our chassis
Oct 14 10:09:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:09.586 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 98662246-a563-437e-9cd9-f826a393f21f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:09:09 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:09.586 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[ac48c0a2-a654-4a69-8047-45037036e3fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:09 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:09.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:10.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:11.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:09:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:09:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:09:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 132497 "" "Go-http-client/1.1"
Oct 14 10:09:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:09:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17172 "" "Go-http-client/1.1"
Oct 14 10:09:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:12.747 287366 INFO neutron.agent.linux.ip_lib [None req-0bc2aa1d-e655-462b-9b8c-d95de51190eb - - - - - -] Device tapce56951e-25 cannot be used as it has no MAC address
Oct 14 10:09:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:12.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:12 np0005486759.ooo.test kernel: device tapce56951e-25 entered promiscuous mode
Oct 14 10:09:12 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436552.7780] manager: (tapce56951e-25): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Oct 14 10:09:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:12Z|00291|binding|INFO|Claiming lport ce56951e-25c1-4a43-b8bf-6b9060be660b for this chassis.
Oct 14 10:09:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:12Z|00292|binding|INFO|ce56951e-25c1-4a43-b8bf-6b9060be660b: Claiming unknown
Oct 14 10:09:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:12.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:12 np0005486759.ooo.test systemd-udevd[326815]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:09:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:12.788 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-b19a8b59-9809-4802-b5c9-555476ac4412', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b19a8b59-9809-4802-b5c9-555476ac4412', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f43a78a6bec443f087d13390d1f3b99c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f902cb8b-695b-47f9-8573-4158b05da868, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=ce56951e-25c1-4a43-b8bf-6b9060be660b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:12.789 183328 INFO neutron.agent.ovn.metadata.agent [-] Port ce56951e-25c1-4a43-b8bf-6b9060be660b in datapath b19a8b59-9809-4802-b5c9-555476ac4412 bound to our chassis
Oct 14 10:09:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:12.791 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port f7ebe5c2-3163-4fc9-8a3e-ac1e163fe1f0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:09:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:12.791 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b19a8b59-9809-4802-b5c9-555476ac4412, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:09:12 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:12.792 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[94d540f6-68ca-4cbb-a2af-3d0e017e5b98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:12Z|00293|binding|INFO|Setting lport ce56951e-25c1-4a43-b8bf-6b9060be660b ovn-installed in OVS
Oct 14 10:09:12 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:12Z|00294|binding|INFO|Setting lport ce56951e-25c1-4a43-b8bf-6b9060be660b up in Southbound
Oct 14 10:09:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:12.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tapce56951e-25: No such device
Oct 14 10:09:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:12.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:12.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:09:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:09:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:09:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:09:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:09:14 np0005486759.ooo.test podman[326886]: 2025-10-14 10:09:14.123157644 +0000 UTC m=+0.096451895 container create 9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b19a8b59-9809-4802-b5c9-555476ac4412, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 14 10:09:14 np0005486759.ooo.test systemd[1]: Started libpod-conmon-9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad.scope.
Oct 14 10:09:14 np0005486759.ooo.test systemd[1]: tmp-crun.pJZlYo.mount: Deactivated successfully.
Oct 14 10:09:14 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:09:14 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38db427eb5e5dea697120a20c3ae8e22c0a09ff72840258c07ee6f0844cbac5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:09:14 np0005486759.ooo.test podman[326886]: 2025-10-14 10:09:14.08027699 +0000 UTC m=+0.053571251 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:09:14 np0005486759.ooo.test podman[326886]: 2025-10-14 10:09:14.189751734 +0000 UTC m=+0.163045975 container init 9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b19a8b59-9809-4802-b5c9-555476ac4412, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS)
Oct 14 10:09:14 np0005486759.ooo.test podman[326886]: 2025-10-14 10:09:14.198379488 +0000 UTC m=+0.171673709 container start 9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b19a8b59-9809-4802-b5c9-555476ac4412, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009)
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq[326905]: started, version 2.85 cachesize 150
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq[326905]: DNS service limited to local subnets
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq[326905]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq[326905]: warning: no upstream servers configured
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq-dhcp[326905]: DHCP, static leases only on 10.101.0.0, lease time 1d
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq[326905]: read /var/lib/neutron/dhcp/b19a8b59-9809-4802-b5c9-555476ac4412/addn_hosts - 0 addresses
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq-dhcp[326905]: read /var/lib/neutron/dhcp/b19a8b59-9809-4802-b5c9-555476ac4412/host
Oct 14 10:09:14 np0005486759.ooo.test dnsmasq-dhcp[326905]: read /var/lib/neutron/dhcp/b19a8b59-9809-4802-b5c9-555476ac4412/opts
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.260 287366 INFO neutron.agent.dhcp.agent [None req-34f92cee-79ec-4469-9eff-93703db8357d - - - - - -] Synchronizing state
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.340 287366 INFO neutron.agent.dhcp.agent [None req-017f9445-30b5-41c2-8ba6-17cb45ce8c39 - - - - - -] DHCP configuration for ports {'b026a4e7-feac-425c-9f00-c02e24311901'} is completed
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.437 287366 INFO neutron.agent.dhcp.agent [None req-9495d63f-5680-4fab-a408-4af0a08f6c7b - - - - - -] All active networks have been fetched through RPC.
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.439 287366 INFO neutron.agent.dhcp.agent [-] Starting network 4a93a103-f62f-46a1-883d-714ff21a4f0c dhcp configuration
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.439 287366 INFO neutron.agent.dhcp.agent [-] Finished network 4a93a103-f62f-46a1-883d-714ff21a4f0c dhcp configuration
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.440 287366 INFO neutron.agent.dhcp.agent [-] Starting network 9433a2c4-0afb-4c52-9ab4-914e26339b04 dhcp configuration
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.440 287366 INFO neutron.agent.dhcp.agent [-] Finished network 9433a2c4-0afb-4c52-9ab4-914e26339b04 dhcp configuration
Oct 14 10:09:14 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:14.441 287366 INFO neutron.agent.dhcp.agent [None req-9495d63f-5680-4fab-a408-4af0a08f6c7b - - - - - -] Synchronizing state complete
Oct 14 10:09:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:15.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:15.153 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:15 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:15.478 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:16.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:17 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:17.965 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:19.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41739 DF PROTO=TCP SPT=46482 DPT=9102 SEQ=438350672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA8DC880000000001030307) 
Oct 14 10:09:19 np0005486759.ooo.test dnsmasq[326905]: exiting on receipt of SIGTERM
Oct 14 10:09:19 np0005486759.ooo.test podman[326924]: 2025-10-14 10:09:19.907499518 +0000 UTC m=+0.046019481 container kill 9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b19a8b59-9809-4802-b5c9-555476ac4412, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:19 np0005486759.ooo.test systemd[1]: libpod-9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad.scope: Deactivated successfully.
Oct 14 10:09:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:19Z|00295|binding|INFO|Removing iface tapce56951e-25 ovn-installed in OVS
Oct 14 10:09:19 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:19.931 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f7ebe5c2-3163-4fc9-8a3e-ac1e163fe1f0 with type ""
Oct 14 10:09:19 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:19Z|00296|binding|INFO|Removing lport ce56951e-25c1-4a43-b8bf-6b9060be660b ovn-installed in OVS
Oct 14 10:09:19 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:19.933 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-b19a8b59-9809-4802-b5c9-555476ac4412', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b19a8b59-9809-4802-b5c9-555476ac4412', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f43a78a6bec443f087d13390d1f3b99c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f902cb8b-695b-47f9-8573-4158b05da868, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=ce56951e-25c1-4a43-b8bf-6b9060be660b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:19 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:19.935 183328 INFO neutron.agent.ovn.metadata.agent [-] Port ce56951e-25c1-4a43-b8bf-6b9060be660b in datapath b19a8b59-9809-4802-b5c9-555476ac4412 unbound from our chassis
Oct 14 10:09:19 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:19.939 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b19a8b59-9809-4802-b5c9-555476ac4412, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:09:19 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:19.940 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[c80c22aa-aa20-4b6c-a718-c4c1dd17a8de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:19.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:19 np0005486759.ooo.test podman[326938]: 2025-10-14 10:09:19.997445614 +0000 UTC m=+0.071331517 container died 9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b19a8b59-9809-4802-b5c9-555476ac4412, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:20 np0005486759.ooo.test systemd[1]: tmp-crun.brlgcn.mount: Deactivated successfully.
Oct 14 10:09:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad-userdata-shm.mount: Deactivated successfully.
Oct 14 10:09:20 np0005486759.ooo.test podman[326938]: 2025-10-14 10:09:20.027246017 +0000 UTC m=+0.101131910 container cleanup 9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b19a8b59-9809-4802-b5c9-555476ac4412, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:20 np0005486759.ooo.test systemd[1]: libpod-conmon-9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad.scope: Deactivated successfully.
Oct 14 10:09:20 np0005486759.ooo.test podman[326939]: 2025-10-14 10:09:20.077734423 +0000 UTC m=+0.148008225 container remove 9b195a8ecad8fd4fddd4276426784574623068f14f9bf812242ffea4fef884ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b19a8b59-9809-4802-b5c9-555476ac4412, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 14 10:09:20 np0005486759.ooo.test kernel: device tapce56951e-25 left promiscuous mode
Oct 14 10:09:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:20.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:20.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:20.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:20 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:20.120 287366 INFO neutron.agent.dhcp.agent [None req-b41c3688-b55a-4a52-870f-f2256927a365 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:20 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:20.121 287366 INFO neutron.agent.dhcp.agent [None req-b41c3688-b55a-4a52-870f-f2256927a365 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:20.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:20 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:20Z|00297|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:09:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:20.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41740 DF PROTO=TCP SPT=46482 DPT=9102 SEQ=438350672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA8E0810000000001030307) 
Oct 14 10:09:20 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-38db427eb5e5dea697120a20c3ae8e22c0a09ff72840258c07ee6f0844cbac5c-merged.mount: Deactivated successfully.
Oct 14 10:09:20 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2db19a8b59\x2d9809\x2d4802\x2db5c9\x2d555476ac4412.mount: Deactivated successfully.
Oct 14 10:09:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:21.185 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:21.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:21 np0005486759.ooo.test dnsmasq[326649]: exiting on receipt of SIGTERM
Oct 14 10:09:21 np0005486759.ooo.test podman[326982]: 2025-10-14 10:09:21.373827097 +0000 UTC m=+0.097043613 container kill ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: tmp-crun.w1GpD9.mount: Deactivated successfully.
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: libpod-ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981.scope: Deactivated successfully.
Oct 14 10:09:21 np0005486759.ooo.test podman[326995]: 2025-10-14 10:09:21.447783503 +0000 UTC m=+0.057866064 container died ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 14 10:09:21 np0005486759.ooo.test podman[326995]: 2025-10-14 10:09:21.479825774 +0000 UTC m=+0.089908295 container cleanup ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, io.buildah.version=1.41.3)
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: libpod-conmon-ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981.scope: Deactivated successfully.
Oct 14 10:09:21 np0005486759.ooo.test podman[326996]: 2025-10-14 10:09:21.528811375 +0000 UTC m=+0.135813202 container remove ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98662246-a563-437e-9cd9-f826a393f21f, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 14 10:09:21 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:21.744 287366 INFO neutron.agent.dhcp.agent [None req-37af7f8f-7288-4338-a196-bfa58a29cb84 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-a7af4af58be655bbf1943d98b8ccab93e680dbd7175d293b4102f683288ea293-merged.mount: Deactivated successfully.
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba92f2e8357543b99276e7bc1f3f1b7751913f754ad97961b188a915f2d5f981-userdata-shm.mount: Deactivated successfully.
Oct 14 10:09:21 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d98662246\x2da563\x2d437e\x2d9cd9\x2df826a393f21f.mount: Deactivated successfully.
Oct 14 10:09:21 np0005486759.ooo.test podman[327022]: 2025-10-14 10:09:21.972123645 +0000 UTC m=+0.091135753 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:09:22 np0005486759.ooo.test podman[327030]: 2025-10-14 10:09:22.032850126 +0000 UTC m=+0.136534034 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:09:22 np0005486759.ooo.test podman[327030]: 2025-10-14 10:09:22.068301451 +0000 UTC m=+0.171985369 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 14 10:09:22 np0005486759.ooo.test podman[327023]: 2025-10-14 10:09:22.077223404 +0000 UTC m=+0.189873497 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 14 10:09:22 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:09:22 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:22.091 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:22 np0005486759.ooo.test podman[327022]: 2025-10-14 10:09:22.103911262 +0000 UTC m=+0.222923390 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:09:22 np0005486759.ooo.test podman[327025]: 2025-10-14 10:09:22.133074835 +0000 UTC m=+0.239642761 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:09:22 np0005486759.ooo.test podman[327025]: 2025-10-14 10:09:22.143582537 +0000 UTC m=+0.250150483 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:09:22 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:09:22 np0005486759.ooo.test podman[327023]: 2025-10-14 10:09:22.161696483 +0000 UTC m=+0.274346576 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid)
Oct 14 10:09:22 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:09:22 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.191 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.218 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.219 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.219 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.219 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.293 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.374 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.376 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.457 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.458 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.536 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.538 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:09:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41741 DF PROTO=TCP SPT=46482 DPT=9102 SEQ=438350672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA8E8810000000001030307) 
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.601 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.810 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.811 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12273MB free_disk=386.6772918701172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.812 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.812 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.922 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.923 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.924 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:09:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:22.989 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:09:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:23.015 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:09:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:23.018 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:09:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:23.019 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.014 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.254 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.254 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.255 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.255 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.952 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.967 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.968 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:09:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:25.968 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:26.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:09:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:26.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:09:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:26.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41742 DF PROTO=TCP SPT=46482 DPT=9102 SEQ=438350672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA8F8410000000001030307) 
Oct 14 10:09:29 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:29.409 2 INFO neutron.agent.securitygroups_rpc [None req-8dcb9afb-d5dd-438b-a9e6-dcab644df6b8 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['4bcdb91f-4f82-4fb4-9cd6-4ae0764338d2']
Oct 14 10:09:29 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:29.620 2 INFO neutron.agent.securitygroups_rpc [None req-20392419-a831-4cdc-947c-eaa3ab547a92 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['4bcdb91f-4f82-4fb4-9cd6-4ae0764338d2']
Oct 14 10:09:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:09:29 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:09:29 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:29.799 287366 INFO neutron.agent.linux.ip_lib [None req-a7f03eca-8693-4de6-810f-31cfceb7f69c - - - - - -] Device tap5b533774-6e cannot be used as it has no MAC address
Oct 14 10:09:29 np0005486759.ooo.test systemd[1]: tmp-crun.kZp7fl.mount: Deactivated successfully.
Oct 14 10:09:29 np0005486759.ooo.test podman[327117]: 2025-10-14 10:09:29.80818924 +0000 UTC m=+0.088666137 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, version=9.6, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 14 10:09:29 np0005486759.ooo.test podman[327117]: 2025-10-14 10:09:29.818267629 +0000 UTC m=+0.098744456 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Oct 14 10:09:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:29.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:29 np0005486759.ooo.test kernel: device tap5b533774-6e entered promiscuous mode
Oct 14 10:09:29 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436569.8334] manager: (tap5b533774-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Oct 14 10:09:29 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:09:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:29.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:29Z|00298|binding|INFO|Claiming lport 5b533774-6e08-4701-bfcb-48b0884f1496 for this chassis.
Oct 14 10:09:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:29Z|00299|binding|INFO|5b533774-6e08-4701-bfcb-48b0884f1496: Claiming unknown
Oct 14 10:09:29 np0005486759.ooo.test systemd-udevd[327156]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:29Z|00300|binding|INFO|Setting lport 5b533774-6e08-4701-bfcb-48b0884f1496 ovn-installed in OVS
Oct 14 10:09:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:29.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap5b533774-6e: No such device
Oct 14 10:09:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:29.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:29 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:29Z|00301|binding|INFO|Setting lport 5b533774-6e08-4701-bfcb-48b0884f1496 up in Southbound
Oct 14 10:09:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:29.907 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08460a50af1149f98cc25455fd23f974', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=751339d1-2b8b-498c-afd4-9fef8531d795, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=5b533774-6e08-4701-bfcb-48b0884f1496) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:29.908 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 5b533774-6e08-4701-bfcb-48b0884f1496 in datapath aad6e8de-32ab-42b7-b5c7-d453eb7eba4d bound to our chassis
Oct 14 10:09:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:29.910 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network aad6e8de-32ab-42b7-b5c7-d453eb7eba4d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:09:29 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:29.911 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec1b667-e9ab-4739-b6b3-9575fe269f21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:29 np0005486759.ooo.test podman[327116]: 2025-10-14 10:09:29.91654887 +0000 UTC m=+0.195952804 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:29.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:29 np0005486759.ooo.test podman[327116]: 2025-10-14 10:09:29.983390928 +0000 UTC m=+0.262794822 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:09:29 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:09:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:30.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:30 np0005486759.ooo.test podman[327239]: 
Oct 14 10:09:30 np0005486759.ooo.test podman[327239]: 2025-10-14 10:09:30.73364559 +0000 UTC m=+0.084825879 container create 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:30 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:30.758 2 INFO neutron.agent.securitygroups_rpc [None req-138cda7c-620e-4a11-bfa9-0d38c11cc142 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:30 np0005486759.ooo.test systemd[1]: Started libpod-conmon-390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588.scope.
Oct 14 10:09:30 np0005486759.ooo.test podman[327239]: 2025-10-14 10:09:30.693289774 +0000 UTC m=+0.044470073 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:09:30 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:09:30 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d69b3d4a8ca84e11193ea28400d4bd4a47840768305cb62b59272e8a4bb1801/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:09:30 np0005486759.ooo.test podman[327239]: 2025-10-14 10:09:30.821292775 +0000 UTC m=+0.172473074 container init 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:09:30 np0005486759.ooo.test podman[327239]: 2025-10-14 10:09:30.829495097 +0000 UTC m=+0.180675386 container start 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq[327258]: started, version 2.85 cachesize 150
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq[327258]: DNS service limited to local subnets
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq[327258]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq[327258]: warning: no upstream servers configured
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq-dhcp[327258]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq[327258]: read /var/lib/neutron/dhcp/aad6e8de-32ab-42b7-b5c7-d453eb7eba4d/addn_hosts - 0 addresses
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq-dhcp[327258]: read /var/lib/neutron/dhcp/aad6e8de-32ab-42b7-b5c7-d453eb7eba4d/host
Oct 14 10:09:30 np0005486759.ooo.test dnsmasq-dhcp[327258]: read /var/lib/neutron/dhcp/aad6e8de-32ab-42b7-b5c7-d453eb7eba4d/opts
Oct 14 10:09:30 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:30.964 2 INFO neutron.agent.securitygroups_rpc [None req-c0d25f6c-5b1e-4eee-b3bf-95b0669c9f28 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:30 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:30.988 287366 INFO neutron.agent.dhcp.agent [None req-43a6875c-d1e7-4821-94c2-27e52600fb3a - - - - - -] DHCP configuration for ports {'081ad6a0-8eb1-4813-9765-a5a56f1a1417'} is completed
Oct 14 10:09:31 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:31.178 2 INFO neutron.agent.securitygroups_rpc [None req-ff399ede-b052-4c44-81fc-a7a364372472 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:31.247 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c2c3b804-6d3d-41de-819c-052384856cfa with type ""
Oct 14 10:09:31 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:31Z|00302|binding|INFO|Removing iface tap5b533774-6e ovn-installed in OVS
Oct 14 10:09:31 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:31Z|00303|binding|INFO|Removing lport 5b533774-6e08-4701-bfcb-48b0884f1496 ovn-installed in OVS
Oct 14 10:09:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:31.248 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08460a50af1149f98cc25455fd23f974', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=751339d1-2b8b-498c-afd4-9fef8531d795, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=5b533774-6e08-4701-bfcb-48b0884f1496) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:31.249 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 5b533774-6e08-4701-bfcb-48b0884f1496 in datapath aad6e8de-32ab-42b7-b5c7-d453eb7eba4d unbound from our chassis
Oct 14 10:09:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:31.251 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:09:31 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:31.252 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8765ba-99be-42a6-aa0c-31f0e81866bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:31.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:31 np0005486759.ooo.test kernel: device tap5b533774-6e left promiscuous mode
Oct 14 10:09:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:31.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:31.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:31 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:31.320 2 INFO neutron.agent.securitygroups_rpc [None req-85ddce6e-cbb2-4456-8348-6e133da47a18 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:31.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:31 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:31.453 2 INFO neutron.agent.securitygroups_rpc [None req-0462acab-c06f-4005-b205-63e88f4ecd47 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:31 np0005486759.ooo.test dnsmasq[327258]: read /var/lib/neutron/dhcp/aad6e8de-32ab-42b7-b5c7-d453eb7eba4d/addn_hosts - 0 addresses
Oct 14 10:09:31 np0005486759.ooo.test dnsmasq-dhcp[327258]: read /var/lib/neutron/dhcp/aad6e8de-32ab-42b7-b5c7-d453eb7eba4d/host
Oct 14 10:09:31 np0005486759.ooo.test dnsmasq-dhcp[327258]: read /var/lib/neutron/dhcp/aad6e8de-32ab-42b7-b5c7-d453eb7eba4d/opts
Oct 14 10:09:31 np0005486759.ooo.test podman[327276]: 2025-10-14 10:09:31.596918175 +0000 UTC m=+0.055538842 container kill 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent [None req-c6d331e7-e2f0-4085-9b8f-bc74848cf7c3 - - - - - -] Unable to reload_allocations dhcp for aad6e8de-32ab-42b7-b5c7-d453eb7eba4d.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap5b533774-6e not found in namespace qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d.
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     return fut.result()
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     raise self._exception
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap5b533774-6e not found in namespace qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d.
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.619 287366 ERROR neutron.agent.dhcp.agent 
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.622 287366 INFO neutron.agent.dhcp.agent [None req-9495d63f-5680-4fab-a408-4af0a08f6c7b - - - - - -] Synchronizing state
Oct 14 10:09:31 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:31.789 2 INFO neutron.agent.securitygroups_rpc [None req-ba094b45-5474-4730-b70d-8073e12eb01a 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.858 287366 INFO neutron.agent.dhcp.agent [None req-2c39df40-df1a-44c6-adf3-580cb2ca8e11 - - - - - -] All active networks have been fetched through RPC.
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.859 287366 INFO neutron.agent.dhcp.agent [-] Starting network aad6e8de-32ab-42b7-b5c7-d453eb7eba4d dhcp configuration
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.860 287366 INFO neutron.agent.dhcp.agent [-] Finished network aad6e8de-32ab-42b7-b5c7-d453eb7eba4d dhcp configuration
Oct 14 10:09:31 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:31.860 287366 INFO neutron.agent.dhcp.agent [None req-2c39df40-df1a-44c6-adf3-580cb2ca8e11 - - - - - -] Synchronizing state complete
Oct 14 10:09:31 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:31Z|00304|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:09:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:32.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:32 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:32.064 2 INFO neutron.agent.securitygroups_rpc [None req-7fd31e80-7055-42b9-9481-6b577ea6eb24 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:32 np0005486759.ooo.test dnsmasq[327258]: exiting on receipt of SIGTERM
Oct 14 10:09:32 np0005486759.ooo.test podman[327304]: 2025-10-14 10:09:32.08905031 +0000 UTC m=+0.060342949 container kill 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 14 10:09:32 np0005486759.ooo.test systemd[1]: libpod-390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588.scope: Deactivated successfully.
Oct 14 10:09:32 np0005486759.ooo.test podman[327317]: 2025-10-14 10:09:32.163272455 +0000 UTC m=+0.057902705 container died 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588-userdata-shm.mount: Deactivated successfully.
Oct 14 10:09:32 np0005486759.ooo.test podman[327317]: 2025-10-14 10:09:32.202989481 +0000 UTC m=+0.097619711 container cleanup 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS)
Oct 14 10:09:32 np0005486759.ooo.test systemd[1]: libpod-conmon-390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588.scope: Deactivated successfully.
Oct 14 10:09:32 np0005486759.ooo.test podman[327318]: 2025-10-14 10:09:32.235546489 +0000 UTC m=+0.125523527 container remove 390cd7e57e3efef9c6e8f71f4e7a60a0dbec4ef618ee2eaf786eaa0bbdf91588 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aad6e8de-32ab-42b7-b5c7-d453eb7eba4d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 14 10:09:32 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:32.262 2 INFO neutron.agent.securitygroups_rpc [None req-85454488-aaa7-446d-86f9-81ce22a069f8 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:32 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:32.471 2 INFO neutron.agent.securitygroups_rpc [None req-3b826b6e-21f0-49c7-b8bf-fd1b7fe7c535 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:32 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-8d69b3d4a8ca84e11193ea28400d4bd4a47840768305cb62b59272e8a4bb1801-merged.mount: Deactivated successfully.
Oct 14 10:09:32 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2daad6e8de\x2d32ab\x2d42b7\x2db5c7\x2dd453eb7eba4d.mount: Deactivated successfully.
Oct 14 10:09:32 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:32.853 2 INFO neutron.agent.securitygroups_rpc [None req-69642991-f73e-4428-9ee1-22ac260d804e 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5f12d691-1a99-4b23-a47b-135853024e46']
Oct 14 10:09:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:33.358 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:e5:43', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '9a:3c:f4:19:89:46'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:33.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:33.360 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 14 10:09:33 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:33.444 2 INFO neutron.agent.securitygroups_rpc [None req-206e8a13-7458-43cc-92fb-5d6918448ced 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['f68a41c4-caa4-4460-be94-440c636220f3']
Oct 14 10:09:33 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:33.776 287366 INFO neutron.agent.linux.ip_lib [None req-a6b7cb87-5456-48bb-8d0b-a7f5038e48fb - - - - - -] Device tap38e72c24-5f cannot be used as it has no MAC address
Oct 14 10:09:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:33.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:33 np0005486759.ooo.test kernel: device tap38e72c24-5f entered promiscuous mode
Oct 14 10:09:33 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436573.8454] manager: (tap38e72c24-5f): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Oct 14 10:09:33 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:33Z|00305|binding|INFO|Claiming lport 38e72c24-5fa9-4ed6-a609-15c866e0bf31 for this chassis.
Oct 14 10:09:33 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:33Z|00306|binding|INFO|38e72c24-5fa9-4ed6-a609-15c866e0bf31: Claiming unknown
Oct 14 10:09:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:33.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:33 np0005486759.ooo.test systemd-udevd[327355]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:09:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:33.857 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08460a50af1149f98cc25455fd23f974', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13dab329-3e92-4c6c-be82-2f89b780c2a1, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=38e72c24-5fa9-4ed6-a609-15c866e0bf31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:33.858 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 38e72c24-5fa9-4ed6-a609-15c866e0bf31 in datapath 1aa2df37-406f-4de8-92d7-0b4dd4c3d00a bound to our chassis
Oct 14 10:09:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:33.860 183328 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1aa2df37-406f-4de8-92d7-0b4dd4c3d00a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Oct 14 10:09:33 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:33.861 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[bc220607-8a84-4abb-b883-14f4dc5a2f36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:33 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:33Z|00307|binding|INFO|Setting lport 38e72c24-5fa9-4ed6-a609-15c866e0bf31 ovn-installed in OVS
Oct 14 10:09:33 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:33Z|00308|binding|INFO|Setting lport 38e72c24-5fa9-4ed6-a609-15c866e0bf31 up in Southbound
Oct 14 10:09:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:33.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:33.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:33 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:33.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:34 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:34.381 2 INFO neutron.agent.securitygroups_rpc [None req-9896089b-687d-4068-9d9f-639b0c0548f2 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['12054ac4-8a5a-481a-aa1e-14753a9d29a1']
Oct 14 10:09:34 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:34.542 2 INFO neutron.agent.securitygroups_rpc [None req-738d5115-2c63-4aba-85b8-78cb589eb512 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['12054ac4-8a5a-481a-aa1e-14753a9d29a1']
Oct 14 10:09:34 np0005486759.ooo.test podman[327408]: 
Oct 14 10:09:34 np0005486759.ooo.test podman[327408]: 2025-10-14 10:09:34.768758159 +0000 UTC m=+0.089234144 container create a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009)
Oct 14 10:09:34 np0005486759.ooo.test systemd[1]: Started libpod-conmon-a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355.scope.
Oct 14 10:09:34 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:09:34 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb91a6c1a7822da0ed08386b8cc241a4169d5aeb7d81f6b9e57d9ba7f69e0097/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:09:34 np0005486759.ooo.test podman[327408]: 2025-10-14 10:09:34.727020601 +0000 UTC m=+0.047496666 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:09:34 np0005486759.ooo.test podman[327408]: 2025-10-14 10:09:34.832277405 +0000 UTC m=+0.152753400 container init a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009)
Oct 14 10:09:34 np0005486759.ooo.test podman[327408]: 2025-10-14 10:09:34.839271809 +0000 UTC m=+0.159747814 container start a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq[327426]: started, version 2.85 cachesize 150
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq[327426]: DNS service limited to local subnets
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq[327426]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq[327426]: warning: no upstream servers configured
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq-dhcp[327426]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq[327426]: read /var/lib/neutron/dhcp/1aa2df37-406f-4de8-92d7-0b4dd4c3d00a/addn_hosts - 0 addresses
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq-dhcp[327426]: read /var/lib/neutron/dhcp/1aa2df37-406f-4de8-92d7-0b4dd4c3d00a/host
Oct 14 10:09:34 np0005486759.ooo.test dnsmasq-dhcp[327426]: read /var/lib/neutron/dhcp/1aa2df37-406f-4de8-92d7-0b4dd4c3d00a/opts
Oct 14 10:09:34 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:34Z|00309|binding|INFO|Removing iface tap38e72c24-5f ovn-installed in OVS
Oct 14 10:09:34 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:34.998 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 07e0450e-d24a-4ee5-adf2-3efed5b88fe7 with type ""
Oct 14 10:09:34 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:34Z|00310|binding|INFO|Removing lport 38e72c24-5fa9-4ed6-a609-15c866e0bf31 ovn-installed in OVS
Oct 14 10:09:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:35.000 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08460a50af1149f98cc25455fd23f974', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13dab329-3e92-4c6c-be82-2f89b780c2a1, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=38e72c24-5fa9-4ed6-a609-15c866e0bf31) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:09:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:35.001 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 38e72c24-5fa9-4ed6-a609-15c866e0bf31 in datapath 1aa2df37-406f-4de8-92d7-0b4dd4c3d00a unbound from our chassis
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.001 287366 INFO neutron.agent.dhcp.agent [None req-3a5f037b-97d5-4cd2-a627-e29ea34e1443 - - - - - -] DHCP configuration for ports {'8be9ad6e-d8f5-40d7-86f5-d5b4a14251be'} is completed
Oct 14 10:09:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:35.005 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:09:35 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:35.040 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[d916d4af-4823-43be-a09f-7ef4f29450f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:09:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:35.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:35 np0005486759.ooo.test kernel: device tap38e72c24-5f left promiscuous mode
Oct 14 10:09:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:35.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:35.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:35 np0005486759.ooo.test dnsmasq[327426]: read /var/lib/neutron/dhcp/1aa2df37-406f-4de8-92d7-0b4dd4c3d00a/addn_hosts - 0 addresses
Oct 14 10:09:35 np0005486759.ooo.test dnsmasq-dhcp[327426]: read /var/lib/neutron/dhcp/1aa2df37-406f-4de8-92d7-0b4dd4c3d00a/host
Oct 14 10:09:35 np0005486759.ooo.test dnsmasq-dhcp[327426]: read /var/lib/neutron/dhcp/1aa2df37-406f-4de8-92d7-0b4dd4c3d00a/opts
Oct 14 10:09:35 np0005486759.ooo.test podman[327446]: 2025-10-14 10:09:35.459616662 +0000 UTC m=+0.061207815 container kill a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent [None req-6cc1103c-c752-46de-8371-8351da908508 - - - - - -] Unable to reload_allocations dhcp for 1aa2df37-406f-4de8-92d7-0b4dd4c3d00a.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap38e72c24-5f not found in namespace qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a.
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     return fut.result()
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     raise self._exception
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap38e72c24-5f not found in namespace qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a.
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.482 287366 ERROR neutron.agent.dhcp.agent 
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.485 287366 INFO neutron.agent.dhcp.agent [None req-2c39df40-df1a-44c6-adf3-580cb2ca8e11 - - - - - -] Synchronizing state
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.756 287366 INFO neutron.agent.dhcp.agent [None req-3fd4acc9-7e2e-41cf-976c-c050983c027c - - - - - -] All active networks have been fetched through RPC.
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.757 287366 INFO neutron.agent.dhcp.agent [-] Starting network 1aa2df37-406f-4de8-92d7-0b4dd4c3d00a dhcp configuration
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.758 287366 INFO neutron.agent.dhcp.agent [-] Finished network 1aa2df37-406f-4de8-92d7-0b4dd4c3d00a dhcp configuration
Oct 14 10:09:35 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:35.758 287366 INFO neutron.agent.dhcp.agent [None req-3fd4acc9-7e2e-41cf-976c-c050983c027c - - - - - -] Synchronizing state complete
Oct 14 10:09:35 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:09:35Z|00311|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:09:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:35.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:35 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:35.905 2 INFO neutron.agent.securitygroups_rpc [None req-58fa94aa-d1c7-4293-9b51-5bb55015163a 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5d77870d-7195-4675-bc73-5d82e309a1af']
Oct 14 10:09:35 np0005486759.ooo.test dnsmasq[327426]: exiting on receipt of SIGTERM
Oct 14 10:09:35 np0005486759.ooo.test podman[327476]: 2025-10-14 10:09:35.988739391 +0000 UTC m=+0.059006848 container kill a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:09:35 np0005486759.ooo.test systemd[1]: tmp-crun.oCHZob.mount: Deactivated successfully.
Oct 14 10:09:35 np0005486759.ooo.test systemd[1]: libpod-a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355.scope: Deactivated successfully.
Oct 14 10:09:36 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:36.042 2 INFO neutron.agent.securitygroups_rpc [None req-5593953b-2429-4c40-99b6-60f36318f046 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['5d77870d-7195-4675-bc73-5d82e309a1af']
Oct 14 10:09:36 np0005486759.ooo.test podman[327488]: 2025-10-14 10:09:36.063356877 +0000 UTC m=+0.060130413 container died a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:09:36 np0005486759.ooo.test podman[327488]: 2025-10-14 10:09:36.095623085 +0000 UTC m=+0.092396631 container cleanup a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251009)
Oct 14 10:09:36 np0005486759.ooo.test systemd[1]: libpod-conmon-a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355.scope: Deactivated successfully.
Oct 14 10:09:36 np0005486759.ooo.test podman[327495]: 2025-10-14 10:09:36.140405757 +0000 UTC m=+0.126308090 container remove a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1aa2df37-406f-4de8-92d7-0b4dd4c3d00a, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:09:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:36.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-fb91a6c1a7822da0ed08386b8cc241a4169d5aeb7d81f6b9e57d9ba7f69e0097-merged.mount: Deactivated successfully.
Oct 14 10:09:36 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a394f0ec2656dd452803cc5f2a238069fb9f7d0415534ce5fab5aed003fa4355-userdata-shm.mount: Deactivated successfully.
Oct 14 10:09:36 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d1aa2df37\x2d406f\x2d4de8\x2d92d7\x2d0b4dd4c3d00a.mount: Deactivated successfully.
Oct 14 10:09:36 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:36.835 2 INFO neutron.agent.securitygroups_rpc [None req-f3912e46-3ac5-4581-9dcf-ab9ecbbc539b 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['0f0b8998-62dc-4c24-a1b7-b7a5bd1e4166']
Oct 14 10:09:37 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:37.082 2 INFO neutron.agent.securitygroups_rpc [None req-47d679bb-06a2-4467-9ce5-7786702079df 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['0f0b8998-62dc-4c24-a1b7-b7a5bd1e4166']
Oct 14 10:09:37 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:37.248 2 INFO neutron.agent.securitygroups_rpc [None req-0c458cfc-b7a0-485e-a93c-535ee174c1cc 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['0f0b8998-62dc-4c24-a1b7-b7a5bd1e4166']
Oct 14 10:09:37 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:37.417 2 INFO neutron.agent.securitygroups_rpc [None req-f166bd01-d0e0-4890-a5e4-6ec9ff523819 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['0f0b8998-62dc-4c24-a1b7-b7a5bd1e4166']
Oct 14 10:09:37 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:37.621 2 INFO neutron.agent.securitygroups_rpc [None req-2666823d-d452-442d-9f68-d48bac4dec0f 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['0f0b8998-62dc-4c24-a1b7-b7a5bd1e4166']
Oct 14 10:09:38 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:38.062 2 INFO neutron.agent.securitygroups_rpc [None req-a9f127fc-a204-47da-9112-cc40b358954b 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['0f0b8998-62dc-4c24-a1b7-b7a5bd1e4166']
Oct 14 10:09:38 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:09:38 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:38.362 183328 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=93d451ec-9a31-4880-9638-030ff3f86e88, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 14 10:09:38 np0005486759.ooo.test podman[327518]: 2025-10-14 10:09:38.450092331 +0000 UTC m=+0.078705972 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true)
Oct 14 10:09:38 np0005486759.ooo.test podman[327518]: 2025-10-14 10:09:38.482513314 +0000 UTC m=+0.111126975 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:09:38 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:09:38 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:38.831 2 INFO neutron.agent.securitygroups_rpc [None req-54fdc663-687f-4f38-96e6-c31eb4cbd8c5 7e92ea549bee4502b7284626c8e6c0d7 cbc878d73cea4489acab26fecea7dd84 - - default default] Security group rule updated ['d369aabb-76dd-447a-9097-87d4cc7cd8c3']
Oct 14 10:09:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:40.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:09:40 np0005486759.ooo.test podman[327536]: 2025-10-14 10:09:40.443893929 +0000 UTC m=+0.071015227 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 14 10:09:40 np0005486759.ooo.test podman[327536]: 2025-10-14 10:09:40.451032927 +0000 UTC m=+0.078154195 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:09:40 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:09:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:41.318 287366 INFO neutron.agent.dhcp.agent [None req-3fd4acc9-7e2e-41cf-976c-c050983c027c - - - - - -] Synchronizing state
Oct 14 10:09:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:41.429 287366 INFO neutron.agent.dhcp.agent [None req-a894621a-103a-4b87-955b-27a49cd84b04 - - - - - -] All active networks have been fetched through RPC.
Oct 14 10:09:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:41.430 287366 INFO neutron.agent.dhcp.agent [-] Starting network f01a2acf-74d8-4b60-9f73-51bd01a05a20 dhcp configuration
Oct 14 10:09:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:41.431 287366 INFO neutron.agent.dhcp.agent [-] Finished network f01a2acf-74d8-4b60-9f73-51bd01a05a20 dhcp configuration
Oct 14 10:09:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:41.432 287366 INFO neutron.agent.dhcp.agent [None req-a894621a-103a-4b87-955b-27a49cd84b04 - - - - - -] Synchronizing state complete
Oct 14 10:09:41 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:41.433 287366 INFO neutron.agent.dhcp.agent [None req-d8d8292f-635d-4532-a6a0-c03b70262554 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:41.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:42 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:09:42.146 287366 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:09:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:09:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:09:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:09:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:09:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:09:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16707 "" "Go-http-client/1.1"
Oct 14 10:09:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:09:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:09:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:09:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:09:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:09:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:09:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:09:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:09:45 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:45.106 2 INFO neutron.agent.securitygroups_rpc [None req-b3815f76-4b54-4bd1-a989-120a012e4108 fcb3123e237f4a1fb2f7942908ac4924 78143c4980ba42b2ba2eba242cb0eddd - - default default] Security group rule updated ['569a1fa9-bb23-4f85-854b-c4a61171a8e0']
Oct 14 10:09:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:45.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:46.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59346 DF PROTO=TCP SPT=33688 DPT=9102 SEQ=3589978557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA951B80000000001030307) 
Oct 14 10:09:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:50.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59347 DF PROTO=TCP SPT=33688 DPT=9102 SEQ=3589978557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA955C20000000001030307) 
Oct 14 10:09:51 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:51.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: tmp-crun.O0sB1r.mount: Deactivated successfully.
Oct 14 10:09:52 np0005486759.ooo.test podman[327559]: 2025-10-14 10:09:52.440791775 +0000 UTC m=+0.067121007 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 14 10:09:52 np0005486759.ooo.test podman[327559]: 2025-10-14 10:09:52.448195952 +0000 UTC m=+0.074525154 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:09:52 np0005486759.ooo.test podman[327567]: 2025-10-14 10:09:52.48307045 +0000 UTC m=+0.099488098 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Oct 14 10:09:52 np0005486759.ooo.test podman[327567]: 2025-10-14 10:09:52.488098334 +0000 UTC m=+0.104515982 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:09:52 np0005486759.ooo.test podman[327561]: 2025-10-14 10:09:52.452580676 +0000 UTC m=+0.068877450 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 14 10:09:52 np0005486759.ooo.test podman[327560]: 2025-10-14 10:09:52.547742471 +0000 UTC m=+0.167027767 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, container_name=iscsid)
Oct 14 10:09:52 np0005486759.ooo.test podman[327561]: 2025-10-14 10:09:52.581691041 +0000 UTC m=+0.197987895 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 14 10:09:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59348 DF PROTO=TCP SPT=33688 DPT=9102 SEQ=3589978557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA95DC10000000001030307) 
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:09:52 np0005486759.ooo.test podman[327560]: 2025-10-14 10:09:52.633016614 +0000 UTC m=+0.252301980 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:09:52 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:09:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:54.175 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:09:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:54.176 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:09:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:09:54.177 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:09:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:55.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:09:56.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:09:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59349 DF PROTO=TCP SPT=33688 DPT=9102 SEQ=3589978557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA96D820000000001030307) 
Oct 14 10:09:56 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:09:56.846 2 INFO neutron.agent.securitygroups_rpc [None req-c33fddd3-14a7-42b1-85db-e32ac07e0a4c 628896b88544432d805b2c675e9e6d74 702d952efe84459fa3f8a4ad299412fc - - default default] Security group rule updated ['94b855fa-6b45-4dfe-b7a1-47ce2370aa78']
Oct 14 10:10:00 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:10:00 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:10:00 np0005486759.ooo.test podman[327636]: 2025-10-14 10:10:00.456974558 +0000 UTC m=+0.082747296 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:10:00 np0005486759.ooo.test podman[327636]: 2025-10-14 10:10:00.488007309 +0000 UTC m=+0.113780057 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 10:10:00 np0005486759.ooo.test systemd[1]: tmp-crun.9Nj9hJ.mount: Deactivated successfully.
Oct 14 10:10:00 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:10:00 np0005486759.ooo.test podman[327637]: 2025-10-14 10:10:00.505310809 +0000 UTC m=+0.127220369 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, managed_by=edpm_ansible)
Oct 14 10:10:00 np0005486759.ooo.test podman[327637]: 2025-10-14 10:10:00.519418411 +0000 UTC m=+0.141327971 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 10:10:00 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:10:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:01.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:04 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:04.069 287366 INFO neutron.agent.linux.ip_lib [None req-59f07e59-8029-448d-9539-05ebbb9127b7 - - - - - -] Device tap479ee9b1-65 cannot be used as it has no MAC address
Oct 14 10:10:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:04.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:04 np0005486759.ooo.test kernel: device tap479ee9b1-65 entered promiscuous mode
Oct 14 10:10:04 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436604.0927] manager: (tap479ee9b1-65): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Oct 14 10:10:04 np0005486759.ooo.test systemd-udevd[327691]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:10:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:04.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:04.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:04.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap479ee9b1-65: No such device
Oct 14 10:10:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:04.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:04 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:04.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:05 np0005486759.ooo.test podman[327762]: 
Oct 14 10:10:05 np0005486759.ooo.test podman[327762]: 2025-10-14 10:10:05.070386102 +0000 UTC m=+0.084773908 container create 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:10:05 np0005486759.ooo.test systemd[1]: Started libpod-conmon-34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725.scope.
Oct 14 10:10:05 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:10:05 np0005486759.ooo.test podman[327762]: 2025-10-14 10:10:05.030060517 +0000 UTC m=+0.044448403 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:10:05 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/161e6141140c8fda4de7bbad964639311722775effd34c45541151edcc14302d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:10:05 np0005486759.ooo.test podman[327762]: 2025-10-14 10:10:05.139105338 +0000 UTC m=+0.153493184 container init 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 10:10:05 np0005486759.ooo.test podman[327762]: 2025-10-14 10:10:05.147386721 +0000 UTC m=+0.161774517 container start 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq[327780]: started, version 2.85 cachesize 150
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq[327780]: DNS service limited to local subnets
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq[327780]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq[327780]: warning: no upstream servers configured
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq-dhcp[327780]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq[327780]: read /var/lib/neutron/dhcp/315725e0-7917-4632-9d39-d1eedb585909/addn_hosts - 0 addresses
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq-dhcp[327780]: read /var/lib/neutron/dhcp/315725e0-7917-4632-9d39-d1eedb585909/host
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq-dhcp[327780]: read /var/lib/neutron/dhcp/315725e0-7917-4632-9d39-d1eedb585909/opts
Oct 14 10:10:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:05.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:05 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:05.235 287366 INFO neutron.agent.dhcp.agent [None req-45b0b74a-3981-4f44-963a-8207c8efe1ce - - - - - -] DHCP configuration for ports {'d55a665e-afd6-441c-93dc-0360b74a29f5'} is completed
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq[327780]: read /var/lib/neutron/dhcp/315725e0-7917-4632-9d39-d1eedb585909/addn_hosts - 0 addresses
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq-dhcp[327780]: read /var/lib/neutron/dhcp/315725e0-7917-4632-9d39-d1eedb585909/host
Oct 14 10:10:05 np0005486759.ooo.test podman[327798]: 2025-10-14 10:10:05.460584736 +0000 UTC m=+0.043620028 container kill 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq-dhcp[327780]: read /var/lib/neutron/dhcp/315725e0-7917-4632-9d39-d1eedb585909/opts
Oct 14 10:10:05 np0005486759.ooo.test dnsmasq[327780]: exiting on receipt of SIGTERM
Oct 14 10:10:05 np0005486759.ooo.test podman[327836]: 2025-10-14 10:10:05.795070342 +0000 UTC m=+0.058563655 container kill 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:10:05 np0005486759.ooo.test systemd[1]: libpod-34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725.scope: Deactivated successfully.
Oct 14 10:10:05 np0005486759.ooo.test podman[327848]: 2025-10-14 10:10:05.848081566 +0000 UTC m=+0.038736958 container died 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:10:05 np0005486759.ooo.test podman[327848]: 2025-10-14 10:10:05.886188334 +0000 UTC m=+0.076843646 container cleanup 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:10:05 np0005486759.ooo.test systemd[1]: libpod-conmon-34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725.scope: Deactivated successfully.
Oct 14 10:10:05 np0005486759.ooo.test podman[327850]: 2025-10-14 10:10:05.904609107 +0000 UTC m=+0.085143759 container remove 34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-315725e0-7917-4632-9d39-d1eedb585909, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:10:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:05.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:05 np0005486759.ooo.test kernel: device tap479ee9b1-65 left promiscuous mode
Oct 14 10:10:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:05.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:05 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:05.949 287366 INFO neutron.agent.dhcp.agent [None req-eb08292d-8366-4855-bfd5-845c87d14b70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:10:05 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:05.950 287366 INFO neutron.agent.dhcp.agent [None req-eb08292d-8366-4855-bfd5-845c87d14b70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:10:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-161e6141140c8fda4de7bbad964639311722775effd34c45541151edcc14302d-merged.mount: Deactivated successfully.
Oct 14 10:10:06 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34a01010e3e2d511df0eaf9e8a92b9f0b47e82ff8b074b30735dd6517fc90725-userdata-shm.mount: Deactivated successfully.
Oct 14 10:10:06 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2d315725e0\x2d7917\x2d4632\x2d9d39\x2dd1eedb585909.mount: Deactivated successfully.
Oct 14 10:10:06 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:06.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:09 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:10:09 np0005486759.ooo.test podman[327875]: 2025-10-14 10:10:09.458095243 +0000 UTC m=+0.084894372 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:10:09 np0005486759.ooo.test podman[327875]: 2025-10-14 10:10:09.463005133 +0000 UTC m=+0.089804242 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 14 10:10:09 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:10:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:10.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:10 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:10:10.883 2 INFO neutron.agent.securitygroups_rpc [None req-883cb441-39e8-4c10-acba-01b50d814fea d2e5cd3bd3c546a8b6271388d6821d7e c7c73435a74e4332b6f984eff86f31ba - - default default] Security group member updated ['5e6e86a6-3283-48cf-b40b-7c2edd17235a']
Oct 14 10:10:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:10:11 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:11.179 287366 INFO neutron.agent.linux.ip_lib [None req-f9d9ebbd-ab77-4046-820d-85e2029715cb - - - - - -] Device tap9e8f4467-03 cannot be used as it has no MAC address
Oct 14 10:10:11 np0005486759.ooo.test podman[327895]: 2025-10-14 10:10:11.186343646 +0000 UTC m=+0.068907123 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:10:11 np0005486759.ooo.test podman[327895]: 2025-10-14 10:10:11.197346843 +0000 UTC m=+0.079910320 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:10:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:11.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:11 np0005486759.ooo.test kernel: device tap9e8f4467-03 entered promiscuous mode
Oct 14 10:10:11 np0005486759.ooo.test NetworkManager[5960]: <info>  [1760436611.2048] manager: (tap9e8f4467-03): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Oct 14 10:10:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:11.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:11 np0005486759.ooo.test systemd-udevd[327926]: Network interface NamePolicy= disabled on kernel command line.
Oct 14 10:10:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:11Z|00312|binding|INFO|Claiming lport 9e8f4467-0375-434d-9318-54d4a985a70c for this chassis.
Oct 14 10:10:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:11Z|00313|binding|INFO|9e8f4467-0375-434d-9318-54d4a985a70c: Claiming unknown
Oct 14 10:10:11 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:10:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:11.218 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-fc453d01-df4c-4777-acc1-66f266ebf132', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc453d01-df4c-4777-acc1-66f266ebf132', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c73435a74e4332b6f984eff86f31ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa9548e6-7fac-41e9-b9d2-388127ad4ef8, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=9e8f4467-0375-434d-9318-54d4a985a70c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:10:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:11.219 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 9e8f4467-0375-434d-9318-54d4a985a70c in datapath fc453d01-df4c-4777-acc1-66f266ebf132 bound to our chassis
Oct 14 10:10:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:11.220 183328 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3f7cd693-c6b8-46d3-b992-2793ef2933e1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Oct 14 10:10:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:11.221 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc453d01-df4c-4777-acc1-66f266ebf132, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:10:11 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:11.222 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[2570b9de-4354-42fb-94b7-75dba262751e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:11Z|00314|binding|INFO|Setting lport 9e8f4467-0375-434d-9318-54d4a985a70c ovn-installed in OVS
Oct 14 10:10:11 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:11Z|00315|binding|INFO|Setting lport 9e8f4467-0375-434d-9318-54d4a985a70c up in Southbound
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test virtnodedevd[255775]: ethtool ioctl error on tap9e8f4467-03: No such device
Oct 14 10:10:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:11.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:11.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:11.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:11 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:11.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:12 np0005486759.ooo.test podman[327997]: 
Oct 14 10:10:12 np0005486759.ooo.test podman[327997]: 2025-10-14 10:10:12.178944642 +0000 UTC m=+0.089627977 container create e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 14 10:10:12 np0005486759.ooo.test systemd[1]: Started libpod-conmon-e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f.scope.
Oct 14 10:10:12 np0005486759.ooo.test systemd[1]: tmp-crun.mj8goN.mount: Deactivated successfully.
Oct 14 10:10:12 np0005486759.ooo.test systemd[1]: Started libcrun container.
Oct 14 10:10:12 np0005486759.ooo.test kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86298c6fdab46efefa29b4830b035cce8c58c0a2d119905aa12d3adb7e3375bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 14 10:10:12 np0005486759.ooo.test podman[327997]: 2025-10-14 10:10:12.134699887 +0000 UTC m=+0.045383222 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Oct 14 10:10:12 np0005486759.ooo.test podman[327997]: 2025-10-14 10:10:12.243577622 +0000 UTC m=+0.154260957 container init e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.build-date=20251009, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 14 10:10:12 np0005486759.ooo.test podman[327997]: 2025-10-14 10:10:12.25199062 +0000 UTC m=+0.162673955 container start e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq[328015]: started, version 2.85 cachesize 150
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq[328015]: DNS service limited to local subnets
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq[328015]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq[328015]: warning: no upstream servers configured
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq-dhcp[328015]: DHCP, static leases only on 10.100.0.0, lease time 1d
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/addn_hosts - 0 addresses
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq-dhcp[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/host
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq-dhcp[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/opts
Oct 14 10:10:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:10:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:10:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:12.314 287366 INFO neutron.agent.dhcp.agent [None req-dc4bd162-5544-4a84-9ee1-a5a869ae789a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-10-14T10:10:10Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6706a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f20ec6703a0>], id=0602d6be-dd83-46f7-b84e-98cffd13c3f7, ip_allocation=immediate, mac_address=fa:16:3e:e2:40:91, name=tempest-TagsExtTest-1622248332, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-10-14T10:10:08Z, description=, dns_domain=, id=fc453d01-df4c-4777-acc1-66f266ebf132, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-121561000, port_security_enabled=True, project_id=c7c73435a74e4332b6f984eff86f31ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33561, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2419, status=ACTIVE, subnets=['5da83fe5-f05e-4153-b02b-40c280aefc71'], tags=[], tenant_id=c7c73435a74e4332b6f984eff86f31ba, updated_at=2025-10-14T10:10:09Z, vlan_transparent=None, network_id=fc453d01-df4c-4777-acc1-66f266ebf132, port_security_enabled=True, project_id=c7c73435a74e4332b6f984eff86f31ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5e6e86a6-3283-48cf-b40b-7c2edd17235a'], standard_attr_id=2425, status=DOWN, tags=[], tenant_id=c7c73435a74e4332b6f984eff86f31ba, updated_at=2025-10-14T10:10:10Z on network fc453d01-df4c-4777-acc1-66f266ebf132
Oct 14 10:10:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:10:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 132497 "" "Go-http-client/1.1"
Oct 14 10:10:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:10:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17172 "" "Go-http-client/1.1"
Oct 14 10:10:12 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:12.727 287366 INFO neutron.agent.dhcp.agent [None req-e70caa11-e2a8-42c0-952f-18f286f53d07 - - - - - -] DHCP configuration for ports {'70658feb-5e6d-414c-9ae8-8c24b833eed5'} is completed
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/addn_hosts - 1 addresses
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq-dhcp[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/host
Oct 14 10:10:12 np0005486759.ooo.test dnsmasq-dhcp[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/opts
Oct 14 10:10:12 np0005486759.ooo.test podman[328033]: 2025-10-14 10:10:12.887633202 +0000 UTC m=+0.055370018 container kill e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:10:13 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:13.069 287366 INFO neutron.agent.dhcp.agent [None req-23a4773d-1f36-48df-930e-26f8b951b895 - - - - - -] DHCP configuration for ports {'0602d6be-dd83-46f7-b84e-98cffd13c3f7'} is completed
Oct 14 10:10:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:10:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:10:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:10:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:10:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:10:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:10:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:10:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:15.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:15 np0005486759.ooo.test neutron_sriov_agent[280524]: 2025-10-14 10:10:15.973 2 INFO neutron.agent.securitygroups_rpc [None req-527e2694-a67f-4ac9-8429-d19fe9569e0c d2e5cd3bd3c546a8b6271388d6821d7e c7c73435a74e4332b6f984eff86f31ba - - default default] Security group member updated ['5e6e86a6-3283-48cf-b40b-7c2edd17235a']
Oct 14 10:10:16 np0005486759.ooo.test dnsmasq[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/addn_hosts - 0 addresses
Oct 14 10:10:16 np0005486759.ooo.test dnsmasq-dhcp[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/host
Oct 14 10:10:16 np0005486759.ooo.test dnsmasq-dhcp[328015]: read /var/lib/neutron/dhcp/fc453d01-df4c-4777-acc1-66f266ebf132/opts
Oct 14 10:10:16 np0005486759.ooo.test podman[328069]: 2025-10-14 10:10:16.188511339 +0000 UTC m=+0.057443470 container kill e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:10:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:16Z|00316|binding|INFO|Removing iface tap9e8f4467-03 ovn-installed in OVS
Oct 14 10:10:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:16.466 183328 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3f7cd693-c6b8-46d3-b992-2793ef2933e1 with type ""
Oct 14 10:10:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:16Z|00317|binding|INFO|Removing lport 9e8f4467-0375-434d-9318-54d4a985a70c ovn-installed in OVS
Oct 14 10:10:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:16.468 183328 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005486759.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpb8f46c8e-02c5-509b-85cf-df2f94be4bb1-fc453d01-df4c-4777-acc1-66f266ebf132', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc453d01-df4c-4777-acc1-66f266ebf132', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c73435a74e4332b6f984eff86f31ba', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005486759.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa9548e6-7fac-41e9-b9d2-388127ad4ef8, chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7754a5d880>], logical_port=9e8f4467-0375-434d-9318-54d4a985a70c) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 14 10:10:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:16.470 183328 INFO neutron.agent.ovn.metadata.agent [-] Port 9e8f4467-0375-434d-9318-54d4a985a70c in datapath fc453d01-df4c-4777-acc1-66f266ebf132 unbound from our chassis
Oct 14 10:10:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:16.472 183328 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc453d01-df4c-4777-acc1-66f266ebf132, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 14 10:10:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:16.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:16 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:16.497 183433 DEBUG oslo.privsep.daemon [-] privsep: reply[dd92ba92-d394-44ab-94e8-bf470e100321]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 14 10:10:16 np0005486759.ooo.test dnsmasq[328015]: exiting on receipt of SIGTERM
Oct 14 10:10:16 np0005486759.ooo.test podman[328106]: 2025-10-14 10:10:16.626174385 +0000 UTC m=+0.059328678 container kill e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:10:16 np0005486759.ooo.test systemd[1]: libpod-e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f.scope: Deactivated successfully.
Oct 14 10:10:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:16.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:16 np0005486759.ooo.test podman[328118]: 2025-10-14 10:10:16.69385816 +0000 UTC m=+0.055996267 container died e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5)
Oct 14 10:10:16 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:16Z|00318|binding|INFO|Releasing lport 25844137-067c-4137-b11d-9fc6e75f59fd from this chassis (sb_readonly=0)
Oct 14 10:10:16 np0005486759.ooo.test podman[328118]: 2025-10-14 10:10:16.774328654 +0000 UTC m=+0.136466711 container cleanup e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct 14 10:10:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:16.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:16 np0005486759.ooo.test systemd[1]: libpod-conmon-e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f.scope: Deactivated successfully.
Oct 14 10:10:16 np0005486759.ooo.test podman[328125]: 2025-10-14 10:10:16.829303278 +0000 UTC m=+0.179457508 container remove e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc453d01-df4c-4777-acc1-66f266ebf132, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:10:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:16.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:16 np0005486759.ooo.test kernel: device tap9e8f4467-03 left promiscuous mode
Oct 14 10:10:16 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:16.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:16 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:16.872 287366 INFO neutron.agent.dhcp.agent [None req-0d361cf6-2082-4908-9407-8c5e25035598 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:10:16 np0005486759.ooo.test neutron_dhcp_agent[287362]: 2025-10-14 10:10:16.873 287366 INFO neutron.agent.dhcp.agent [None req-0d361cf6-2082-4908-9407-8c5e25035598 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Oct 14 10:10:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay-86298c6fdab46efefa29b4830b035cce8c58c0a2d119905aa12d3adb7e3375bb-merged.mount: Deactivated successfully.
Oct 14 10:10:17 np0005486759.ooo.test systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e786b6a2e75bef9849db34cfe0030caa0b48f8c131d64d3221cad6a5aa2ace5f-userdata-shm.mount: Deactivated successfully.
Oct 14 10:10:17 np0005486759.ooo.test systemd[1]: run-netns-qdhcp\x2dfc453d01\x2ddf4c\x2d4777\x2dacc1\x2d66f266ebf132.mount: Deactivated successfully.
Oct 14 10:10:19 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:19.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2055 DF PROTO=TCP SPT=43208 DPT=9102 SEQ=61297273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA9C6E80000000001030307) 
Oct 14 10:10:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:20.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:20.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 14 10:10:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:20.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2056 DF PROTO=TCP SPT=43208 DPT=9102 SEQ=61297273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA9CB010000000001030307) 
Oct 14 10:10:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:21.206 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:21.206 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:21.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2057 DF PROTO=TCP SPT=43208 DPT=9102 SEQ=61297273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA9D3020000000001030307) 
Oct 14 10:10:23 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:23.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:10:23 np0005486759.ooo.test podman[328156]: 2025-10-14 10:10:23.459946278 +0000 UTC m=+0.075010399 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 14 10:10:23 np0005486759.ooo.test podman[328156]: 2025-10-14 10:10:23.471265885 +0000 UTC m=+0.086329986 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: tmp-crun.oBw0Yp.mount: Deactivated successfully.
Oct 14 10:10:23 np0005486759.ooo.test podman[328148]: 2025-10-14 10:10:23.506831075 +0000 UTC m=+0.133882013 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:10:23 np0005486759.ooo.test podman[328148]: 2025-10-14 10:10:23.51647084 +0000 UTC m=+0.143521828 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:10:23 np0005486759.ooo.test podman[328149]: 2025-10-14 10:10:23.560749406 +0000 UTC m=+0.183925195 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 14 10:10:23 np0005486759.ooo.test podman[328149]: 2025-10-14 10:10:23.567896825 +0000 UTC m=+0.191072624 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:10:23 np0005486759.ooo.test podman[328150]: 2025-10-14 10:10:23.616448353 +0000 UTC m=+0.236616840 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:10:23 np0005486759.ooo.test podman[328150]: 2025-10-14 10:10:23.654349743 +0000 UTC m=+0.274518230 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2)
Oct 14 10:10:23 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.208 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.208 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.208 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.209 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.282 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.355 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.357 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.409 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.411 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.453 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'name': 'test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'np0005486759.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'hostId': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.456 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes volume: 8721 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f23f581-cfb3-4da8-abed-0ef7a15700fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8721, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.454082', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '0002291c-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': 'ae54049236ef09d38d49115b0d4e0d56f5a05d8e7984ee90b686536bf0921d14'}]}, 'timestamp': '2025-10-14 10:10:24.457345', '_unique_id': '41277138a36749ebb11d818f7b27a3d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.458 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.459 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7ad35e6-c542-472f-a904-1146d2877e31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.459620', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '00029096-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': 'ae88ade057cdab3a1c4c376ac74ca52be22721b214bb3b756e4a754501eeb7e3'}]}, 'timestamp': '2025-10-14 10:10:24.459976', '_unique_id': 'b5e5f8ee366b4fa0aa5c8e25799f4d28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.460 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.461 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.461 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets volume: 117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8a4cc3e-d523-404b-b34a-64198b29766a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 117, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.461536', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '0002db3c-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': 'eeda74e7b50200e7254738fbb6a985c6cc97cce296c1eab6d5831ab36b3e77ce'}]}, 'timestamp': '2025-10-14 10:10:24.461865', '_unique_id': 'a3703f5dee88409aad69b6a1904c766a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.462 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.463 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.463 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76ea8f8e-6b20-4579-af03-49ec7da2b76c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.463404', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '00032416-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': '677ed7df7a66124d9e55a2c1a27a03fcfe735c2b6f1c8783dde9c815d5ab69c2'}]}, 'timestamp': '2025-10-14 10:10:24.463730', '_unique_id': '33b53a43cd454df9a1db405bdc5fec31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.464 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.465 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.483 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/cpu volume: 14350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff4b3e3f-9e50-47fe-8687-c7c9b0c5b47b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14350000000, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:10:24.465193', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '000636a6-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.678424863, 'message_signature': '134aae0324b2fbdc80657dd187212139ee732bf21b6b472f9379e8d2d303348b'}]}, 'timestamp': '2025-10-14 10:10:24.483887', '_unique_id': 'c0ecee985e7e41c99bf9f3c10bf02f37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.484 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.485 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.487 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.489 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.501 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 31326208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.502 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '926ccdd8-1d74-40a9-a87b-657a238e5f7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31326208, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.485635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0008fc24-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.680701283, 'message_signature': 'b26c1c4380fd23ec5d42dee05b480609e040d7c91930dd91f200ba84780dcd00'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 
'timestamp': '2025-10-14T10:10:24.485635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00090bec-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.680701283, 'message_signature': '50457fc87030e030a30ef3812dd1fc093f7882240d2651e9051ab00b1d3df6fe'}]}, 'timestamp': '2025-10-14 10:10:24.502450', '_unique_id': '506044a662734508b2942d623c87cc12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.503 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.504 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '224c06a3-1d8b-42d4-96b4-8f484bc4dd4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.504651', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '00096f7e-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': 'd6edd303c413d0ed1afbe992f8ce6b5013739220c0d91fdf3017849e8f62056a'}]}, 'timestamp': '2025-10-14 10:10:24.505003', '_unique_id': '2f75a0d18b8a4a7daad841c574c566fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.505 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.506 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.531 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.531 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '582f8bb4-c63c-4e25-819a-d44821e8b357', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.506766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '000d86b8-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '8aae813edc9fba320b689dc267c27c559616f8a49b9b45ba700041e909fc7c7d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.506766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '000d934c-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '55b03c0b5b56b388a4900093f6ef9912faaf52cce37ea6be85378db62c95fa8b'}]}, 'timestamp': '2025-10-14 10:10:24.532117', '_unique_id': 'c51045f21d67482691768c9daee5c2d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.533 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cf99cf5-6f18-4cf2-a963-e351431e727b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.533910', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '000de838-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': '11a03fc88ecd7bb9cef318f69f939d5ad80eae2210652b7badaae4b2edadb38b'}]}, 'timestamp': '2025-10-14 10:10:24.534285', '_unique_id': '9189f7f3e2c541c8b0552dbf34b0f9cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.534 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.535 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.535 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/memory.usage volume: 51.8203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f4b7df2-9ed9-4560-bd7d-0ba7f2acd5a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.8203125, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'timestamp': '2025-10-14T10:10:24.535712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '000e2c1c-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.678424863, 'message_signature': 'd6a82bfc5a40fa7fa171119d88a1de9ab38cca83a15beff6552624f3d9991a25'}]}, 'timestamp': '2025-10-14 10:10:24.536029', '_unique_id': '01de67902ad84527bc04c9e8e4038de9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.536 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.537 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.537 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8951ccb-a6a1-4031-be57-ec62018989e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.537436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '000e6f60-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '878649592645d3c305fc07216dfb087c8a24be06982965efa4ff03f81b1d7751'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.537436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '000e7a14-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '910b0c85389f3811638d16199d80ba45924372ddd13ab368b35e3908065f3fbe'}]}, 'timestamp': '2025-10-14 10:10:24.538019', '_unique_id': 'd6066e4db119469b944e50a2db8cacd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.538 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.539 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.bytes volume: 10064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '418d269d-f75c-4ee1-9866-a0b405b937dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.539465', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '000ebed4-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': '3837962af7f8f0f1a10101d7cc36502339d8192d6143d9d8e5057dcc75e9338f'}]}, 'timestamp': '2025-10-14 10:10:24.539774', '_unique_id': 'ecea36f79be94ad3a82f85dbd775ba62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.540 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.541 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.541 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef2ed36c-f6a5-473d-937e-62af3cfa0cad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.541176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '000f016e-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.680701283, 'message_signature': '9187173ed5aa298dea9a3c9a3a07a490d2ba03701cf0714924e82d26d14a8b9f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.541176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '000f0c2c-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.680701283, 'message_signature': '3848b02ef70200d10e3c77c4e28aeac522a415a9238317d06180f60a5702c1e9'}]}, 'timestamp': '2025-10-14 10:10:24.541736', '_unique_id': '8899f642ddd64097ac7bc1fd78559b45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.542 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.543 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.543 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.543 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fec8bc8-47a8-4196-865c-abffceae2b5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.543192', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '000f5024-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.680701283, 'message_signature': '7b10c27522fd8f923be41e21b9750ba8ea365b2a86d7d4cf0fc713d6ad9e24dc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.543192', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '000f5ac4-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.680701283, 'message_signature': 'edbe603ca95c7ea1c015552a303d1498908da7693b63de384f8113a66f3c12fe'}]}, 'timestamp': '2025-10-14 10:10:24.543745', '_unique_id': 'a25c0669eb2f49188a6fb920f5b5050e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.544 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.545 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.545 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4096db9-e138-4370-8062-513d622ffe40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.545346', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '000fa484-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': '717b41863bd54b0575d6ba7e0dbe9d2a28e2953f6c1622c834bed8a1156c665d'}]}, 'timestamp': '2025-10-14 10:10:24.545654', '_unique_id': '240a0ae2ad114a1fb509a9ad0b09f147'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.546 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.547 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.547 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c18f9bf-dfbd-4afa-b557-ae9196035be6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.547171', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '000febe2-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': '55ab94b3ee11da8ddd40492ec447c7d6ef82a9cb5cebb573c39503e8cd39439b'}]}, 'timestamp': '2025-10-14 10:10:24.547482', '_unique_id': '623ef4f26b96464da24668df055812f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.548 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.549 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 739626512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.549 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.read.latency volume: 60612298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca9bf1d8-6d63-4dfa-b566-db1a242a5bac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739626512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.548994', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0010330e-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '8e80cc3f1b2ef7ec991501a2b19ee218d61669acb641852533390538da9f19ca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60612298, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.548994', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00103da4-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': 'fd816a4bf958ce8a8515ce3ef01bb4cec1585a271cf5c7c2921ada9e0094a145'}]}, 'timestamp': '2025-10-14 10:10:24.549556', '_unique_id': '881e2f2f09cc4d25a447e5bdaab12938'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.550 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.551 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.551 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.551 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bce21e53-7bac-47ce-bc33-940087ae7aee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.551139', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '001087be-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '90d44f17c7e30add985f29610909dbac11800700f4d3c45e0f107466224e13a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.551139', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00109740-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '9ca9f1f780b22bc54dc16781a2cddd6247b807daf17c349ac83be89f9199e061'}]}, 'timestamp': '2025-10-14 10:10:24.551888', '_unique_id': 'f29ddf22b02f4256813198c87648225f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.553 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.552 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.553 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.553 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.554 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f3e78ca-4a96-4a4e-8a1c-d8d9514a154f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 'instance-00000001-4408214d-dae5-4452-92e9-eb4abd6589d4-tapeee08de8-f9', 'timestamp': '2025-10-14T10:10:24.553971', 'resource_metadata': {'display_name': 'test', 'name': 'tapeee08de8-f9', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8e:cf:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeee08de8-f9'}, 'message_id': '001109fa-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.649145357, 'message_signature': '9afbcb5687c386f859dd064497a3523b3371c01eb4075599e605475034609afc'}]}, 'timestamp': '2025-10-14 10:10:24.554977', '_unique_id': '2057d8ea8bae46efbce3cff3af25f73a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.556 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.557 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.558 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0222841c-8295-4548-bba7-6b2cb310ee32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.557582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00118290-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '6ac1688950eb6766d893723d51f5a03f097dded0fc63c48efe3ac1c42a9bac42'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.557582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00119b4a-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': '9139e5f5289172389ed5b4a4d50b7f5db7349959ae9f235b48acc594cfe8241d'}]}, 'timestamp': '2025-10-14 10:10:24.558642', '_unique_id': '0664544f43dd4a4fb72411522cab6577'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.560 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 67767064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.560 12 DEBUG ceilometer.compute.pollsters [-] 4408214d-dae5-4452-92e9-eb4abd6589d4/disk.device.write.latency volume: 492064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fa2ca7b-c146-4289-9d7c-a768675be3fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67767064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': '4408214d-dae5-4452-92e9-eb4abd6589d4-vda', 'timestamp': '2025-10-14T10:10:24.560083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0011e3fc-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': 'a17e0eebde9fc7c2b54f43f44b29822ad47861254934615306a42c2ffcd5325c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492064, 'user_id': '2aff2e6f927a42b1b822d05cd9349762', 'user_name': None, 'project_id': '8bf64e81a4214f9490d231a2e79ab3d8', 'project_name': None, 'resource_id': 
'4408214d-dae5-4452-92e9-eb4abd6589d4-vdb', 'timestamp': '2025-10-14T10:10:24.560083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000001', 'instance_id': '4408214d-dae5-4452-92e9-eb4abd6589d4', 'instance_type': 'm1.small', 'host': '9a950798d3ac51e7a8fd65b275d18c4d519dceff59a4eab6247586d3', 'instance_host': 'np0005486759.ooo.test', 'flavor': {'id': '0bdb7446-7a7f-4e51-8a88-180de2e09857', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'd8afae20-8860-4649-9226-11ff3fdf8072'}, 'image_ref': 'd8afae20-8860-4649-9226-11ff3fdf8072', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0011ee4c-a8e6-11f0-b515-fa163eba5220', 'monotonic_time': 12644.701844391, 'message_signature': 'a82feaae383e765c8b2cb5711fdb1f45dfda058d7ae2675263386d3abe9bd506'}]}, 'timestamp': '2025-10-14 10:10:24.560624', '_unique_id': '0e8a5dc2a3e24a7f85f584490de5f634'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     yield
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 14 10:10:24 np0005486759.ooo.test ceilometer_agent_compute[262818]: 2025-10-14 10:10:24.561 12 ERROR oslo_messaging.notify.messaging 
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.740 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.742 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12261MB free_disk=386.6771011352539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.742 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:10:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:24.743 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.058 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.059 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.059 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.325 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing inventories for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.598 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Updating ProviderTree inventory for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.598 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Updating inventory in ProviderTree for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.616 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing aggregate associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.640 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Refreshing trait associations for resource provider 2da4b4c2-8401-4cdb-85a2-115635137a6d, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.695 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.799 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.802 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:10:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:25.802 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:10:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2058 DF PROTO=TCP SPT=43208 DPT=9102 SEQ=61297273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FA9E2C10000000001030307) 
Oct 14 10:10:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:26.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:28.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:28.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:10:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:28.190 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:10:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:28.694 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:10:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:28.695 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:10:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:28.695 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:10:28 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:28.696 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:10:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:29.110 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:10:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:29.124 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:10:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:29.125 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:10:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:29.126 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:29.126 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:29.127 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:10:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:29.127 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:30.199 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:10:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:30.200 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 14 10:10:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:30.220 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 14 10:10:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:30.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:10:31 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:10:31 np0005486759.ooo.test podman[328241]: 2025-10-14 10:10:31.45815156 +0000 UTC m=+0.083653983 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Oct 14 10:10:31 np0005486759.ooo.test podman[328241]: 2025-10-14 10:10:31.498352522 +0000 UTC m=+0.123854965 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Oct 14 10:10:31 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:10:31 np0005486759.ooo.test podman[328240]: 2025-10-14 10:10:31.504605114 +0000 UTC m=+0.134493861 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:10:31 np0005486759.ooo.test podman[328240]: 2025-10-14 10:10:31.612606462 +0000 UTC m=+0.242495259 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true)
Oct 14 10:10:31 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:10:31 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:31.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:35.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:36 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:36.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:40 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:10:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:40.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:40 np0005486759.ooo.test podman[328283]: 2025-10-14 10:10:40.499434176 +0000 UTC m=+0.070614944 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 14 10:10:40 np0005486759.ooo.test podman[328283]: 2025-10-14 10:10:40.532359954 +0000 UTC m=+0.103540672 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 10:10:40 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:10:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:10:41 np0005486759.ooo.test podman[328301]: 2025-10-14 10:10:41.448585802 +0000 UTC m=+0.077958260 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:10:41 np0005486759.ooo.test podman[328301]: 2025-10-14 10:10:41.457649299 +0000 UTC m=+0.087021807 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:10:41 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:10:41 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:41.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:10:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:10:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:10:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:10:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:10:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16710 "" "Go-http-client/1.1"
Oct 14 10:10:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:10:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:10:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:10:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:10:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:10:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:10:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:10:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:10:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:45.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:46 np0005486759.ooo.test ovn_controller[177766]: 2025-10-14T10:10:46Z|00319|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 14 10:10:46 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:46.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:48 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=42315 DF PROTO=TCP SPT=60166 DPT=19885 SEQ=1428646112 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A0121A1EB0000000001030307) 
Oct 14 10:10:48 np0005486759.ooo.test sshd[328324]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:10:48 np0005486759.ooo.test sshd[328324]: Accepted publickey for zuul from 38.102.83.114 port 58928 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:10:48 np0005486759.ooo.test systemd-logind[759]: New session 46 of user zuul.
Oct 14 10:10:48 np0005486759.ooo.test systemd[1]: Started Session 46 of User zuul.
Oct 14 10:10:48 np0005486759.ooo.test sshd[328324]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:10:48 np0005486759.ooo.test sudo[328344]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqwxrphjrftyjwfwpxsmtebzigwnguyj ; /usr/bin/python3
Oct 14 10:10:48 np0005486759.ooo.test sudo[328344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:10:49 np0005486759.ooo.test python3[328346]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=fa163ef9-e89a-49ba-6b5f-00000000000a-1-cell1compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 14 10:10:49 np0005486759.ooo.test sudo[328344]: pam_unix(sudo:session): session closed for user root
Oct 14 10:10:49 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=42316 DF PROTO=TCP SPT=60166 DPT=19885 SEQ=1428646112 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A0121A6030000000001030307) 
Oct 14 10:10:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64004 DF PROTO=TCP SPT=40938 DPT=9102 SEQ=2460947593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAA3C180000000001030307) 
Oct 14 10:10:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:50.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64005 DF PROTO=TCP SPT=40938 DPT=9102 SEQ=2460947593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAA40010000000001030307) 
Oct 14 10:10:51 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=42317 DF PROTO=TCP SPT=60166 DPT=19885 SEQ=1428646112 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A0121AE030000000001030307) 
Oct 14 10:10:51 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:51.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64006 DF PROTO=TCP SPT=40938 DPT=9102 SEQ=2460947593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAA48010000000001030307) 
Oct 14 10:10:53 np0005486759.ooo.test sshd[328324]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: session-46.scope: Deactivated successfully.
Oct 14 10:10:53 np0005486759.ooo.test systemd-logind[759]: Session 46 logged out. Waiting for processes to exit.
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:10:53 np0005486759.ooo.test systemd-logind[759]: Removed session 46.
Oct 14 10:10:53 np0005486759.ooo.test podman[328350]: 2025-10-14 10:10:53.877732778 +0000 UTC m=+0.097999657 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, config_id=iscsid, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:10:53 np0005486759.ooo.test podman[328349]: 2025-10-14 10:10:53.923531899 +0000 UTC m=+0.144798569 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:10:53 np0005486759.ooo.test podman[328352]: 2025-10-14 10:10:53.934599957 +0000 UTC m=+0.144380395 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:10:53 np0005486759.ooo.test podman[328350]: 2025-10-14 10:10:53.941126847 +0000 UTC m=+0.161393756 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251009, config_id=iscsid, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 14 10:10:53 np0005486759.ooo.test podman[328352]: 2025-10-14 10:10:53.94938141 +0000 UTC m=+0.159161888 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251009, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:10:53 np0005486759.ooo.test podman[328349]: 2025-10-14 10:10:53.960396886 +0000 UTC m=+0.181663526 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:10:53 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:10:54 np0005486759.ooo.test podman[328351]: 2025-10-14 10:10:54.032055647 +0000 UTC m=+0.245223980 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 14 10:10:54 np0005486759.ooo.test podman[328351]: 2025-10-14 10:10:54.05142331 +0000 UTC m=+0.264591673 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Oct 14 10:10:54 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:10:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:54.176 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:10:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:54.176 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:10:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:10:54.177 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:10:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:55.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:10:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64007 DF PROTO=TCP SPT=40938 DPT=9102 SEQ=2460947593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAA57C20000000001030307) 
Oct 14 10:10:56 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:10:56.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:00 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:00.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:01 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:01.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:11:02 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:11:02 np0005486759.ooo.test podman[328430]: 2025-10-14 10:11:02.094169409 +0000 UTC m=+0.079824242 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 14 10:11:02 np0005486759.ooo.test podman[328430]: 2025-10-14 10:11:02.111330924 +0000 UTC m=+0.096985797 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 14 10:11:02 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:11:02 np0005486759.ooo.test podman[328429]: 2025-10-14 10:11:02.203138152 +0000 UTC m=+0.188966270 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 14 10:11:02 np0005486759.ooo.test podman[328429]: 2025-10-14 10:11:02.242497785 +0000 UTC m=+0.228325863 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 14 10:11:02 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:11:05 np0005486759.ooo.test sshd[328476]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:05 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:05.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:05 np0005486759.ooo.test sshd[328476]: Accepted publickey for zuul from 38.102.83.114 port 52028 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:05 np0005486759.ooo.test systemd-logind[759]: New session 47 of user zuul.
Oct 14 10:11:05 np0005486759.ooo.test systemd[1]: Started Session 47 of User zuul.
Oct 14 10:11:05 np0005486759.ooo.test sshd[328476]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:05 np0005486759.ooo.test sudo[328480]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Oct 14 10:11:05 np0005486759.ooo.test sudo[328480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:07 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:07.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:07 np0005486759.ooo.test sudo[328480]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:07 np0005486759.ooo.test sshd[328479]: Received disconnect from 38.102.83.114 port 52028:11: disconnected by user
Oct 14 10:11:07 np0005486759.ooo.test sshd[328479]: Disconnected from user zuul 38.102.83.114 port 52028
Oct 14 10:11:07 np0005486759.ooo.test sshd[328476]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:07 np0005486759.ooo.test systemd[1]: session-47.scope: Deactivated successfully.
Oct 14 10:11:07 np0005486759.ooo.test systemd[1]: session-47.scope: Consumed 2.038s CPU time.
Oct 14 10:11:07 np0005486759.ooo.test systemd-logind[759]: Session 47 logged out. Waiting for processes to exit.
Oct 14 10:11:07 np0005486759.ooo.test systemd-logind[759]: Removed session 47.
Oct 14 10:11:07 np0005486759.ooo.test sshd[328498]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:08 np0005486759.ooo.test sshd[328498]: Accepted publickey for zuul from 38.102.83.114 port 52040 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:08 np0005486759.ooo.test systemd-logind[759]: New session 48 of user zuul.
Oct 14 10:11:08 np0005486759.ooo.test systemd[1]: Started Session 48 of User zuul.
Oct 14 10:11:08 np0005486759.ooo.test sshd[328498]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:08 np0005486759.ooo.test sudo[328502]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Oct 14 10:11:08 np0005486759.ooo.test sudo[328502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:08 np0005486759.ooo.test sudo[328502]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:08 np0005486759.ooo.test sshd[328501]: Received disconnect from 38.102.83.114 port 52040:11: disconnected by user
Oct 14 10:11:08 np0005486759.ooo.test sshd[328501]: Disconnected from user zuul 38.102.83.114 port 52040
Oct 14 10:11:08 np0005486759.ooo.test sshd[328498]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:08 np0005486759.ooo.test systemd[1]: session-48.scope: Deactivated successfully.
Oct 14 10:11:08 np0005486759.ooo.test systemd-logind[759]: Session 48 logged out. Waiting for processes to exit.
Oct 14 10:11:08 np0005486759.ooo.test systemd-logind[759]: Removed session 48.
Oct 14 10:11:08 np0005486759.ooo.test sshd[328520]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:08 np0005486759.ooo.test sshd[328520]: Accepted publickey for zuul from 38.102.83.114 port 52044 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:08 np0005486759.ooo.test systemd-logind[759]: New session 49 of user zuul.
Oct 14 10:11:08 np0005486759.ooo.test systemd[1]: Started Session 49 of User zuul.
Oct 14 10:11:08 np0005486759.ooo.test sshd[328520]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:08 np0005486759.ooo.test sudo[328524]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Oct 14 10:11:08 np0005486759.ooo.test sudo[328524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:08 np0005486759.ooo.test sudo[328524]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:08 np0005486759.ooo.test sshd[328523]: Received disconnect from 38.102.83.114 port 52044:11: disconnected by user
Oct 14 10:11:08 np0005486759.ooo.test sshd[328523]: Disconnected from user zuul 38.102.83.114 port 52044
Oct 14 10:11:08 np0005486759.ooo.test sshd[328520]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:08 np0005486759.ooo.test systemd[1]: session-49.scope: Deactivated successfully.
Oct 14 10:11:08 np0005486759.ooo.test systemd-logind[759]: Session 49 logged out. Waiting for processes to exit.
Oct 14 10:11:08 np0005486759.ooo.test systemd-logind[759]: Removed session 49.
Oct 14 10:11:09 np0005486759.ooo.test sshd[328543]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:09 np0005486759.ooo.test sshd[328543]: Accepted publickey for zuul from 38.102.83.114 port 52052 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:09 np0005486759.ooo.test systemd-logind[759]: New session 50 of user zuul.
Oct 14 10:11:09 np0005486759.ooo.test systemd[1]: Started Session 50 of User zuul.
Oct 14 10:11:09 np0005486759.ooo.test sshd[328543]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:09 np0005486759.ooo.test sudo[328547]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Oct 14 10:11:09 np0005486759.ooo.test sudo[328547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:09 np0005486759.ooo.test sudo[328547]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:09 np0005486759.ooo.test sshd[328546]: Received disconnect from 38.102.83.114 port 52052:11: disconnected by user
Oct 14 10:11:09 np0005486759.ooo.test sshd[328546]: Disconnected from user zuul 38.102.83.114 port 52052
Oct 14 10:11:09 np0005486759.ooo.test sshd[328543]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:09 np0005486759.ooo.test systemd[1]: session-50.scope: Deactivated successfully.
Oct 14 10:11:09 np0005486759.ooo.test systemd-logind[759]: Session 50 logged out. Waiting for processes to exit.
Oct 14 10:11:09 np0005486759.ooo.test systemd-logind[759]: Removed session 50.
Oct 14 10:11:09 np0005486759.ooo.test sshd[328565]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:09 np0005486759.ooo.test sshd[328565]: Accepted publickey for zuul from 38.102.83.114 port 52058 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:09 np0005486759.ooo.test systemd-logind[759]: New session 51 of user zuul.
Oct 14 10:11:09 np0005486759.ooo.test systemd[1]: Started Session 51 of User zuul.
Oct 14 10:11:09 np0005486759.ooo.test sshd[328565]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:09 np0005486759.ooo.test sudo[328569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Oct 14 10:11:09 np0005486759.ooo.test sudo[328569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:09 np0005486759.ooo.test sudo[328569]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:09 np0005486759.ooo.test sshd[328568]: Received disconnect from 38.102.83.114 port 52058:11: disconnected by user
Oct 14 10:11:09 np0005486759.ooo.test sshd[328568]: Disconnected from user zuul 38.102.83.114 port 52058
Oct 14 10:11:09 np0005486759.ooo.test sshd[328565]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:09 np0005486759.ooo.test systemd[1]: session-51.scope: Deactivated successfully.
Oct 14 10:11:09 np0005486759.ooo.test systemd-logind[759]: Session 51 logged out. Waiting for processes to exit.
Oct 14 10:11:09 np0005486759.ooo.test systemd-logind[759]: Removed session 51.
Oct 14 10:11:10 np0005486759.ooo.test sshd[328587]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:10 np0005486759.ooo.test sshd[328587]: Accepted publickey for zuul from 38.102.83.114 port 52068 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:10 np0005486759.ooo.test systemd-logind[759]: New session 52 of user zuul.
Oct 14 10:11:10 np0005486759.ooo.test systemd[1]: Started Session 52 of User zuul.
Oct 14 10:11:10 np0005486759.ooo.test sshd[328587]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:10 np0005486759.ooo.test sudo[328591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Oct 14 10:11:10 np0005486759.ooo.test sudo[328591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:10 np0005486759.ooo.test sudo[328591]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:10 np0005486759.ooo.test sshd[328590]: Received disconnect from 38.102.83.114 port 52068:11: disconnected by user
Oct 14 10:11:10 np0005486759.ooo.test sshd[328590]: Disconnected from user zuul 38.102.83.114 port 52068
Oct 14 10:11:10 np0005486759.ooo.test sshd[328587]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:10 np0005486759.ooo.test systemd[1]: session-52.scope: Deactivated successfully.
Oct 14 10:11:10 np0005486759.ooo.test systemd-logind[759]: Session 52 logged out. Waiting for processes to exit.
Oct 14 10:11:10 np0005486759.ooo.test systemd-logind[759]: Removed session 52.
Oct 14 10:11:10 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:10.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:10 np0005486759.ooo.test sshd[328609]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:10 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:11:10 np0005486759.ooo.test sshd[328609]: Accepted publickey for zuul from 38.102.83.114 port 52072 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:10 np0005486759.ooo.test systemd-logind[759]: New session 53 of user zuul.
Oct 14 10:11:10 np0005486759.ooo.test systemd[1]: Started Session 53 of User zuul.
Oct 14 10:11:10 np0005486759.ooo.test sshd[328609]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:10 np0005486759.ooo.test systemd[1]: tmp-crun.iBwqr9.mount: Deactivated successfully.
Oct 14 10:11:10 np0005486759.ooo.test podman[328611]: 2025-10-14 10:11:10.970774097 +0000 UTC m=+0.080095531 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:11:11 np0005486759.ooo.test podman[328611]: 2025-10-14 10:11:11.004410415 +0000 UTC m=+0.113731839 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Oct 14 10:11:11 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:11:11 np0005486759.ooo.test sudo[328632]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Oct 14 10:11:11 np0005486759.ooo.test sudo[328632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:11 np0005486759.ooo.test sudo[328632]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:11 np0005486759.ooo.test sshd[328624]: Received disconnect from 38.102.83.114 port 52072:11: disconnected by user
Oct 14 10:11:11 np0005486759.ooo.test sshd[328624]: Disconnected from user zuul 38.102.83.114 port 52072
Oct 14 10:11:11 np0005486759.ooo.test sshd[328609]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:11 np0005486759.ooo.test systemd[1]: session-53.scope: Deactivated successfully.
Oct 14 10:11:11 np0005486759.ooo.test systemd-logind[759]: Session 53 logged out. Waiting for processes to exit.
Oct 14 10:11:11 np0005486759.ooo.test systemd-logind[759]: Removed session 53.
Oct 14 10:11:11 np0005486759.ooo.test sshd[328650]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:11 np0005486759.ooo.test sshd[328650]: Accepted publickey for zuul from 38.102.83.114 port 52088 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:11 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:11:11 np0005486759.ooo.test systemd-logind[759]: New session 54 of user zuul.
Oct 14 10:11:11 np0005486759.ooo.test systemd[1]: Started Session 54 of User zuul.
Oct 14 10:11:11 np0005486759.ooo.test sshd[328650]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:11 np0005486759.ooo.test podman[328653]: 2025-10-14 10:11:11.568555676 +0000 UTC m=+0.052218118 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 14 10:11:11 np0005486759.ooo.test sudo[328672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Oct 14 10:11:11 np0005486759.ooo.test sudo[328672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:11 np0005486759.ooo.test podman[328653]: 2025-10-14 10:11:11.603273827 +0000 UTC m=+0.086936289 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:11:11 np0005486759.ooo.test sudo[328672]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:11 np0005486759.ooo.test sshd[328659]: Received disconnect from 38.102.83.114 port 52088:11: disconnected by user
Oct 14 10:11:11 np0005486759.ooo.test sshd[328659]: Disconnected from user zuul 38.102.83.114 port 52088
Oct 14 10:11:11 np0005486759.ooo.test sshd[328650]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:11 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:11:11 np0005486759.ooo.test systemd[1]: session-54.scope: Deactivated successfully.
Oct 14 10:11:11 np0005486759.ooo.test systemd-logind[759]: Session 54 logged out. Waiting for processes to exit.
Oct 14 10:11:11 np0005486759.ooo.test systemd-logind[759]: Removed session 54.
Oct 14 10:11:11 np0005486759.ooo.test sshd[328698]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:12 np0005486759.ooo.test sshd[328698]: Accepted publickey for zuul from 38.102.83.114 port 52102 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:12 np0005486759.ooo.test systemd-logind[759]: New session 55 of user zuul.
Oct 14 10:11:12 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:12.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:12 np0005486759.ooo.test systemd[1]: Started Session 55 of User zuul.
Oct 14 10:11:12 np0005486759.ooo.test sshd[328698]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:12 np0005486759.ooo.test sudo[328702]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Oct 14 10:11:12 np0005486759.ooo.test sudo[328702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:12 np0005486759.ooo.test sudo[328702]: pam_unix(sudo:session): session closed for user root
Oct 14 10:11:12 np0005486759.ooo.test sshd[328701]: Received disconnect from 38.102.83.114 port 52102:11: disconnected by user
Oct 14 10:11:12 np0005486759.ooo.test sshd[328701]: Disconnected from user zuul 38.102.83.114 port 52102
Oct 14 10:11:12 np0005486759.ooo.test sshd[328698]: pam_unix(sshd:session): session closed for user zuul
Oct 14 10:11:12 np0005486759.ooo.test systemd[1]: session-55.scope: Deactivated successfully.
Oct 14 10:11:12 np0005486759.ooo.test systemd-logind[759]: Session 55 logged out. Waiting for processes to exit.
Oct 14 10:11:12 np0005486759.ooo.test systemd-logind[759]: Removed session 55.
Oct 14 10:11:12 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:11:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:11:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:11:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:11:12 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:11:12 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16706 "" "Go-http-client/1.1"
Oct 14 10:11:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:11:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:14 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:11:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:14 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:11:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:14 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:11:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:11:14 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:14 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:11:14 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:11:15 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:15.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:17 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:17.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:19 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51029 DF PROTO=TCP SPT=49838 DPT=9102 SEQ=3301020688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAAB1480000000001030307) 
Oct 14 10:11:20 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51030 DF PROTO=TCP SPT=49838 DPT=9102 SEQ=3301020688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAAB5420000000001030307) 
Oct 14 10:11:20 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=18917 DF PROTO=TCP SPT=33016 DPT=19885 SEQ=2934267484 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A01221FD30000000001030307) 
Oct 14 10:11:20 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:20.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:21 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:21.210 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:21 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=18918 DF PROTO=TCP SPT=33016 DPT=19885 SEQ=2934267484 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A012223C30000000001030307) 
Oct 14 10:11:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:22.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:22.185 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:22 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:22.189 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:22 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51031 DF PROTO=TCP SPT=49838 DPT=9102 SEQ=3301020688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAABD410000000001030307) 
Oct 14 10:11:23 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=18919 DF PROTO=TCP SPT=33016 DPT=19885 SEQ=2934267484 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A01222BC30000000001030307) 
Oct 14 10:11:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:24.186 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:24 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:24.206 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:11:24 np0005486759.ooo.test podman[328722]: 2025-10-14 10:11:24.495615401 +0000 UTC m=+0.112341386 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251009, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:11:24 np0005486759.ooo.test podman[328722]: 2025-10-14 10:11:24.501515072 +0000 UTC m=+0.118241047 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: tmp-crun.FlHe7b.mount: Deactivated successfully.
Oct 14 10:11:24 np0005486759.ooo.test podman[328723]: 2025-10-14 10:11:24.553293495 +0000 UTC m=+0.165954275 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 14 10:11:24 np0005486759.ooo.test podman[328723]: 2025-10-14 10:11:24.590219424 +0000 UTC m=+0.202880164 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.build-date=20251009, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Oct 14 10:11:24 np0005486759.ooo.test podman[328720]: 2025-10-14 10:11:24.602775939 +0000 UTC m=+0.223413524 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:11:24 np0005486759.ooo.test podman[328720]: 2025-10-14 10:11:24.611135764 +0000 UTC m=+0.231773329 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:11:24 np0005486759.ooo.test podman[328721]: 2025-10-14 10:11:24.664977801 +0000 UTC m=+0.285964196 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, container_name=iscsid)
Oct 14 10:11:24 np0005486759.ooo.test podman[328721]: 2025-10-14 10:11:24.677256896 +0000 UTC m=+0.298243221 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 14 10:11:24 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:11:25 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:25.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.190 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.220 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.220 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.221 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.222 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Auditing locally available compute resources for np0005486759.ooo.test (node: np0005486759.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.300 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.374 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.376 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.452 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.453 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.507 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.509 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.581 2 DEBUG oslo_concurrency.processutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4408214d-dae5-4452-92e9-eb4abd6589d4/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 14 10:11:26 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51032 DF PROTO=TCP SPT=49838 DPT=9102 SEQ=3301020688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAACD010000000001030307) 
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.794 2 WARNING nova.virt.libvirt.driver [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.796 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Hypervisor/Node resource view: name=np0005486759.ooo.test free_ram=12267MB free_disk=386.6765823364258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.797 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.797 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.877 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Instance 4408214d-dae5-4452-92e9-eb4abd6589d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.878 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.878 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Final resource view: name=np0005486759.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.936 2 DEBUG nova.compute.provider_tree [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed in ProviderTree for provider: 2da4b4c2-8401-4cdb-85a2-115635137a6d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.955 2 DEBUG nova.scheduler.client.report [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Inventory has not changed for provider 2da4b4c2-8401-4cdb-85a2-115635137a6d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.958 2 DEBUG nova.compute.resource_tracker [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Compute_service record updated for np0005486759.ooo.test:np0005486759.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 14 10:11:26 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:26.958 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:11:27 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:27.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:27 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=4287 DF PROTO=TCP SPT=33166 DPT=19885 SEQ=3175566695 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A01223AC50000000001030307) 
Oct 14 10:11:28 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=4288 DF PROTO=TCP SPT=33166 DPT=19885 SEQ=3175566695 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A01223EC30000000001030307) 
Oct 14 10:11:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:29.959 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:29.959 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 14 10:11:29 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:29.960 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 14 10:11:30 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=4289 DF PROTO=TCP SPT=33166 DPT=19885 SEQ=3175566695 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A012246C30000000001030307) 
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.597 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquiring lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.597 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Acquired lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.597 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.597 2 DEBUG nova.objects.instance [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4408214d-dae5-4452-92e9-eb4abd6589d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.974 2 DEBUG nova.network.neutron [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updating instance_info_cache with network_info: [{"id": "eee08de8-f983-4ebe-a654-f67f48659e50", "address": "fa:16:3e:8e:cf:16", "network": {"id": "9197abc5-07db-4abf-9578-9360b49aea49", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "8bf64e81a4214f9490d231a2e79ab3d8", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee08de8-f9", "ovs_interfaceid": "eee08de8-f983-4ebe-a654-f67f48659e50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.996 2 DEBUG oslo_concurrency.lockutils [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Releasing lock "refresh_cache-4408214d-dae5-4452-92e9-eb4abd6589d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.996 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] [instance: 4408214d-dae5-4452-92e9-eb4abd6589d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.997 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.997 2 DEBUG oslo_service.periodic_task [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 14 10:11:30 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:30.998 2 DEBUG nova.compute.manager [None req-007ac93c-0be0-4c9e-a0dc-ff1eb3243d87 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 14 10:11:31 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=35085 DF PROTO=TCP SPT=33168 DPT=19885 SEQ=2580216683 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A012249EF0000000001030307) 
Oct 14 10:11:32 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:32.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.
Oct 14 10:11:32 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.
Oct 14 10:11:32 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=35086 DF PROTO=TCP SPT=33168 DPT=19885 SEQ=2580216683 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A01224E030000000001030307) 
Oct 14 10:11:32 np0005486759.ooo.test podman[328811]: 2025-10-14 10:11:32.461182689 +0000 UTC m=+0.083910617 container health_status 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251009, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 14 10:11:32 np0005486759.ooo.test podman[328812]: 2025-10-14 10:11:32.512667943 +0000 UTC m=+0.132929446 container health_status 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal)
Oct 14 10:11:32 np0005486759.ooo.test podman[328811]: 2025-10-14 10:11:32.532365355 +0000 UTC m=+0.155093283 container exec_died 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 14 10:11:32 np0005486759.ooo.test systemd[1]: 1c82a774a7a52164ddce91ccefa49dd8b0e26418cc8bd63fceb2770f9527091b.service: Deactivated successfully.
Oct 14 10:11:32 np0005486759.ooo.test podman[328812]: 2025-10-14 10:11:32.552837542 +0000 UTC m=+0.173099015 container exec_died 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Oct 14 10:11:32 np0005486759.ooo.test systemd[1]: 60180fc82177990d9914704f8cfee689ed807de1535190de9e9631b4c6040f44.service: Deactivated successfully.
Oct 14 10:11:34 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=35087 DF PROTO=TCP SPT=33168 DPT=19885 SEQ=2580216683 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A012256030000000001030307) 
Oct 14 10:11:35 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:35.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:37 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:37.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:38 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=35088 DF PROTO=TCP SPT=33168 DPT=19885 SEQ=2580216683 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A012265C30000000001030307) 
Oct 14 10:11:40 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:40.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:41 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.
Oct 14 10:11:41 np0005486759.ooo.test podman[328856]: 2025-10-14 10:11:41.457566151 +0000 UTC m=+0.081439122 container health_status d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 14 10:11:41 np0005486759.ooo.test podman[328856]: 2025-10-14 10:11:41.467466314 +0000 UTC m=+0.091339295 container exec_died d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251009, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1e4eeec18f8da2b364b39b7a7358aef5, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:11:41 np0005486759.ooo.test systemd[1]: d540e051c3e16b12d77e9acc3eac1e75d294ab5b42e04f8d114f88e4b727f85c.service: Deactivated successfully.
Oct 14 10:11:42 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:42.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:42 np0005486759.ooo.test podman[265505]: time="2025-10-14T10:11:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 14 10:11:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:11:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 130673 "" "Go-http-client/1.1"
Oct 14 10:11:42 np0005486759.ooo.test podman[265505]: @ - - [14/Oct/2025:10:11:42 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16704 "" "Go-http-client/1.1"
Oct 14 10:11:42 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.
Oct 14 10:11:42 np0005486759.ooo.test podman[328875]: 2025-10-14 10:11:42.447529843 +0000 UTC m=+0.080009018 container health_status 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:11:42 np0005486759.ooo.test podman[328875]: 2025-10-14 10:11:42.483458801 +0000 UTC m=+0.115937966 container exec_died 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 14 10:11:42 np0005486759.ooo.test systemd[1]: 8de221af1bd13ef1fbf0d0b5f3517689c3d9b01e7a446c7de0130349c85744dd.service: Deactivated successfully.
Oct 14 10:11:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:44 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 14 10:11:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:11:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:44 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 14 10:11:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:44 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 14 10:11:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:11:44 np0005486759.ooo.test openstack_network_exporter[267388]: ERROR   10:11:44 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 14 10:11:44 np0005486759.ooo.test openstack_network_exporter[267388]: 
Oct 14 10:11:44 np0005486759.ooo.test sshd[328897]: main: sshd: ssh-rsa algorithm is disabled
Oct 14 10:11:44 np0005486759.ooo.test sshd[328897]: Accepted publickey for zuul from 192.168.122.10 port 53206 ssh2: RSA SHA256:hBX/r/Lm3s7zKwm3myAVWQeLqpMOTacswshcRK5fhoY
Oct 14 10:11:44 np0005486759.ooo.test systemd-logind[759]: New session 56 of user zuul.
Oct 14 10:11:44 np0005486759.ooo.test systemd[1]: Started Session 56 of User zuul.
Oct 14 10:11:44 np0005486759.ooo.test sshd[328897]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Oct 14 10:11:45 np0005486759.ooo.test sudo[328901]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Oct 14 10:11:45 np0005486759.ooo.test sudo[328901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Oct 14 10:11:45 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:45.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:46 np0005486759.ooo.test kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:16:3e:77:13:69 MACDST=fa:16:3e:ba:52:20 MACPROTO=0800 SRC=38.102.83.114 DST=38.102.83.246 LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=35089 DF PROTO=TCP SPT=33168 DPT=19885 SEQ=2580216683 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080A012286030000000001030307) 
Oct 14 10:11:47 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:47.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:49 np0005486759.ooo.test ovs-vsctl[329065]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 14 10:11:49 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29946 DF PROTO=TCP SPT=45014 DPT=9102 SEQ=3185207427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAB26770000000001030307) 
Oct 14 10:11:49 np0005486759.ooo.test systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 328918 (sos)
Oct 14 10:11:49 np0005486759.ooo.test systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 14 10:11:49 np0005486759.ooo.test systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 14 10:11:50 np0005486759.ooo.test virtqemud[225922]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 14 10:11:50 np0005486759.ooo.test virtqemud[225922]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 14 10:11:50 np0005486759.ooo.test virtqemud[225922]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 14 10:11:50 np0005486759.ooo.test systemd[1]: efi.automount: Got automount request for /efi, triggered by 329221 (lsinitrd)
Oct 14 10:11:50 np0005486759.ooo.test systemd[1]: Mounting EFI System Partition Automount...
Oct 14 10:11:50 np0005486759.ooo.test systemd[1]: Mounted EFI System Partition Automount.
Oct 14 10:11:50 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29947 DF PROTO=TCP SPT=45014 DPT=9102 SEQ=3185207427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAB2A810000000001030307) 
Oct 14 10:11:50 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:50.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:51 np0005486759.ooo.test crontab[329329]: (root) LIST (root)
Oct 14 10:11:52 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:52.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:52 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29948 DF PROTO=TCP SPT=45014 DPT=9102 SEQ=3185207427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAB32820000000001030307) 
Oct 14 10:11:53 np0005486759.ooo.test systemd[1]: Starting Hostname Service...
Oct 14 10:11:53 np0005486759.ooo.test systemd[1]: Started Hostname Service.
Oct 14 10:11:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:11:54.177 183328 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 14 10:11:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:11:54.178 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 14 10:11:54 np0005486759.ooo.test ovn_metadata_agent[183323]: 2025-10-14 10:11:54.179 183328 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: Started /usr/bin/podman healthcheck run f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.
Oct 14 10:11:55 np0005486759.ooo.test podman[329576]: 2025-10-14 10:11:55.459417132 +0000 UTC m=+0.081460522 container health_status b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 14 10:11:55 np0005486759.ooo.test podman[329576]: 2025-10-14 10:11:55.470046337 +0000 UTC m=+0.092089757 container exec_died b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd)
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: b7c58376f74ea7a9c3235324bd7c945e637871fd08355807344f045c716cc319.service: Deactivated successfully.
Oct 14 10:11:55 np0005486759.ooo.test podman[329574]: 2025-10-14 10:11:55.536042316 +0000 UTC m=+0.159786327 container health_status 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 14 10:11:55 np0005486759.ooo.test podman[329577]: 2025-10-14 10:11:55.566403784 +0000 UTC m=+0.183842893 container health_status f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Oct 14 10:11:55 np0005486759.ooo.test podman[329575]: 2025-10-14 10:11:55.574046927 +0000 UTC m=+0.198370376 container health_status 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251009, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 14 10:11:55 np0005486759.ooo.test podman[329577]: 2025-10-14 10:11:55.578259446 +0000 UTC m=+0.195698565 container exec_died f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251009, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d)
Oct 14 10:11:55 np0005486759.ooo.test podman[329575]: 2025-10-14 10:11:55.588263212 +0000 UTC m=+0.212586681 container exec_died 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=0468cb21803d466b2abfe00835cf1d2d, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251009)
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: f6a21c4690d19fc12d6a8532f7085a9f6a85badec5f973ddc74620686bd14f89.service: Deactivated successfully.
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: 895e92c87ba08819478c417092dae26799610d22378bf5ca33b2e0494fad3687.service: Deactivated successfully.
Oct 14 10:11:55 np0005486759.ooo.test podman[329574]: 2025-10-14 10:11:55.644056448 +0000 UTC m=+0.267800459 container exec_died 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 14 10:11:55 np0005486759.ooo.test systemd[1]: 347a38a998cd8b0c4e36d52e776c4f042788bbf8e385120e54db0c3ac83ecfa3.service: Deactivated successfully.
Oct 14 10:11:55 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:55.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 14 10:11:56 np0005486759.ooo.test kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b8:de:c0 MACDST=fa:16:3e:21:a1:c6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29949 DF PROTO=TCP SPT=45014 DPT=9102 SEQ=3185207427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3FAB42410000000001030307) 
Oct 14 10:11:56 np0005486759.ooo.test kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 14 10:11:56 np0005486759.ooo.test kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 14 10:11:56 np0005486759.ooo.test kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 14 10:11:56 np0005486759.ooo.test kernel: cfg80211: failed to load regulatory.db
Oct 14 10:11:56 np0005486759.ooo.test systemd-journald[35787]: Field hash table of /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Oct 14 10:11:56 np0005486759.ooo.test systemd-journald[35787]: /run/log/journal/8e1d5208cffec42b50976967e1d1cfd0/system.journal: Journal header limits reached or header out-of-date, rotating.
Oct 14 10:11:56 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 10:11:56 np0005486759.ooo.test rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Oct 14 10:11:57 np0005486759.ooo.test nova_compute[310511]: 2025-10-14 10:11:57.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
